Originally posted by: yacoub
Can anyone explain why these geniuses who overclock with ridiculous cooling devices STILL don't bother to put any heatsinks on the RAM chips? How do you expect to get a higher RAM OC if you have nothing taking the heat away from the RAM chips?! This used to be a common sense part of OC'ing a videocard but I've seen a couple reviews now try overclocking without that and then wonder why the RAM seems to be at its max already.
Originally posted by: Wreckage
Originally posted by: yacoub
Can anyone explain why these geniuses who overclock with ridiculous cooling devices STILL don't bother to put any heatsinks on the RAM chips? How do you expect to get a higher RAM OC if you have nothing taking the heat away from the RAM chips?! This used to be a common sense part of OC'ing a videocard but I've seen a couple reviews now try overclocking without that and then wonder why the RAM seems to be at its max already.
I saw a thread (I think it was at Hardocp) where someone made ramsinks outta pennies. :laugh:
Originally posted by: Extelleron
Originally posted by: coldpower27
Originally posted by: Extelleron
Originally posted by: Matt2
Originally posted by: Extelleron
The GeForce 7 had fine image quality if you turned all the driver settings up, but then let's see the performance. The 7800/7900 could barely compete in performance WITH the default driver settings... turn everything up and let's see how playable it is.
OMG, are you serious right now?
Turning the quality setting to HQ did incur a performance hit, but in no way, shape, or form made it "unplayable".
I can't say I've ever had a 7900GTX to try it out, but those tests that were run at high IQ settings (high quality AA + AF) saw a huge difference between X1900 series cards and the 7900 series, much larger than with normal IQ settings.
It depends on the game; there are some cases of 20% differences between the X1950 XTX and the 7900 GTX at 4xAA/16xAF settings. But even with those differences the 7900 GTX was not unplayable.
The review below illustrates the differences between the X1950 XTX and the 7900 GTX even with HQ settings.
http://www.xbitlabs.com/articles/video/display/gf8800-games.html
When you go for high IQ, with 6xAA (ATI) or 8xAA (nVidia), the 7900 GTX is less than half as fast as the X1950 XTX.
Originally posted by: CaiNaM
still, despite nvidia's market superiority, the high end x18/19xx cards were clearly superior to nvidia's 7 series in both performance and image quality. even w/o the shimmering nvidia could not match ati's HQAF.
Originally posted by: Falloutboy
Looks actually pretty decent if the 2900XT streets at 350, and with a bit more driver work it will be on par with the GTS. I was hoping for the 2600 numbers, since I'm in the market for something in the sub-200 range and was hoping to get a ballpark on those cards before I make a choice this week.
Originally posted by: nullpointerus
Originally posted by: Falloutboy
Looks actually pretty decent if the 2900XT streets at 350, and with a bit more driver work it will be on par with the GTS. I was hoping for the 2600 numbers, since I'm in the market for something in the sub-200 range and was hoping to get a ballpark on those cards before I make a choice this week.
The 2600-series cards were delayed for several weeks, weren't they?
Originally posted by: n7
Not very impressed with this review.
Not much for high resolution results, where we already know the HD 2900 XT does better, & far too many stupid benches w/o AA.
Who the eff uses no AA these days...
No Vista results either :frown:
I am sick of seeing reviews assuming people are still stuck using a 5-year-old OS.
As for IQ, you guys are f*cking retarded.
AF was comparable between the 2900 & 8800s, maybe slightly better on the 8800s when blown up.
How much of the time playing games are you going to be saving screenshots, & then blowing them up to gaze at the AF features for hours? :roll:
AA was arguably slightly better on the 2900 when blown up.
Again, since when do people spend hours staring @ enlarged screenshots of games? :roll:
Stop being f*cking morons, seriously.
I'll be waiting for more reviews showing the cards @ 2560x1600 (where i want to be able to play) in Vista.
Originally posted by: coldpower27
Originally posted by: apoppin
Originally posted by: Matt2
Very disappointing.
Nvidia has a faster card and better IQ. When was the last time that happened?
with x1800
with r8500
with Radeon DDR
and with Rage Fury 32
only the nvidia IQ is NOT better ... those pics last night are photochops
and *remember* we have a $400 card competing pretty well with a $650 one
-with *major* driver improvements coming and likely better DX10 perf too
not to mention those gawd-awful nvidia drivers that they STILL can't get right --- after SIX long months
i know which one i will pick
[2950xt]!
Did you read the VR-Zone review? Right now the X2900 XT is competing fairly against the 8800 GTS 640; it's faster in some things, even in others, and a LOT slower in some due to driver immaturity. It also consumes about 65W more than the GTS, so it needs at least a 500W PSU and 2x 6-pin PCI-E connectors to run, as that is about the typical power level where 2x 6-pin PCI-E connectors become necessary.
Nvidia has sharper AF quality this generation, while ATI has the better MSAA quality comparing its 24xAA against Nvidia's 16xQ AA setting. So overall image quality would depend on whether you prefer AA or AF, and on which settings you compare; I wonder what things are like at 8xQ vs ATi's 8xAA.
I think this card, overall performance-wise, should be quicker than the 8800 GTS 640 once the drivers mature more, but once again you have the issue of a card that consumes about 50% more juice while not being significantly faster in games.
The MSRP of the 8800 GTX is 599 USD, not sure where you're getting 650 from, and it can be had for as cheap as 550 USD nowadays due to the sheer amount of time it's been on the market.
Given the right price, and depending on your needs, the X2900 XT could be good, but it really depends on what you prefer. I probably can't consider this card until the drivers mature a bit; same idea with Vista.
I am not willing to play guinea pig for Microsoft or ATi/AMD.
Originally posted by: Nightmare225
Originally posted by: n7
Not much for high resolution results, where we already know the HD 2900 XT does better, & far too many stupid benches w/o AA.
I'm guessing you didn't read the article; VR-Zone felt too embarrassed to post AA benchmarks because the HD 2900 had some driver bugs, making it stay below even the X1950 in terms of performance. :laugh:
Originally posted by: Extelleron
Image quality wise it's clear that ATI has better AA this gen (16x vs 16x, HD 2900 is better, with 24xAA it will be even better) and nVidia has *slightly* better AF.
Originally posted by: n7
Originally posted by: Nightmare225
Originally posted by: n7
Not much for high resolution results, where we already know the HD 2900 XT does better, & far too many stupid benches w/o AA.
I'm guessing you didn't read the article; VR-Zone felt too embarrassed to post AA benchmarks because the HD 2900 had some driver bugs, making it stay below even the X1950 in terms of performance. :laugh:
ROFL @ bringing up driver bugs :roll:
I want a card that works well & with all capabilities in Vista.
Newsflash, nVidia has been sucking large nuts when it comes to drivers lately, especially in Vista, & unfortunately, i can't trust them to actually fix issues or release fixes, since improvements just aren't the nVidia way.
ATi has drivers issues now with the HD 2900, yes, i am well aware of that.
But at least i can count on a new set of drivers from them every month, & what's likely going to be much better support in Vista.
Pardon my pissiness, but i'm not impressed with either choice i have right now :frown:
An 8800 GTX that's guaranteed to be a bugfest, or a poorer-performing HD 2900 XT, even after half a year of delay :roll:
And don't start with the Vista bashing; some of us prefer an improved OS.
I am very unimpressed w/ both nV & ATi & review sites for ignoring the fact that XP = soon to be irrelevant, as every new PC sold in the last few months has Vista.
Originally posted by: Nightmare225
Originally posted by: Extelleron
Image quality wise it's clear that ATI has better AA this gen (16x vs 16x, HD 2900 is better, with 24xAA it will be even better) and nVidia has *slightly* better AF.
What games will be able to run smoothly at a decent resolution with that level of AA?
Originally posted by: Extelleron
Anyway, before everyone jumps on the "8800 IQ is better" bandwagon... I don't see the difference between any of those shots, and apparently neither did the reviewer... he said there MIGHT have been a very small difference in favor of nVidia... because of those words everyone on here starts talking about how sucky ATI IQ is.
Originally posted by: Wreckage
Originally posted by: Extelleron
Image quality wise it's clear that ATI has better AA this gen (16x vs 16x, HD 2900 is better, with 24xAA it will be even better) and nVidia has *slightly* better AF.
You must be joking. It looks like they used "Vaseline AA". The screenshots so far show major blurring and obscuring of detail. I guess you could call it "better AA" in the same way taking your contacts out or rubbing hot peppers into your eyes is "better".
Originally posted by: keysplayr2003
Originally posted by: Extelleron
Anyway, before everyone jumps on the "8800 IQ is better" bandwagon... I don't see the difference between any of those shots, and apparently neither did the reviewer... he said there MIGHT have been a very small difference in favor of nVidia... because of those words everyone on here starts talking about how sucky ATI IQ is.
Even the ones that were circled for you? And who circled them? Had to be the reviewer, yes?
So apparently, the reviewer DID see some difference. Which review are you referring to?
