Which leads me to the next note: try as I might, I couldn't get the 7950 GX2 to run in dual-GPU mode on the A8N-SLI (even with the beta BIOSes that supposedly added support for the card). I ended up benchmarking the GX2 on the A8R32-MVP instead, which worked fine.
Originally posted by: Kyanzes
I love this review. The 7950 GX2 shines, very nice results indeed.
Originally posted by: Ackmed
The benchmarks were done on "Quality", or the 3rd notch out of 4, quality-wise, in NV's drivers, and on the highest driver setting for ATi. Hardly the same setting. I'm doubtful NV would have kept its lead with the drivers set to HQ and optimizations turned off, but I guess we won't know.
Originally posted by: Wreckage
Originally posted by: Kyanzes
I love this review. The 7950 GX2 shines, very nice results indeed.
In the single card benchmarks it passes the X1900XTX by 28fps at 1600x1200 with AA/AF. :Q
People who picked up one of these are going to be very happy with it.
Originally posted by: Cookie Monster
Any IQ comparisons? To my knowledge, NV cards have always looked better in OpenGL or Doom 3 engine based games.
Originally posted by: Ackmed
Originally posted by: Cookie Monster
Any IQ comparisons? To my knowledge, NV cards have always looked better in OpenGL or Doom 3 engine based games.
Where do you get that impression? NV has typically been faster, but they have looked the same from what I recall.
We know that many of you are concerned with the image quality provided between these two manufacturers' video cards. In our evaluation, which consisted of swapping many video cards back and forth, we did not notice any differences in image quality between the cards at all. With both brands of cards, DOOM 3 was being presented exactly the way it was meant to.
In the rest of the screenshots shown above, we did not see any differences in image quality. We even took a close look at the lighting, as you can see in the fifth picture, and at High Quality the lightmaps and specular are compressed, yet we still see no difference in image quality between the video cards. Whether you have an NVIDIA or ATI based video card, you are going to be receiving the full gamut of video quality that DOOM 3 has to offer. Shadows, specular lighting, bump mapping, heat haze: the quality is all the same between the latest NVIDIA and ATI video cards.
Originally posted by: Frackal
It is stupefying to me that nvidia can win benchmarks simply by defaulting their drivers to a lower quality setting, and 4/5 benchmarkers just let it go. :disgust:
Maybe ATI can suddenly release drivers that default to "High Performance/Lowest Quality" settings and suddenly 0MFG they have a faster card than the 7950GX2!!11
This has been the case for a long time, as 16xAF is basically free on cards that support it.
Weird how all cards don't take a large performance hit when 16xAF is activated.
The benchmarks I linked to at EliteBastards have nVidia running HQ, and you can see ATi pull ahead.
I'm doubtful NV would have kept its lead with the drivers set to HQ and optimizations turned off, but I guess we won't know.
At default driver settings, Doom 3 visibly shimmers on modern nVidia cards, but it doesn't on ATi cards.
From playing Doom 3, I'm not sure, but it looked better on the NV card I was using than my friend's ATi card.
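For context on the 16xAF point above: anisotropic filtering is just a per-texture sampler setting that a game (or the driver control panel on its behalf) requests through the standard EXT_texture_filter_anisotropic OpenGL extension. A minimal sketch in C, assuming that extension is present (the helper name enable_16x_af is purely illustrative), looks roughly like this:

/* Illustrative sketch, not from the thread: request 16x anisotropic
 * filtering on one texture via EXT_texture_filter_anisotropic --
 * the same knob the driver control panel forces globally. */
#include <GL/gl.h>
#include <GL/glext.h>   /* defines GL_TEXTURE_MAX_ANISOTROPY_EXT et al. */

void enable_16x_af(GLuint texture)
{
    GLfloat max_aniso = 1.0f;
    GLfloat requested = 16.0f;

    /* Query the highest anisotropy level the card supports
     * (16.0 on the GeForce 7 / Radeon X1900 generation discussed here). */
    glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &max_aniso);
    if (requested > max_aniso)
        requested = max_aniso;

    glBindTexture(GL_TEXTURE_2D, texture);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT, requested);
}

Because the hardware only takes extra texture samples where a surface is viewed at a steep angle, the average cost of flipping that setting to 16x tends to be small, which is why it can look "basically free" in typical benchmark runs.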
Originally posted by: Ackmed
The benchmarks were done on "Quality", or the 3rd notch out of 4, quality-wise, in NV's drivers, and on the highest driver setting for ATi. Hardly the same setting. I'm doubtful NV would have kept its lead with the drivers set to HQ and optimizations turned off, but I guess we won't know.
Originally posted by: Ratchet
I would think that the lead for NVIDIA has something to do with it being based on the Doom 3 engine, yes. Note, however, that it has an ATI logo on the splash screen and is not an NVIDIA TWIMTBP game.
Originally posted by: josh6079
It's also good to see an anticipated game launch that's actually more of a game than a benchmark (e.g. Oblivion, GRAW, TRL). Don't get me wrong, Oblivion certainly is nice, but it's hardly as flexible across as many different video cards as Prey is. Really, a lot of people will be able to play this at some pretty optimal settings.