
Anandtech Fall 2003 Video Card Roundup - Part 2: High End Shootout **60 pages**

The further I get through this review, the more perplexed I get.

What's with the strange way of recording Tomb Raider performance, totally different to every other game benchmarked in the review? I can only assume it's a way to fudge over the fact that nVidia gets whooped in this game; I can't tell whether the frame rates, for any card, are playable.

edit: I'm at the UT2003 screens with 4xAA/8xAF now, and these screens look close enough to be called no difference. I have no problem with these.
 
Not at Tomb Raider yet, but the pics of Gun Metal drive me nuts. Seems like Derek got too far ahead of the benchmarks 😉 The 52.14 pic has to be from a different moment of the benchmark run (firing fire?), and the scenery looks totally different on the two cards.

Oh well, another game nVidia has to work on...
 
Quick question about the Cat3.8s. Will this new "OverDrive" feature be solely for the new 9600/9800XT's or for the entire line of Radeons (ie: 9500/9600/9700/9800 NPs/Pros)?
 
Originally posted by: OddTSi
Originally posted by: Evan Lieb
Again, these are incredibly small IQ differences that you would never notice when actually playing the game. Saying either driver is better than the other in terms of IQ would be misleading.

No offense intended to any of the AnandTech crew, but wouldn't it have been better to do as hominid skull said and post larger images, perhaps with an animated GIF, so that these differences were more visible? What is and isn't "too small a difference" is really subjective and should be left up to the reader to decide.

Just my $1.50


Yeah, what's up with 450x450 JPGs to compare image quality?

If it's meant to save bandwidth, then you should've posted fewer images, but more useful ones (real gaming resolutions, lossless compression).

Rest of the review is excellent. 🙂
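For what it's worth, the kind of comparison people are asking for here doesn't have to be eyeballed at all. A minimal sketch (assuming Pillow is installed; the file names and sizes are made up for illustration) that computes a per-pixel difference image of two same-size screenshots, so any IQ deviation between two drivers' frames shows up as non-black pixels:

```python
from PIL import Image, ImageChops

def highlight_differences(img_a, img_b):
    """Per-pixel absolute difference of two same-size screenshots.

    Identical regions come out black; any rendering difference
    (AA coverage, filtering, precision) shows up as non-zero pixels.
    """
    return ImageChops.difference(img_a.convert("RGB"), img_b.convert("RGB"))

# Demo with synthetic frames standing in for the two drivers' screenshots:
a = Image.new("RGB", (64, 64), (200, 0, 0))
b = Image.new("RGB", (64, 64), (200, 0, 0))
b.putpixel((10, 10), (0, 0, 255))  # simulate one differing pixel

delta = highlight_differences(a, b)
# Bounding box of all non-zero (i.e. differing) pixels:
print(delta.getbbox())  # (10, 10, 11, 11)
```

Run on the real full-resolution PNG captures, `delta.getbbox()` returning `None` would mean the frames are pixel-identical, settling the "is there a difference" question objectively.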
 
The pics are supposed to be linked to full sized images ... let me see if I can fix that

/me goes off to muck around in the innards of AT .... I hope I don't kill anyone!
 
Originally posted by: GTaudiophile
WTF is with those "Percent Performance Loss" benches for TR:AOD? Talk about misleading! Where are the normal FPS benches? Is Anand trying to hide something?

Yeah, I thought that was a bit weird, since every other graph had "higher bars = better". But I guess one could say it's much like CPU comparisons on encoding and media, where "lower scores = better".

What I don't get is why they didn't just flip the graphs around to show performance GAIN rather than performance LOSS with PS 2.0 enabled. It would have been a little less confusing (not to mention just as easy to do); someone just browsing through the review could easily miss the little "(lower is better)" note and come away with the wrong impression.
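The loss-to-gain flip being asked for is pure arithmetic: a sketch, assuming the review's "percent loss" is measured relative to the non-PS-2.0 baseline FPS (the exact definition isn't stated in the thread):

```python
def loss_to_gain(loss_pct):
    """Convert 'enabling PS 2.0 loses X% of baseline FPS' into
    'disabling PS 2.0 gains Y% over the PS 2.0 FPS'.

    If base -> base*(1-loss), then the reverse change is
    gain = loss / (1 - loss), expressed here in percent.
    """
    loss = loss_pct / 100.0
    return 100.0 * loss / (1.0 - loss)

# A 40% loss (e.g. 100 fps -> 60 fps) is the same data point
# as a ~66.7% gain (60 fps -> 100 fps):
print(round(loss_to_gain(40.0), 1))  # 66.7
```

Note the two charts wouldn't just be mirror images: the gain percentage is always larger than the loss percentage it corresponds to, which is one reason reviewers sometimes prefer the loss framing.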
 
You won't see much of a performance gain by using PS 2.0. Which is faster to calculate on a CPU: integers or floats?

As for the pics, you guys are really, really digging to find something wrong with the nVidia drivers, lol.
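The integer-vs-float question above can at least be poked at with a quick microbenchmark. A sketch using the standard library's `timeit`; note that results on a modern CPU and interpreter say nothing about a 2003 GPU's fixed-point versus FP24/FP32 shader units, so this only illustrates how one would measure:

```python
import timeit

# Same arithmetic (sum of squares over 1000 values), once on ints
# and once on floats, each repeated 200 times.
int_time = timeit.timeit("sum(x * x for x in xs)",
                         setup="xs = list(range(1000))", number=200)
float_time = timeit.timeit("sum(x * x for x in xs)",
                           setup="xs = [float(x) for x in range(1000)]",
                           number=200)

print(f"ints:   {int_time:.4f}s")
print(f"floats: {float_time:.4f}s")
```

Whichever wins on your hardware, the relevant point for the review is that PS 2.0 mandates floating-point pixel shaders, so the cost of that precision is baked into the PS 2.0 numbers.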

 
I am totally baffled by the lack of comments on image quality in cases where the differences are so apparent.

The 4xAA/8xAF images for nVidia appear to have little to no AA going on in F1 Challenge (just look at the building tops and the signboard crossing the track), Homeworld 2 (look at the engines and the perimeter of the ship), and Warcraft 3 (look at the edges of all the characters).

SimCity 4 looks like it is being rendered at a lower precision on the nVidia cards; look at the chairs (I couldn't even figure out what they were in the nVidia picture), and the trees look like they are rendered at VGA resolution.
 
Originally posted by: Genx87
You won't see much of a performance gain by using PS 2.0. Which is faster to calculate on a CPU: integers or floats?

As for the pics, you guys are really, really digging to find something wrong with the nVidia drivers, lol.

Well from what I see we are back to where we were before the driver IQ cheats for the most part. I didn't see any glaring issues like some previous versions. The ATi cards clearly have better AA (that F1 shot and AQ3 for example), but this is nothing new. ATi's AA has looked better all along. It is very hard to judge with 450 x 450 compressed JPEGs. I will say nVidia has stepped up the performance of their cards a great deal. This is very good news for nV card owners. I hope this continues (speed increases without IQ compromises).
 