Anand's 9800XT and FX5950 review, part 2


BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
This is nothing new, I can remember not being able to install game patches until driver updates came out years ago.
Rollo, you have completely missed the point yet again.

BFG10K: It sounds like you are saying that effects are pre-computed, and later patches to the game may break it.
That's exactly what I'm saying.

Mathematically, there is no difference between 2/2 = 1 and 2 * 0.5 = 1.
You are 100% correct, and this is in fact an example of a genuine optimisation.

However a genuine optimisation requires two main things:
(1) For all possible inputs, the outputs of the optimised and unoptimised code are exactly the same.
(2) No prior knowledge, such as the code that will be executed or the name of the executable, is required.

Thus, taking (2) as an example: if you knew that the program named "X" was only ever going to input the value "10" into your equation, then you'd customise a portion of your code to simply output the correct value straight away without calculating anything. Using your example of dividing by 2, in rough pseudo-code:

if (program == "X") output 5.
else output number/2.

That is a cheat, and it's along the same lines as what nVidia is doing. Instead of optimising their drivers to universally improve shader performance, they're putting in sections of code that are hardcoded specifically to deal with exact situations that they know are going to happen beforehand.
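To make the distinction concrete, the rough pseudo-code above can be expanded into a minimal, compilable C sketch (the function names and the program name "X" are purely illustrative): a genuine optimisation satisfies condition (1) for every input, while the app-detection path is only ever right for the one value it was built around.

#include <stdio.h>
#include <string.h>

/* Genuine optimisation: valid for every possible input, no prior
   knowledge required. Multiplying by 0.5 replaces the slower divide
   but produces the same result for any x. */
static double halve_optimised(double x)
{
    return x * 0.5;
}

/* App-detection cheat (hypothetical): the driver recognises program "X",
   assumes the input will always be 10, and skips the work entirely.
   Feed it anything else and the answer is wrong. */
static double halve_cheat(const char *program, double x)
{
    if (strcmp(program, "X") == 0)
        return 5.0;          /* hardcoded result, correct only for x == 10 */
    return x / 2.0;
}

int main(void)
{
    printf("%f\n", halve_optimised(14.0));   /* 7.0 - always right   */
    printf("%f\n", halve_cheat("X", 14.0));  /* 5.0 - silently wrong */
    return 0;
}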

The static clip planes of 3DMark are another perfect example of this and it's quite simply cheating.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
The majority of the 'cheats' I've seen from nVidia seem to be compiler bugs rather than anything close to an actual cheat.
It's a bit of a tough stretch to call drivers that implement static clip planes that are only active during benchmark runs, drivers that detect screen captures and automatically raise the image quality, and drivers that encrypt themselves to stop anti-cheat programs from working "compiler bugs".

Well what issues are showing up with the latest Cats?
Is that a trick question? What are you actually asking?

I know ATi is going to extreme measures doing this,
Doing what?

Where is ATi substituting entire shader routines from games with their own versions, hardcoded exactly for the game in question?
Where is ATi purposely ignoring developer requests to run certain code and ignoring requests for certain levels of image quality?
Where is ATi implementing static clip planes that only serve to inflate benchmark runs?

I didn't see Gabe complain about any of these things about ATi, did you? OTOH I certainly saw him complain about nVidia, and that, coupled with the irrefutable evidence of cheating that we've already seen since the NV30's release, leaves us unable to conclude anything other than that the entire FX line is geared up to simply cheat to make up for poor hardware design.

More than likely the Cats will balloon up over 30MB to account for this
Why would the Catalysts balloon in size? Generic optimisations will work on any code. nVidia's drivers will be the ones ballooning as they furiously pour in customised shader code for every single possible game out there.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
BFG

It's a bit of a tough stretch to call drivers that implement static clip planes that are only active during benchmark runs

I said the majority. It's not much different from Quack: nV got nailed, their performance tumbled when it was worked around, and with the next driver release that aspect was gone and performance was back where it was previously.

drivers that detect screen captures and automatically raise the image quality

I have seen nothing that backs this up. You think Valve is a good example of an evenhanded company?

Were the Shaders in the benchmark compiled with the latest version of HLSL (i.e. the version that orders the assembly for a more efficient use of the FX's register limitations)?

[Brian Jacobson] The ones used by the mixed mode path were.

B3D.

DX9 has two different compilers, one that is better for ATi's boards and one that's better for the FX. The FX compiler focuses on scheduling differences and the like; it does not do any sort of code replacement (this compiler is from MS). For all Valve's talk about how much extra time they had to spend coming up with a custom nV code path, they couldn't even be bothered to compile the default shaders with MS's compiler that would improve performance. It's pretty clear that Valve was making an effort to make nVidia look as poor as they could, even going as far as implying that they were bending over backwards for nV while they didn't even bother to make a very simple change that required next to no work.
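For context, the choice being described here comes down to which profile string is handed to the HLSL compiler. Below is a minimal sketch assuming the D3DX9 helper D3DXCompileShaderFromFile and the ps_2_0 / ps_2_a targets shipped in the DX9 SDK of the time; the shader file name and entry point are hypothetical.

#include <d3dx9.h>

/* Compile the same HLSL source twice: once with the default ps_2_0 target
   and once with ps_2_a, the profile that reorders the generated assembly
   around the FX's register limitations. Only the profile string changes;
   the shader source itself is untouched. ("water.psh" and "main" are
   hypothetical names used for illustration.) */
HRESULT CompileBothProfiles(void)
{
    LPD3DXBUFFER psDefault = NULL, psFX = NULL, errors = NULL;

    HRESULT hr = D3DXCompileShaderFromFile("water.psh", NULL, NULL, "main",
                                           "ps_2_0", 0, &psDefault,
                                           &errors, NULL);
    if (SUCCEEDED(hr))
        hr = D3DXCompileShaderFromFile("water.psh", NULL, NULL, "main",
                                       "ps_2_a", 0, &psFX,
                                       &errors, NULL);

    /* A real app would create pixel shaders from the buffers and then
       Release() them; omitted here to keep the sketch short. */
    return hr;
}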

Where is ATi substituting entire shader routines from games with their own versions, hardcoded exactly for the game in question?

They were in 3DMark2K3. The vertex shaders used for the leaves in GT4 were replaced: the leaves that were supposed to be animated individually were altered to move in clusters. That individual animation was a specific request by the developers, and ATi changed it to reduce IQ and improve performance. What's going on in TRAoD? You regularly state that any visual glitch displayed by the FX is an obvious cheat, so the same standard should apply to ATi, should it not? nVidia stated that the issues that were being shown in the past were overwhelmingly bugs; now those issues are gone and performance is at the same level. If any visual oddity is a clear cheat, then ATi must be cheating in TRAoD.

I didn't see Gabe complain about any of these things about ATi, did you?

The same guy who implies he's bending over backwards, spending a significant amount of time to optimize for nVidia, yet can't even be bothered to use MS's compiler. They were trying to make nVidia look bad; there is no other logical explanation.

Why would the Catalysts balloon in size? Generic optimisations will work on any code. nVidia's drivers will be the ones ballooning as they furiously pour in customised shader code for every single possible game out there.

Glad you brought this up. Compare the size of the latest Cats (3.8) to the Dets (52.xx). The Dets have been reduced in size while the Cats have exploded. Using your above statement, again the evidence points to ATi cheating, not nVidia.

As far as automatic shader replacement goes, every one of the vendors does this and they have to. The assembly-level code output by the DX compiler is not machine-level code; all of it has to be converted by the driver. If you are using a straightforward compiler then you won't need to increase your driver size by much. If you are creating your own hard-coded shaders for a plethora of the shaders used in titles then you might see something like 20MB driver files. That would be an obvious example of cheating, right?
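To illustrate the difference in driver-size terms, here is a hypothetical C sketch (none of these names come from any actual driver): the generic path translates whatever DX assembly arrives, while the per-title path carries a table of canned replacements that has to grow with every game it targets.

#include <stddef.h>
#include <stdint.h>

typedef struct {
    uint32_t        hash;    /* hash of the app's DX shader bytecode */
    const uint8_t  *native;  /* hand-tuned native replacement        */
    size_t          len;
} Replacement;

/* Table of hand-coded shaders, shipped inside the driver. Every title
   targeted this way adds entries (and bytes) to the binary. */
static const Replacement g_handCoded[] = {
    /* { 0x1234ABCD, tuned_blob_for_some_game, sizeof(...) }, ... */
    { 0, NULL, 0 }   /* terminator */
};

/* Stub standing in for the driver's generic JIT backend, which works on
   any input and costs nothing extra per game. */
static const uint8_t *jit_compile(const uint8_t *dxAsm, size_t len,
                                  size_t *outLen)
{
    *outLen = len;   /* placeholder: a real backend emits native code */
    return dxAsm;
}

const uint8_t *translate_shader(uint32_t bytecodeHash, const uint8_t *dxAsm,
                                size_t len, size_t *outLen)
{
    /* Per-title path: hand back the canned shader if we recognise it. */
    for (const Replacement *r = g_handCoded; r->native != NULL; ++r)
        if (r->hash == bytecodeHash) {
            *outLen = r->len;
            return r->native;
        }

    /* Generic path: ordinary translation, valid for any shader. */
    return jit_compile(dxAsm, len, outLen);
}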

Sazar-

the IQ on the fx cards IS worse... specially if you are using 45.23 as the baseline... this has been shown in other tests all around the internet...

They were using the R3x0 core boards as a baseline.

admittedly AT's images are a little shyte in terms of size... but if they were proper fullscreen ones we could all see the differences quite clearly...

The only difference I've seen in any of the comparisons with the more recent drivers is the pseudo-trilinear filtering. I've seen these articles from the same sites that didn't look into the pseudo-AF of the R3x0 with nearly the same level of depth.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Ben, since you were good enough to mention TRAoD bugs in Anand's review, what's going on with the bugs mentioned in F1 Challenge, GunMetal, Homeworld 2, and X2??

I thought nVidia didn't have driver bugs?? :Q

On Driver size:
nVidia:
English: File Size: 8.5 MB
International: File Size: 18.9 MB

ATi:
25.3 MB.
Supports 24 languages. The driver size "exploded" when they went to an international version instead of separate downloads.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Well I know I will get flamed because I'm using one of the two hot-button words here (one starts with A, the other with N), but check out these pictures:

Scroll down to the AA/AF section in Aquamark3

It's hard to tell on screen since Anand used smaller screenshots for some reason, but save all three examples to your desktop (in Windows XP) and view them as a slideshow, or open them in a paint program or whatever if you don't have XP. To me, the 45.23 drivers have crap IQ (in both the AA/AF section and with AA/AF off) and the 52.14 are extremely blurry with AA/AF (check out the red stuff on the vehicle or the grass in the background). You can see the IQ has gone up considerably with the new 52.14 drivers, but something is still amiss with AA/AF on.

Also, check out AA/AF in F1 Challenge. You can clearly see that neither driver has proper AA on Nvidia - check out the glass in the bottom middle. It's aliased all the same with 4xAA on.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Microsoft provides a software renderer for ANY game in one of the dx9 dev. tool packages or SDK that will give you the full correct image

I kind of figured this. So where are the base images?
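For what it's worth, the "software renderer" being referred to is presumably the DX9 reference rasterizer from the SDK. A minimal sketch of how a test app would select it, assuming the standard IDirect3D9::CreateDevice path (the window handle and present parameters are whatever the app already uses):

#include <d3d9.h>

/* Create a reference-rasterizer device so frames are rendered entirely by
   Microsoft's software path - very slow, but it produces the baseline
   "correct" image that hardware output can be compared against. */
IDirect3DDevice9 *CreateRefDevice(IDirect3D9 *d3d, HWND hwnd,
                                  D3DPRESENT_PARAMETERS *pp)
{
    IDirect3DDevice9 *device = NULL;

    HRESULT hr = IDirect3D9_CreateDevice(
        d3d,
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_REF,                       /* reference rasterizer   */
        hwnd,
        D3DCREATE_SOFTWARE_VERTEXPROCESSING,  /* no hardware vertex T&L */
        pp,
        &device);

    return FAILED(hr) ? NULL : device;
}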

 

THUGSROOK

Elite Member
Feb 3, 2001
11,847
0
0
i wonder,
did the NV drivers have "texture sharpening" enabled during those AM3 screenshots?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
LOL
Now that HL2 is no longer around to whine about, we're back to scrutinizing screenshots with a magnifying glass and whining, "Nuh uh! Those nVidia cheater drivers lower IQ!".
Even on pages where the guy who runs this site says flat out that he couldn't see the difference.

It would not surprise me at all if I'm the only real poster this board has, and the rest of you are ATIs marketing department and families of their employees. There's probably a big board at ATI headquarters up in Canada that tracks the most family member posts to bbs like this, most original nVidia flames, etc..

"Damn! If Claude's kids don't go back to school soon, I'll never get my bonus! Damn vasectomy!"

 

THUGSROOK

Elite Member
Feb 3, 2001
11,847
0
0
really?

then what about older games where obvious cheats / changes have been made in the drivers?
items far off in the distance (I'm talking 1/2 mile away) aren't being rendered until they get closer.
revert back to a Ti4600 on 30.82 drivers and these same items get rendered as soon as they come into view (about 3/4 of a mile away).

NV is counting on ppl to "not notice".
if we do notice, they claim it is a bug and will be fixed in new drivers.

NV better start releasing new official drivers every week to keep up with the bugs we are finding ;)
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Rollo
LOL
Now that HL2 is no longer around to whine about, we're back to scrutinizing screenshots with a magnifying glass and whining, "Nuh uh! Those nVidia cheater drivers lower IQ!".
Even on pages where the guy who runs this site says flat out that he couldn't see the difference.

It would not surprise me at all if I'm the only real poster this board has, and the rest of you are ATIs marketing department and families of their employees. There's probably a big board at ATI headquarters up in Canada that tracks the most family member posts to bbs like this, most original nVidia flames, etc..

"Damn! If Claude's kids don't go back to school soon, I'll never get my bonus! Damn vasectomy!"

This is a really mature, objective thought process you have going on Rollo. Nvidia has been caught cheating over and over again with their 44.xx and up drivers, so let's just sweep everything under the rug, dismiss all possible cheats as fanboyism, shrug off all accusations, and shut off our brains altogether.

I'm sick of all you damn child-minded fools who ignore all facts. How about you use your brain and eyes for once, instead of dismissing any little thing that isn't convenient for you, hmm?

Even on pages where the guy who runs this site says flat out that he couldn't see the difference.

Now Anand is recommending us to jump off a bridge. You first, Rollo ;).
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
I think review sites should just go back to a bunch of Quake(X) and 3DMock FPS charts with no IQ testing like the old days. Who really cares about image quality anyway?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
I'm sick of all you damn child-minded fools who ignore all facts. How about you use your brain and eyes for once, instead of dismissing any little thing that isn't convenient for you, hmm?
LOL- I see I struck a nerve with you. Well, I'm sick of guys like you who just have to be right, have to put other people down over things as meaningless as a video card choice, and have to pimp ATI like your next paycheck depends on it.

This is a really mature, objective thought process you have going on Rollo. Nvidia has been caught cheating over and over again with their 44.xx and up drivers, so let's just sweep everything under the rug, dismiss all possible cheats as fanboyism, shrug off all accusations, and shut off our brains altogether.
Actually it is more mature and objective than yours - I don't assume that nVidia's new drivers will cheat at image quality because others have; that would be illogical. I really don't care if they do cheat if the rendered image is the same, which Anand says it is. He's seen both side by side full screen, have you, Jiffy? Or does it just make you uncomfortable that he's contradicting your anti-nVidia flaming? I'm as objective as it gets: I like both nVidia and ATI cards, purchase both, and am glad for the differences.

Now Anand is recommending us to jump off a bridge. You first, Rollo .
I'd rather jump off a bridge than be reduced to a little video card pimp. LOL, the point was that if the guy looking at them full screen side by side can't tell the difference in IQ, it must not be big enough to care about.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Rollo. I'm curious. Can you see a difference between the below shots?
OldFart, I'm curious. Why do you study screenshots instead of playing games?
BTW, unlike you, I actually had an NV30-based card, and I know there was a difference between no AA and 4X AA. I've actually seen it, which is way better than looking at screenshots.
 

SilentRunning

Golden Member
Aug 8, 2001
1,493
0
76
Originally posted by: Rollo
It would not surprise me at all if I'm the only real poster this board has, and the rest of you are ATIs marketing department and families of their employees. There's probably a big board at ATI headquarters up in Canada that tracks the most family member posts to bbs like this, most original nVidia flames, etc...


Therein lies the problem. You can't even comprehend the reality that people with legitimate complaints about a review are not related to ATI. IT'S ALL A BIG CONSPIRACY

(Yes, you are the only REAL poster on this board. The rest of us are just figments of your imagination.....WAKE UP)

 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
And why do you always change the subject or turn the discussion into personal attacks? Can't you just answer and stay on topic?

The thread discussion is about the article and if there are noticeable IQ differences. You say there are not. Others disagree and say there are. I posted a few links to some shots from the review.

Are there image quality differences or aren't there? Pretty simple question. A yes or no would be fine.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Yes the ATI looks a little better.
Not enough to turn me into a video card pimp though. LOL
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: oldfart
And why do you always change the subject or turn the discussion into personal attacks? Can't you just answer and stay on topic?

The thread discussion is about the article and if there are noticeable IQ differences. You say there are not. Others disagree and say there are. I posted a few links to some shots from the review.

Are there image quality differences or aren't there? Pretty simple question. A yes or no would be fine.

If you're going to refute Anand so vehemently, perhaps you should start your own hardware site and do the reviews yourself. Until then, perhaps you should leave the reviews to the professionals.
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: Rollo
Yes the ATI looks a little better.
Not enough to turn me into a video card pimp though. LOL

You want to know the best way to decide about IQ, regardless of settings? Own both cards and run your own tests, on the same exact machine. Then you can make your own determination, instead of trying to make it based on some screenshots. That's how I make my choices, and what determines what's in my fleet.
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
That's fine. Just trying to separate fact from opinion, since there is plenty of both from both sides.

To say there is no difference between the IQ of the cards is simply not true. I think AT did a good job with the article, but did seem to miss that point a bit, especially in some of the AA shots.

Whether the IQ difference between the cards is significant to you is a matter of opinion. Everyone has different ideas on how much AA or AF means to them in 3D image quality.

Let people judge for themselves.
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: oldfart
That's fine. Just trying to separate fact from opinion, since there is plenty of both from both sides.

To say there is no difference between the IQ of the cards is simply not true. I think AT did a good job with the article, but did seem to miss that point a bit, especially in some of the AA shots.

Whether the IQ difference between the cards is significant to you is a matter of opinion. Everyone has different ideas on how much AA or AF means to them in 3D image quality.

Let people judge for themselves.

I think perhaps in this whole thread, that's the first thing I've agreed with you wholeheartedly on. Well said.
 

CombatChuk

Platinum Member
Jul 19, 2000
2,008
3
81
One thing to think about... yes, the ATI does look a little better, but this is just a still. Let's be honest here, there's a slight blurriness in the Nvidia shot. But at 40-60 fps are we really going to notice a difference? Let's be practical here. Wouldn't you rather worry about passing the car ahead of you or blowing up aliens than about a slight blurriness in a still shot? BTW, I do own a 9800 Pro. I like both Nvidia and ATI. So I'm not a video card pimp ;)
 
Apr 17, 2003
37,622
0
76
Originally posted by: RoninCS
Originally posted by: oldfart
And why do you always change the subject or turn the discussion into personal attacks? Can't you just answer and stay on topic?

The thread discussion is about the article and if there are noticeable IQ differences. You say there are not. Others disagree and say there are. I posted a few links to some shots from the review.

Are there image quality differences or aren't there? Pretty simple question. A yes or no would be fine.

If you're going to refute Anand so vehemently, perhaps you should start your own hardware site and do the reviews yourself. Until then, perhaps you should leave the reviews to the professionals.

just because Anand and his staff do the reviews, there is no reason why we can't debate topics like IQ, is there?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
You want to know the best way to decide about IQ, regardless of settings? Own both cards and run your own tests, on the same exact machine
I've done that, Ronin. I thought my 5800's IQ was great; I think the 9800 Pro's IQ is a little better. Not enough difference to care about in gameplay.