
Nvidia accuses ATI/AMD of cheating - Benchmarks


Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Agreed, but this stuff just shouldn't happen.

What stuff? Optional optimisations to increase frame-rates while trying to minimise any impact on visual quality?
Seems like a sensible idea to me.
As long as it's optional, there's nothing wrong with it. ATI has never tried to hide the fact that it exists; the NV wording even states as much.

AMD has admitted that performance optimizations in their driver alters image quality in the above applications.

They have been open and honest about it, so what's the problem? If you don't like it, disable it. If you think reviews shouldn't have it enabled, contact the editor of whatever website.
If you think that it shouldn't be in the drivers, then please take a hike, because it's an additional optional feature which benefits the majority of users.
 

crislevin

Member
Sep 12, 2010
68
0
0
Agreed, but this stuff just shouldn't happen, with either of these companies. :(
You mean the optimization, or the arguing over it?

If it's the latter, I agree.

If it's the former, I don't. I think many, if not most, users would like to have the option to improve speed, sometimes sacrificing image quality a bit.
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
Ok, somehow this thread is confusing as hell. The only thing I get from reading the original source is that in all games but FC1 the optimizations are reasonable and give you a noticeable performance gain, but they really should do something about FC1 - the IQ loss is clearly visible, especially since there's lots of water in FC1.

Not sure how that has anything to do with what Nvidia does (and even less with what they did), but I assume "But he did it too!" is always a great excuse.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
Sounds to me like just more of Nvidia's sleazy tactics. How people can stand up for this shit is beyond me. I guess if I got free hardware I would be inclined to ignore that part of a company's methods.

Price/performance is king. Talk is cheap, and especially cheap is the talk from the high and mighty Nvidia about how reviewers should review their products.

I would like to point to Anand's latest retest of the Bobcat/Zacate platform and AMD's complete openness as a great example of how this stuff should be done.

What Nvidia's version sounds like (to the average forum browser, that is - trust me when I say I'm not the only one) is:
NVIDIA: Remember to turn off the performance tweaks of our competitors cards/drivers or there will be no more cards sent to you for reviews
 

Mem

Lifer
Apr 23, 2000
21,476
13
81
Is it crap, or is it really happening? Doesn't matter if Nvidia is desperate or not. Motive is irrelevant. Subject remains.

It reminds me of a lot of things as well.

Actually, what Nvidia does best "currently" is make better DX11 hardware and provide superior dev support/relations. The former may change with the 6xxx series, but we'll see shortly.

Still defending Nvidia as always. Actually, Keysplayr, what Nvidia does best is spread a lot of FUD and crap over the years; I wonder sometimes why I use Nvidia products.

I disagree that Nvidia makes better products; a lot of the time they are late to the game and some of their hardware is overhyped, but you would never admit it, would you?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
No, not really. They play catch-up for the most part with DX11

NV is likely to play catch-up in DX11 performance when the higher-end HD6000 series launches. But when it comes to current DX11 performance hardware, ATI is not even in the same league. I am too tired to link benchmarks in every thread trying to convince people who cling to 6-month-old reviews from when the GTX 4xx series had just launched.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
edit: in reply to happy medium

No, it's the word that's become synonymous with Nvidia: sleazy.

I'm not about defending any castles, except the castle of correct behaviour and good morals/ethics - some of the core things missing from what you are prone to defend.



in reply to RS:
NV is likely to play catch-up in DX11 performance when the higher-end HD6000 series launches. But when it comes to current DX11 performance hardware, ATI is not even in the same league. I am too tired to link benchmarks in every thread trying to convince people who cling to 6-month-old reviews from when the GTX 4xx series had just launched.

Performance does not equal better. Call it semantics, but you should know exactly what is both treasured AND measured by reviewers and buyers/users alike. If you all of a sudden wish to ignore everything that is not related to how fast you push a pixel, the HD5970 is still the highest-performing piece out there. Maybe you're trying to convince yourself it's not?
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
What stuff?

Back door optimisations. If it was something to improve image quality and speed, I'm sure every news website would be told about it and Will Robinson would have it in blinking red lights on the front page of the video forums.

You get what I'm saying?

No offence Will, just using you as an example :)
 

Scali

Banned
Dec 3, 2004
2,495
0
0
What Nvidia's version sounds like (to the average forum browser, that is - trust me when I say I'm not the only one) is:

Strictly speaking, AMD is breaking DirectX specs.
The software SPECIFICALLY requests an FP16 texture format. It gets an R11G11B10 format instead.
It is not exactly a 'performance tweak'; it's closer to what nVidia did with their shader replacement in the GeForce FX era... where the driver just did something completely different from what the application was telling it to do.
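To make that concrete, here's a minimal sketch (mine, not from the thread) of the kind of call a D3D9-era game like Far Cry makes when it sets up its HDR render target. The application asks for the 64-bit FP16 format by name; D3D9 doesn't even expose an R11G11B10 format, so any demotion to it can only happen silently inside the driver:

```cpp
#include <d3d9.h>

// Hypothetical helper: create the FP16 HDR render target the game asked for.
HRESULT CreateHdrTarget(IDirect3DDevice9* device, UINT width, UINT height,
                        IDirect3DTexture9** outTexture)
{
    // The application specifically requests 16-bit floating point per channel.
    return device->CreateTexture(width, height,
                                 1,                     // one mip level
                                 D3DUSAGE_RENDERTARGET,
                                 D3DFMT_A16B16G16R16F,  // FP16 HDR format
                                 D3DPOOL_DEFAULT,
                                 outTexture,
                                 nullptr);
}
```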
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
Read post 59, that's exactly what I'm saying. When Nvidia does it, they are wrong, and when ATI does it they are EQUALLY wrong.

Don't you agree?

It just so happens it's ATI's turn. :)


100% in agreement. But Nvidia having included a passage in their review guides, as mentioned in another post in this same topic, is what I'm referring to.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
8,348
9,730
136
Agreed, but this stuff just shouldn't happen, with either of these companies. :(

The one game AMD gets their a$$ handed to them in is Far Cry, so I tend to believe they used a little extra help. If it were a game where the cards were close, I might tend to doubt it.

That seals the deal for me.

- Why shouldn't it happen? ATI gives you the option to disable driver optimizations and improve image quality right there in CCC. They are simply providing an option that doesn't normally exist. Additionally, Cat AI is DISABLED on a fresh driver install, so I'm sure this isn't affecting anyone unless they go out of their way to enable it in the first place.

If you have a problem with this you must have a problem with NV exclusive physx, which is giving NV users higher image quality at the expense of performance.

It seems you deleted the rest of your post, so I'm assuming you discovered that they're talking about FC1.

[Edit: Oops nvm, but they're talking about FC1, which just about everything gets a bajillion frames on nowadays]
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Back door optimisations. If it was something to improve image quality and speed, I'm sure every news website would be told about it and Will Robinson would have it in blinking red lights on the front page of the video forums.

You get what I'm saying?

No offence Will, just using you as an example :)

Except it's not backdoor; it's out in the open (as a feature) for all to know about, and to disable if they wish.
Backdoor would be if they did it and didn't tell anyone and gave no way to disable it while pretending it wasn't really there.
Cat AI is an optional feature which has a specific goal, and they haven't tried to hide the concept.
They may not spell out every little thing it does or how it does it, but if you don't like the idea, then like I said, disable it and email websites asking them to discuss it/disable it for their testing.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
What else can nVidia do? Simply accept using what was called for by the application while their competitor doesn't? Not play by the same rules and not bench apples-to-apples? I would wager that this reduced precision wasn't in the AMD reviewer's guide, and that AMD knew they were benching with less precision against nVidia's full precision. Kinda disingenuous.
 

crislevin

Member
Sep 12, 2010
68
0
0
too intense!!!!!!!!!!!!!!

It's not a new argument, and it's not a meaningful argument.

I can only imagine how intense you guys will be when a real problem pops up.

Calm down...

What else can nVidia do? Simply accept using what was called for by the application while their competitor doesn't? Not play by the same rules and not bench apples-to-apples? I would wager that this reduced precision wasn't in the AMD reviewer's guide, and that AMD knew they were benching with less precision against nVidia's full precision. Kinda disingenuous.

Isn't nVidia doing the same thing? Or am I missing something here?
 

HurleyBird

Platinum Member
Apr 22, 2003
2,812
1,550
136
I don't like their methodology. Disabling/enabling Catalyst AI is going to change a crapload of optimizations, and it's not a precise way to test for any one optimization. The performance increase may or may not be due to FP16 demotion, and the artifacts seen with Cat AI on may or may not be due to FP16 demotion either. The fact that the only title to display noticeable differences is ancient might just be down to small bugs for older games making their way into newer drivers.

If possible, it would be best to test with newer games and find a way to force the original FP16 rendering without affecting other enhancements. Then, if there is a noticeable IQ difference, call on AMD to stop using demotion for the titles that display it.

Also, it seems Nvidia should immediately start using FP16 demotion themselves in DOW ;)

And then there's the complete irony of a company that a year ago lied through its teeth to the entire world about when Fermi was going to launch having the gall to lecture anybody about minor alleged IQ differences coming from a specific performance optimization they themselves used to endorse.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I'm assuming you discovered that they're talking about FC1.

The whole basis of my argument was Far Cry 2. Far Cry 1? Who f'en cares. :)


Sorry for wasting your time. :)

Good day guys.

Edit: but this I do agree with, and it is a very good point:

I don't like their methodology. Disabling/enabling Catalyst AI is going to change a crapload of optimizations, and it's not a precise way to test for any one optimization.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
So let me get this straight. After the Batman AA fiasco, NV is accusing ATI of cheating? Gimme a break!
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Isn't nVidia doing the same thing? or am I missing something here?

No.
They supplied a special tool that enables some hidden functionality in the 260 drivers to apply the same hack that AMD applied (silent R16G16B16A16->R11G11B10 downgrading).
With normal use, nVidia's drivers always just use R16G16B16A16 as requested.

Of course nVidia may do some *other* things for certain games (shader replacement etc.), and they may have done things like this in the past. But in this specific example, no, they are not doing that.
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
Agreed, but this stuff just shouldn't happen, with either of these companies. :(

The one game AMD gets their a$$ handed to them in is Far Cry, so I tend to believe they used a little extra help. If it were a game where the cards were close, I might tend to doubt it.

That seals the deal for me.

Wait, what?? Do you really think AMD was so worried about the performance comparisons in a six-year-old game that they sneakily threw some optimizations in there? 200fps just wasn't enough?

I honestly don't think Far Cry is really on anyone's radar anymore (even though it still is an awesome game). Are there any other games where the graphics took a hit?
 

brybir

Senior member
Jun 18, 2009
241
0
0
I don't understand what the fuss is about. I read this:

"The alleged "optimization" is the selective use of the HDR format R11G11B10 at times when the memory cost of the FP16 HDR format would otherwise impact game play." However, the ATI/AMD response goes on to explain that NVIDIA don't have a problem with this:

Given that in their own documents, NVIDIA indicates that the R11G11B10 format "offers the same dynamic range as FP16 but at half the storage", it would appear to us that our competitor shares the conviction that R11G11B10 is an acceptable alternative.





So my takeaway is that it is part of the driver's gameplay optimizations, i.e. that little slider in CCC where you can say how much of an impact on FPS you want before image quality is degraded. I fail to see how this is cheating if it's out in the open. Even I knew that slider did "something", and knowing that extra quality comes at the price of performance, it was logical to assume the driver was indeed making choices somewhere, apparently such as the one in the article.

If you ask me, if you get no perceivable differences, then send in the FP16 all day long!

/shrug
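For a rough sense of what "half the storage" buys (my back-of-the-envelope numbers, not AMD's or NVIDIA's), here is the footprint of a single 1920x1080 HDR render target in each format:

```cpp
#include <cstdio>

int main() {
    const long long pixels      = 1920LL * 1080LL;
    const long long fp16Bytes   = pixels * 8;  // R16G16B16A16_FLOAT: 4 x 16 bits = 8 bytes/pixel
    const long long packedBytes = pixels * 4;  // R11G11B10_FLOAT: 32 bits = 4 bytes/pixel

    std::printf("FP16 target:      %.1f MB\n", fp16Bytes   / (1024.0 * 1024.0));
    std::printf("R11G11B10 target: %.1f MB\n", packedBytes / (1024.0 * 1024.0));
    // Halving the footprint also halves the bandwidth spent reading and writing
    // the target every frame, which is where the frame-rate gain comes from.
    return 0;
}
```

That works out to roughly 15.8 MB versus 7.9 MB per target, before counting any extra HDR buffers the engine keeps around.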
 

crislevin

Member
Sep 12, 2010
68
0
0
No.
They supplied a special tool that enables some hidden functionality in the 260 drivers to apply the same hack that AMD applied (silent R16G16B16A16->R11G11B10 downgrading).
With normal use, nVidia's drivers always just use R16G16B16A16 as requested.

Of course nVidia may do some *other* things for certain games (shader replacement etc.), and they may have done things like this in the past. But in this specific example, no, they are not doing that.
Thanks for informing me.

So does the ATI card do this by default (like if I buy a new card and plug it in)? Hence the problem nVidia is complaining about?
 

Scali

Banned
Dec 3, 2004
2,495
0
0
"The alleged "optimization" is the selective use of the HDR format R11G11B10 at times when the memory cost of the FP16 HDR format would otherwise impact game play." However, the ATI/AMD response goes on to explain that NVIDIA don't have a problem with this:

Given that in their own documents, NVIDIA indicates that the R11G11B10 format "offers the same dynamic range as FP16 but at half the storage", it would appear to us that our competitor shares the conviction that R11G11B10 is an acceptable alternative.

I already responded to that earlier.
What nVidia says is a fact: the dynamic range of R16G16B16A16 is the same as R11G11B10.
However, the *precision* is not.
Aside from that, nVidia's document was meant to advise the developer to make the best possible decision for their situation (in some cases it IS an acceptable alternative, but in others it is not). You cannot compare that with drivers that 'know better' than what the developers specifically request.

So AMD is being very misleading here.
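To put the dynamic-range-versus-precision distinction in numbers, here's a small sketch (my own, based on the standard bit layouts of IEEE half floats and the packed R11G11B10 format, not anything from the thread) that prints the largest representable value and the worst-case relative rounding step for each channel type:

```cpp
#include <cmath>
#include <cstdio>

struct MiniFloat {
    const char* name;
    int  exponentBits;  // sets the dynamic range
    int  mantissaBits;  // sets the precision
    bool hasSign;
};

int main() {
    const MiniFloat formats[] = {
        {"FP16 channel of R16G16B16A16_FLOAT", 5, 10, true},
        {"R/G channel of R11G11B10_FLOAT",     5,  6, false},
        {"B channel of R11G11B10_FLOAT",       5,  5, false},
    };

    for (const MiniFloat& f : formats) {
        // Largest finite value: (2 - 2^-mantissa) * 2^bias, with bias = 2^(e-1) - 1.
        const double maxExp   = std::pow(2.0, (1 << (f.exponentBits - 1)) - 1);
        const double maxValue = (2.0 - std::pow(2.0, -f.mantissaBits)) * maxExp;
        // Worst-case relative rounding step, set by the mantissa width.
        const double relStep  = std::pow(2.0, -f.mantissaBits);

        std::printf("%-36s max ~%6.0f, relative step ~%.2f%%, %s\n",
                    f.name, maxValue, relStep * 100.0,
                    f.hasSign ? "signed" : "unsigned only");
    }
    return 0;
}
```

All three channel types top out around 65,000 (same dynamic range), but the packed channels round roughly 16-32x more coarsely than FP16 and cannot store negative values or an alpha channel, which is exactly the precision point above.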