Nvidia accuses ATI/AMD of cheating - Benchmarks


Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
Can't believe the usual 3 or so fanboys keep defending Nvidia and somehow think everyone else posting in this forum is wrong, lol

Do you really think there's some conspiracy going on? Because it's not just the AT forum... Read the comments on the article itself: everyone agrees with the point we (as in the majority) are making. Nvidia is the stupid one for making a fuss about something that has been known (and done) for years.

But the best part is that this is all based on ancient DX9 games, which already get ridiculous fps on any recent card. It really is pathetic.


Speak ye not ill of Nvidia!

thermi.jpg
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Quite a lot of talk despite one simple thing: Catalyst AI has never once been found by anyone to degrade image quality in any game, in any scenario, on any card. It has only been shown to improve performance. Nvidia's optimizations, on the other hand, have shown severe degradation of image quality in numerous titles over the years. Nvidia telling reviewers to make their AMD cards score lower is just typical Nvidia nonsense. Nvidia hasn't had a shred of credibility in anything they've said for a long time; I don't know why anyone would listen to them now.
 
Last edited:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Quite a lot of talk despite one simple thing: Catalyst AI has never once been found by anyone to degrade image quality in any game, in any scenario, on any card. It has only been shown to improve performance. Nvidia telling reviewers to make their AMD cards score lower is just typical Nvidia.
FWIW, in the article there is a definite image quality drop in Far Cry. Overall, I couldn't care less though, as it's great in the other 99.9% of games.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
From what I can tell, the images are nearly identical, and the tiny differences can be attributed to the screenshots being taken from slightly different locations.
 

dust

Golden Member
Oct 13, 2008
1,328
2
71
Quite a lot of talk despite one simple thing: Catalyst AI has never once been found by anyone to degrade image quality in any game, in any scenario, on any card. It has only been shown to improve performance. Nvidia's optimizations, on the other hand, have shown severe degradation of image quality in numerous titles over the years. Nvidia telling reviewers to make their AMD cards score lower is just typical Nvidia nonsense. Nvidia hasn't had a shred of credibility in anything they've said for a long time; I don't know why anyone would listen to them now.

I guess they have to use everything they can to catch up with the DX11 hardware sales lead ATI has built. They constantly do similar if not worse stuff, yet have the cheek to cry foul when their crybaby division finds an irregularity, no matter how small.

If anything, it gave everyone a chance to raise the post count. Thanks Nvidia!
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Can't believe the usual 3 or so fanboys keep defending Nvidia

I can't believe this troll thread was not locked hours ago.
As soon as I realized it was Far Cry 1, this thread became irrelevant for me.

The question is, why are the same 7 or 8 ATI fanboys still defending something not worth defending?

If anything, it gave everyone a chance to raise the post count. Thanks Nvidia!

I like this and agree.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Man, Nvidia must be desperate and worried about AMD's 6 series to come up with this crap.

This reminds me of Rollo crying and calling foul on AMD/ATI in Far Cry 2 with his 'screenshots' 'proving' degraded image quality.

Nvidia needs to stick to what they do best currently, slashing prices on all their cards :cool:

Degrading visuals to get more performance, unless specifically chosen by the user, is never a good thing.

To be honest, and as others have noted, both companies have such high image quality nowadays that aside from visual bugs they're almost indistinguishable.

My actual problem is that nVidia, as well as AMD, has pulled similar shenanigans and is being hypocritical in accusing AMD of "cheating." Pot meet Kettle.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
NV is likely to play catch up in DX11 performance when the higher end HD6000 series launches. But when it comes to current DX11 performance hardware, ATI is not even in the same league. I am too tired to link benchmarks in every thread, trying to convince people otherwise who cling to 6-month-old reviews from when the GTX 4xx series had just launched.

GF100 (mostly the GTX 480, but including the 470 and 465 as well) is really only "in a different league" than the HD5000 series cards in DX 11 titles. The rest of the Fermi lineup, not so much.

See my post here:
http://forums.anandtech.com/showthread.php?p=30364966&highlight=#post30364966

The HD5850 sits right in between the GTX 460 and 470 in DX 11 titles.
Summary across DX 11 titles:
GTX 460: 100%
HD 5850: 109%
GTX 470: 116%
HD 5870: 124%
GTX 480: 153%

And here's another review:
http://techpowerup.com/reviews/NVIDIA/GeForce_GTS_450_SLI/3.html

Summarizing the DX11 titles:

AvP: GTS 450 on par with the HD 5750. HD 5850 between the GTX 460 and 470; HD 5870 slightly above the GTX 470. Only the GTX 480 seems to be "in a different league".

BC2: GTS 450 slower than the HD 5700s. HD 5850 between the GTX 460 and 470; HD 5870 between the GTX 470 and 480.

Battleforge: GTS 450 performing between the HD 5700s. HD 5800s in between the GTX 460 and 470. GTX 480 seems to have a nice boost over any other single card solution.

Metro: GTS 450 on par with the HD 5700 cards. The 5800 cards perform between the GTX 460 and 470. The GTX 470 seems superior, and yet again the GTX 480 is the big outlier for single cards.

So, in summary, I don't see it as valid to say the entire GTX 400 lineup is "in another league" in DX11 titles compared to the Radeon competitors. That label really only seems to apply to the GTX 470 and 480, especially the 480, which seems to have a bigger lead over the other cards in DX11 games than it has elsewhere. But the GTS 450 and GTX 460 aren't anything special in DX11 games, from what I'm seeing. Or, in other words, those cards don't seem to leapfrog in performance past their price bracket (like the GTX 470 does to the HD 5870 in some cases).
 
Last edited:

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
And here's another review:
http://techpowerup.com/reviews/NVIDIA/GeForce_GTS_450_SLI/3.html

Summarizing the DX11 titles:

AvP: GTS 450 on par with the HD 5750. HD 5850 between the GTX 460 and 470; HD 5870 slightly above the GTX 470. Only the GTX 480 seems to be "in a different league".

BC2: GTS 450 slower than the HD 5700s. HD 5850 between the GTX 460 and 470; HD 5870 between the GTX 470 and 480.

Battleforge: GTS 450 performing between the HD 5700s. HD 5800s in between the GTX 460 and 470. GTX 480 seems to have a nice boost over any other single card solution.

Metro: GTS 450 on par with the HD 5700 cards. The 5800 cards perform between the GTX 460 and 470. The GTX 470 seems superior, and yet again the GTX 480 is the big outlier for single cards.

So, in summary, I don't see it as valid to say the entire GTX 400 lineup is "in another league" in DX11 titles compared to the Radeon competitors. That label really only seems to apply to the GTX 470 and 480, especially the 480, which seems to have a bigger lead over the other cards in DX11 games than it has elsewhere. But the GTS 450 and GTX 460 aren't anything special in DX11 games, from what I'm seeing. Or, in other words, those cards don't seem to leapfrog in performance past their price bracket (like the GTX 470 does to the HD 5870 in some cases).
Yeah, but that isn't even the case. The 5870 nips at the heels of the GTX 480 in BF: BC2 (a DX11 game). You're seeing architectural differences here, so I wouldn't make generalizations about performance until they're considered.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Okay, I see what you're saying now about this. It's one thing for a driver to be monitoring code to ensure that things work the way they need to work

DirectX and OpenGL are actually designed to work this way.
In the OpenGL case, the entire compiler is inside the driver, so you pass a C-like text-file of source code directly to the driver, and the driver has complete control over how it compiles and optimizes the code. It doesn't matter at all how it does it, as long as it produces results that are within the OpenGL specs.

DirectX is slightly different, but the effect is the same. In their case, Microsoft supplies the compiler, and it outputs an intermediate bytecode file, very similar to how Java and .NET work.
The driver will then take this bytecode and compile and optimize it for the underlying hardware.
Basically, the advantage of this method is that there are no source-level compatibility issues... and you can cut some of the runtime overhead by shipping pre-compiled bytecode with your application, so the driver doesn't have to parse and optimize raw text files at all.
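
To make the two paths concrete, here's a minimal sketch (mine, not from any particular engine, and using D3D11 purely for illustration) of how a shader reaches the driver in each API. It assumes a current OpenGL context and an existing ID3D11Device; the GLSL/HLSL source strings, entry point and target names are placeholders, and error handling is omitted:

Code:
#include <glad/glad.h>      // any GL loader exposing GL 2.0+ will do
#include <d3d11.h>
#include <d3dcompiler.h>
#include <cstring>

// OpenGL: the driver receives raw GLSL text and owns the entire
// compile/optimize step, so it is free to rewrite the code however it
// likes, as long as the results stay within spec.
GLuint CompileGlslFragment(const char* glslSource)
{
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &glslSource, nullptr);
    glCompileShader(shader);   // vendor compiler runs here, inside the driver
    return shader;
}

// Direct3D: Microsoft's compiler produces intermediate bytecode first;
// the driver then translates and optimizes that bytecode for the actual
// GPU when the shader object is created.
ID3D11PixelShader* CompileHlslPixel(ID3D11Device* device, const char* hlslSource)
{
    ID3DBlob* bytecode = nullptr;
    ID3DBlob* errors = nullptr;
    D3DCompile(hlslSource, std::strlen(hlslSource), nullptr, nullptr, nullptr,
               "main", "ps_4_0", 0, 0, &bytecode, &errors);  // front end: HLSL -> bytecode
    ID3D11PixelShader* shader = nullptr;
    device->CreatePixelShader(bytecode->GetBufferPointer(),
                              bytecode->GetBufferSize(),
                              nullptr, &shader);             // back end: bytecode -> GPU code
    return shader;
}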

So in my example before, it is the case then that the software is requesting 16-bit precision, and the hardware can execute that 16-bit precision just fine, but the driver is then dropping the precision just to inflate the performance numbers, at a sacrifice of quality that neither the developer nor the customer is necessarily aware of.

Exactly. End-users and reviewers are so used to just 'turning up all the settings' that they generally run everything with AI on, but they aren't fully aware of the impact it may have.
There is nothing wrong with that in and of itself... I mean, I've always used AI at the highest settings on my Radeons. I think they are generally a good trade-off between performance and image quality. Likewise I always run my GeForces with AF optimizations and such on.

However, when you want to benchmark against nVidia's products, I think nVidia is right to point out this AI issue and to insist on apples-to-apples comparisons.

It's not exactly a new trick either, I'd like to add.
As soon as the first cards arrived with S3TC, drivers quickly appeared which would 'magically' enable texture compression and give massively increased texturing bandwidth in applications that had no support for texture compression whatsoever.
I know that the FutureMark guys were looking at one of those cards in the multitexturing benchmark and were going "Hmm... hang on a minute... the scores this card is getting are physically impossible, it just doesn't have that much fillrate."
Then they figured out what was going on: the driver had replaced their 32-bit textures with a 16-bit compressed version, which greatly reduced memory bandwidth usage and allowed the card to reach inflated fillrate scores, beyond its theoretical fillrate peak, because the score was still calculated on the assumption that the textures were 32-bit.
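
As a rough back-of-the-envelope illustration (my numbers, purely hypothetical, not FutureMark's): if the multitexturing test is limited by memory bandwidth, silently swapping 32-bit textures (4 bytes per texel) for S3TC/DXT1-compressed ones (effectively about 4 bits per texel) cuts the bandwidth cost per texel by 8x, so the measured texel throughput can land far above what uncompressed textures would ever allow on that card:

Code:
#include <cstdio>

int main()
{
    // Purely illustrative numbers for a card of that era -- an assumption,
    // not measured data.
    const double memBandwidthGBs   = 2.9;   // memory bandwidth in GB/s
    const double bytesPerTexel32   = 4.0;   // uncompressed 32-bit texture
    const double bytesPerTexelDXT1 = 0.5;   // S3TC/DXT1: ~4 bits per texel

    // If the test is bandwidth-bound, achievable texel rate is roughly
    // bandwidth / bytes fetched per texel (ignoring caching and overdraw).
    double mtexels32   = memBandwidthGBs * 1e9 / bytesPerTexel32   / 1e6;
    double mtexelsDXT1 = memBandwidthGBs * 1e9 / bytesPerTexelDXT1 / 1e6;

    std::printf("32-bit textures : ~%.0f MTexels/s\n", mtexels32);
    std::printf("DXT1 textures   : ~%.0f MTexels/s\n", mtexelsDXT1);
    // A driver that silently switches formats makes the second number show
    // up in a benchmark that believes it is still measuring the first case.
    return 0;
}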
 
Last edited:

MegaWorks

Diamond Member
Jan 26, 2004
3,819
1
0
Oh no! AMD users, quickly remove your video card from your rig and proceed to Newegg and buy the Mighty Morphin Fermi Ranger!!! It will give you mighty PAWERR!!!

PowerLoader.jpg


Regardless of the article I would still buy AMD video cards. :)
 

wanderer27

Platinum Member
Aug 6, 2005
2,173
15
81
Got to give the OP credit: a finely executed post to incite trolls and open the door to another 20-page pile of crap. This is as useless a subject as the Mafia 2 PhysX on/off thread.

With some opinion piece from some site that reviews energy drinks as much as they do hardware :rolleyes:

Keep spinning, NV. I can't wait to see the CrapX and BloatVision(tm) threads springing up in the next few months and the fanboys howling about 'features' while AMD blows the doors off NV again for another half-year stretch.

There were actually a few sites reporting this, but they pretty much all ended up at the link I posted - that's why I used it.


I was primarily just passing this along as more of an FYI, as both the Green and Red Teams are, or have been, guilty of this.

Both Far Cry 1 and Oblivion (the torches) show IQ changes as a result of the optimizations.


Look, when I buy a video card I just want it to work as well as possible and to not alter the intended graphical display.

Put the darn picture on the monitor as fast and as nice-looking as possible - that's it.

If I want more FPS, I turn a feature down or off; if I want better IQ, I turn things up.
There are probably a lot more tweaks available that way than through some generic optimization algorithm.

Again, both sides are guilty of this - I had really thought they'd both matured and gotten away from this kind of thing :rolleyes:
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Maybe we should just have one thread like this so that the rabid fanboyism can stay out of the rest of this sub-forum. :)
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Maybe we should just have one thread like this so that the rabid fanboyism can stay out of the rest of this sub-forum. :)

Agreed...every sub-forum has a need for a sanctioned Zion.

Now then...what are you doing outside of your pod?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Still defending Nvidia as always. Actually, Keysplayr, what Nvidia does best is spread a lot of FUD and crap over the years; I wonder why I sometimes use Nvidia products.

I disagree that Nvidia makes better products. A lot of times they are late to the game and some of their hardware is overhyped, but you would never admit it, would you?

What does this have to do with the subject? Me defending Nvidia? Why not? I think they have superior products, so why wouldn't I?
Being late to the game doesn't make the product inferior, it just makes it late. Products are always overhyped no matter which company is pushing them.
So, don't make your posts about me. Make them about the subject.
 

Mem

Lifer
Apr 23, 2000
21,476
13
81
What does this have to do with the subject? Me defending Nvidia? Why not? I think they have superior products, so why wouldn't I?
Being late to the game doesn't make the product inferior, it just makes it late. Products are always overhyped no matter which company is pushing them.
So, don't make your posts about me. Make them about the subject.

Sure, it's about you and anybody else that posts, since we all have the right to disagree with any member here and vice versa. Nvidia, as we know, will do anything to get the upper hand or make AMD look bad. Nvidia have been involved in their own similar situations in the past; personally, I think they have all been overblown.

Should we take it seriously? Hell no, it's just benchmarks. Nvidia should concentrate more on their own hardware and drivers and ignore silly things like AMD benchmarks; no gamer will take this sort of thing seriously.

Don't Nvidia have better things to do?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
Sure, it's about you and anybody else that posts, since we all have the right to disagree with any member here and vice versa. Nvidia, as we know, will do anything to get the upper hand or make AMD look bad. Nvidia have been involved in their own similar situations in the past; personally, I think they have all been overblown.

Should we take it seriously? Hell no, it's just benchmarks. Nvidia should concentrate more on their own hardware and drivers and ignore silly things like AMD benchmarks; no gamer will take this sort of thing seriously.

Don't Nvidia have better things to do?

You're absolutely right! You have every right to disagree with my opinion. But it's not ok to say what you think of me for having my opinion. See the difference? No more posts about this from me. I was being nice.
 

JimmiG

Platinum Member
Feb 24, 2005
2,024
112
106
I'm pretty sure the AMD QA team has access to more accurate analysis tools than those used in that article.

For example, this:
That inside the town of Bravil we had the same torch flame height difference. Does this mean that the extra half-frame performance boost comes from a slight reduction in flame height? It would appear so!

Of course the height will be different. The flame is *animated*, so it will rise and fall, shrink and swell. To compare, you need to capture the shot at the exact same frame of the torch animation.

I'm pretty sure the jagged water in Far Cry is also a result of frames being captured at different stages of the water animation.

After all, why would reduced HDR precision cause flames to be shorter, or water/land edges to be jagged?