Why does nvidia cheat so much?

Page 20 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

mosox

Senior member
Oct 22, 2010
434
0
0
Hawx2, Mafia2 and Crysis2 are made by three different developers. All of 'em got lazy with their DX11 stuff so the AMD cards behave worse. It's a new disease that affects only the TWIMTBP developers, the main symptom is the tessellation maniacal disorder. The non TWIMTBP developers are not affected, they must have a genetic immunity.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
There are members here who get free kit through AMD channels and they do the same thing that the Old AEG Focus Group did - and much worse.
You're going to have to back that up.

Back to the original question. Nvidia "cheats" because it works. Nvidia is not stupid, as long as their highly questionable tactics give them an advantage (and they get away with it), they will continue to do it.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Hawx2, Mafia2 and Crysis2 are made by three different developers. All of 'em got lazy with their DX11 stuff so the AMD cards behave worse. It's a new disease that affects only the TWIMTBP developers, the main symptom is the tessellation maniacal disorder. The non TWIMTBP developers are not affected, they must have a genetic immunity.

Those titles use tessellation. AMD cards are not competitive with Nvidia right now in this performance metric. This isn't rocket science or a conspiracy. Theoretical benchmarks said AMD would not compete with Nvidia in tessellation. Games come out using tessellation and AMD cards perform worse. Simply amazing...
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
Hawx2, Mafia2 and Crysis2 are made by three different developers. All of 'em got lazy with their DX11 stuff so the AMD cards behave worse. It's a new disease that affects only the TWIMTBP developers, the main symptom is the tessellation maniacal disorder. The non TWIMTBP developers are not affected, they must have a genetic immunity.

Who said anything about HAWX 2 and Mafia 2 having bad DX11 implementations? Both of those games run at very playable framerates anyway, so what's the big deal if Nvidia cards get higher FPS when both are playable?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
You're going to have to back that up.

Back to the original question. Nvidia "cheats" because it works. Nvidia is not stupid, as long as their highly questionable tactics give them an advantage (and they get away with it), they will continue to do it.

He backed up his claim better than most of the other claims in this thread....eye of the beholder, eh?
 

Mistwalker

Senior member
Feb 9, 2007
343
0
71
A game engine programmer (dunno99) had this to say about Crysis 2:

http://forums.anandtech.com/showpost.php?p=32307833&postcount=219

I asked if anyone had any technically based rebuttal to it, but nope, all went quiet re Crysis 2 after his post.
Thank you for pointing out the specific post.

To be fair it seems like he is simply offering his own theory to address some of the points being discussed, and while it does make sense to me, I would hardly call it a "debunking" of any other possibilities. Unless I'm not comprehending the details, the need to tessellate the ocean in the first place hasn't really been explained, for example.

Honestly I think the discussion died because theorizing is all that could be accomplished in that thread, at the end of the day.

I just read that thread on ABT.

LOL. Anyone here still want to defend Rollo's behavior?
I don't think anyone here is defending Rollo, per se...
 

mosox

Senior member
Oct 22, 2010
434
0
0
Those titles use tessellation.

A lot of games use tessellation, how come only the TWIMTBP ones have very different results?

Who said anything about HAWX 2 and Mafia 2 having bad DX11 implementation? Both of those games play at very playable framerates anyways, so whats the big deal if Nvidia cards get higher FPS if both are playable?

How is it not bad implementation when the 560 Ti beats the HD 6970, unlike in every single non-TWIMTBP DX11 game out there? It is a big deal, since everyone sees the Nvidia cards beating their AMD counterparts in X games out of Y total, or sees it in the aggregate performance summary that includes all the tested games.
 
Last edited:

Genx87

Lifer
Apr 8, 2002
41,091
513
126
A lot of games use tessellation, how come only the TWIMTBP ones have very different results?

Because they use more tessellation obviously. AMD cards choke on higher tessellated scenes. We already went over this in this thread.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Thank you for pointing out the specific post.

To be fair it seems like he is simply offering his own theory to address some of the points being discussed, and while it does make sense to me, I would hardly call it a "debunking" of any other possibilities. Unless I'm not comprehending the details, the need to tessellate the ocean in the first place hasn't really been explained, for example.

Honestly I think the discussion died because theorizing is all that could be accomplished in that thread, at the end of the day.

I don't think anyone here is defending Rollo, per se...

Actually the tessellated ocean HAS been explained...by Seero.
In short, Crysis 1 has the same "invisible" ocean as Crysis 2.
In Crysis 2 it gets tessellated, but that doesn't waste a lot of computing power, as tessellation can, just like SM3.0, be "instanced"...the calculations copied.

The only ones claiming foul play in Crysis are people ignorant about programming.
Hence no source code, no technical explanations...only "NVIDIA killed my dog...and ate it too...BWAAAAAAAAAuuuu"
 

KCfromNC

Senior member
Mar 17, 2007
208
0
76
Thank you for pointing out the specific post.

To be fair it seems like he is simply offering his own theory to address some of the points being discussed, and while it does make sense to me, I would hardly call it a "debunking" of any other possibilities. Unless I'm not comprehending the details, the need to tessellate the ocean in the first place hasn't really been explained, for example.

Nor does it explain the need for tessellation on a perfectly flat surface (even nVidia focus group members thought this was suspicious). But it is interesting nevertheless.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Hawx2, Mafia2 and Crysis2 are made by three different developers. All of 'em got lazy with their DX11 stuff so the AMD cards behave worse. It's a new disease that affects only the TWIMTBP developers, the main symptom is the tessellation maniacal disorder. The non TWIMTBP developers are not affected, they must have a genetic immunity.

Here is another case where the DX11 hit was less on Nvidia cards. Codemasters works with AMD, as evidenced by their games being used in AMD advertising.
http://www.hardocp.com/article/2011/09/26/f1_2011_gameplay_performance_review/6

DirectX 9 vs. DirectX 11


F1 2011 has a DirectX 11 renderer as well as a DirectX 9 renderer. Unlike previous games with the Ego engine, F1 2011 allows us to select "Ultra" settings in DX9 and DX11 alike. So of course, we wanted to see what the performance difference is.



The GeForce GTX 580 was 12.5% slower when running F1 2011 in DX11 than in DX9, indicating the extra DX11 effects are causing a greater burden on GPU cycles.


The Radeon HD 6970 was a whopping 18.8% faster in DX9 than in DX11. It seems the GTX 580 is indeed more efficient, in terms of performance, in this game when running the DX11 shader effects.
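One wrinkle in reading those two numbers: they are quoted in opposite directions ("faster in DX9" for one card, "slower in DX11" for the other). A quick conversion (my arithmetic, not [H]ardOCP's) puts them on the same footing:

```python
# The quoted percentages are measured in opposite directions, so a quick
# conversion puts them on the same footing. Illustrative arithmetic only.

def dx11_slowdown(dx9_speedup_pct):
    """Convert 'X% faster in DX9' into 'Y% slower in DX11'."""
    ratio = 1 + dx9_speedup_pct / 100      # dx9_fps / dx11_fps
    return (1 - 1 / ratio) * 100

def dx9_speedup(dx11_slowdown_pct):
    """Convert 'X% slower in DX11' into 'Y% faster in DX9'."""
    ratio = 1 - dx11_slowdown_pct / 100    # dx11_fps / dx9_fps
    return (1 / ratio - 1) * 100

print(f"HD 6970: 18.8% faster in DX9 = {dx11_slowdown(18.8):.1f}% slower in DX11")
print(f"GTX 580: 12.5% slower in DX11 = {dx9_speedup(12.5):.1f}% faster in DX9")
```

On that common footing the gap is about 15.8% vs. 12.5% (or 18.8% vs. 14.3%), so the 6970's extra DX11 hit is real but slightly smaller than the raw quotes make it look.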



 

mosox

Senior member
Oct 22, 2010
434
0
0
AMD cards choke on higher tessellated scenes.

Nope, AMD chokes only on subpixel tessellation, which produces a bottleneck only on the AMD cards and is useless for image quality. You can 8-pixel tessellate anything you want, kitchen sink included, and an AMD card won't choke. Only extreme (and useless) tessellation chokes the AMD cards, and Nvidia uses exactly that.
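The scaling behind that claim can be sketched with toy numbers (illustrative only; real engines use adaptive factors and non-square patches): triangle count grows with the square of the tessellation factor, so once triangles shrink below roughly a pixel, raising the factor multiplies geometry cost without adding visible detail.

```python
# Rough illustration: triangles produced by uniformly tessellating one quad
# patch, vs. the on-screen pixels it covers. Toy numbers, not engine data.

def triangles(tess_factor):
    """A quad patch uniformly tessellated at factor N yields ~2*N*N triangles."""
    return 2 * tess_factor * tess_factor

patch_pixels = 64 * 64  # assume the patch covers a 64x64-pixel screen area

for n in (8, 16, 32, 64):
    tris = triangles(n)
    px_per_tri = patch_pixels / tris
    print(f"factor {n:2d}: {tris:5d} tris, ~{px_per_tri:6.1f} px/triangle")
```

At factor 8 each triangle still covers ~32 pixels; at factor 64 it covers half a pixel, which is the "subpixel" regime being argued about.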
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Nope, AMD chokes only on subpixel tessellation, which produces a bottleneck only on the AMD cards and is useless for image quality. You can 8-pixel tessellate anything you want, kitchen sink included, and an AMD card won't choke. Only extreme (and useless) tessellation chokes the AMD cards, and Nvidia uses exactly that.

That is false.

AMD already chokes at 16x tessellation in Crysis 2...even if they introduce visual artifacting.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Nope, AMD chokes only on subpixel tessellation, which produces a bottleneck only on the AMD cards and is useless for image quality. You can 8-pixel tessellate anything you want, kitchen sink included, and an AMD card won't choke. Only extreme (and useless) tessellation chokes the AMD cards, and Nvidia uses exactly that.

It isn't useless for graphics quality. Quit gulping down the AMD blog propaganda.
Next year, when AMD is capable of not choking on highly tessellated scenes, they will change their tune.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Nope, AMD chokes only on subpixel tessellation, which produces a bottleneck only on the AMD cards and is useless for image quality. You can 8-pixel tessellate anything you want, kitchen sink included, and an AMD card won't choke. Only extreme (and useless) tessellation chokes the AMD cards, and Nvidia uses exactly that.

HAWX2 uses 6xY pixel-wide triangles.

Nor does it explain the need for tessellation on a perfectly flat surface (even nVidia focus group members thought this was suspicious). But it is interesting nevertheless.

You use tessellation on a "perfectly flat surface" because surfaces are not always flat.

How do you think Tessellation is working?
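The point being gestured at can be sketched with toy numbers (a minimal illustration, not Crytek's actual pipeline): a surface that is flat in the control mesh gets tessellated precisely so a displacement (height) function can push the new vertices around afterwards.

```python
import math

# Sketch of why a "flat" surface gets tessellated: the flat mesh is just the
# control cage. After subdivision, a displacement (height) function moves the
# new vertices, producing detail the original flat geometry never had.

def tessellate_edge(x0, x1, factor):
    """Split one edge into `factor` segments, returning the new vertex xs."""
    return [x0 + (x1 - x0) * i / factor for i in range(factor + 1)]

def displace(xs, height):
    """Apply a height function to each tessellated vertex."""
    return [(x, height(x)) for x in xs]

# A toy heightmap: small waves, as a water-surface shader might sample.
waves = lambda x: 0.1 * math.sin(8 * x)

flat = displace(tessellate_edge(0.0, 1.0, 4), lambda x: 0.0)
bumpy = displace(tessellate_edge(0.0, 1.0, 32), waves)

print(len(flat), len(bumpy))      # 5 vs 33 vertices
print(max(y for _, y in bumpy))   # non-zero: the surface is no longer flat
```

Without the tessellation step there are no extra vertices for the displacement to move, which is the usual argument for tessellating geometry that starts out flat.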
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Nor does it explain the need for tessellation on a perfectly flat surface (even nVidia focus group members thought this was suspicious). But it is interesting nevertheless.

Come again? Suspicious? Too far. You need to knock that off, my friend. (Actually, no you don't. It serves my arguments when you do stuff like this.) Put up a link to ANY source showing a focus group member, myself or others, thinking the tessellation of flat surfaces, or anywhere really, is "suspicious".

How does that sneaker rubber taste? Or are you a wingtip type of guy? :D
 
Last edited:

mosox

Senior member
Oct 22, 2010
434
0
0
That is false.

AMD already chokes at 16x tessellation in Crysis 2...even if they introduce visual artifacting.

ONLY in Crysis 2; that's a "feature" of the game, not of the AMD cards, which run 16x just fine in every other game. Makes one wonder if it's not a new Nvidia "tweak" for their latest TWIMTBP game.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
ONLY in Crysis 2; that's a "feature" of the game, not of the AMD cards, which run 16x just fine in every other game. Makes one wonder if it's not a new Nvidia "tweak" for their latest TWIMTBP game.

What has AMD said about it?
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
It's not nVidia's fault that AMD's tessellation is not as robust. Why should their vision and their customers have to wait for AMD? AMD's answer is to allow their customers to lower quality for performance. Personally I desire more than this: I desire to see more robust tessellation next round, and a commitment to adding a lot of it for their customers as well. I desire to see AMD be more of an aggressor and a predator, because the whining is self-defeating.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Nvidia blatantly cheats in multiple 3DMark benchmarks to artificially boost their scores.
http://www.futuremark.com/pressroom/companypdfs/3dmark03_audit_report.pdf
http://www.theregister.co.uk/2003/06/03/futuremark_nvidia_didnt_cheat/

My turn: ATI cheats with FP16 demotion:
http://www.geeks3d.com/20100916/fp1...-by-ati-to-boost-benchmark-score-says-nvidia/

Nvidia artificially locks out any non-Nvidia users from in-game MSAA. Rocksteady claims that the lockout code was from Nvidia and that Rocksteady was not able to remove it without permission from Nvidia. AMD cards shown to be perfectly capable of displaying the graphics once the device ID is changed to that of an Nvidia card.
http://www.brightsideofnews.com/news/2009/11/4/batmangate-amd-vs-nvidia-vs-eidos-fight-analyzed.aspx
This has been reposted at least 10x in this forum. Go get the Batman: AA GOTY edition and see for yourself if there is an MSAA lockout. If you bother to ask why MSAA was not open to AMD users before, the answer from the dev was that AMD never sent their code in before the game went gold.

My turn: the multi-display HUD problem in DX:HR. I have linked a direct quote from the game dev saying that they use AMD's Eyefinity APIs for that, which, without surprise, do not work with any other cards that do not contain such APIs.

Nvidia artificially locks out PhysX from any computer system containing an AMD video card claiming they have done so for technical reasons, even for users of add-in Aegia PhysX cards. A later driver (Forceware 257.15) that somehow slipped out without the lockout in place shows that Nvidia isn't blocking it due to any technical concerns. PhysX works just fine with an AMD card as the primary renderer.
http://www.anandtech.com/show/3744/...terogeneous-gpu-physx-its-a-bug-not-a-feature
The technical reasoning is BS, I can give you that, but what actually happened is that Nvidia no longer officially releases drivers supporting such a configuration. You can still use older drivers for that.

This puppy is Fermi! Actually, no, it isn't. It's just a mockup. But we'll try to pass it off as a working sample so people don't realize that we aren't even close to releasing it as a retail product yet.
http://www.xbitlabs.com/news/graphi...irst_Graphics_Cards_on_Track_for_Q4_2009.html
You got one. At the time of that conference, the card in J's hand was not Fermi. You, however, made up the rest. The cause of the delay on Fermi was twofold: one was TSMC's manufacturing problems, and the second was a design flaw in Fermi which caused the 4xx's SM reduction, resulting in video cards that use excessive electricity and produce excessive heat.


Nvidia hires marketing firm AEG to increase Nvidia sales by seeding online forums with viral marketers. Small surprise, we find one of them here at ATF.
http://forums.anandtech.com/showthread.php?t=1804008
http://www.ngohq.com/home.php?page=articles&go=read&arc_id=112
Those happened 5 years ago; while Rollo is still today's hot rod, my source is already too old.

Nvidia PureVideo fiasco. Nvidia touts the power of their new NV40 video processor and how it will have full hardware acceleration of MPEG1, MPEG2 and WMV9 encoding and decoding. It's in slides, it's on the box. Yet even after the NV40 (GeForce 6800) is released, where is the promised PureVideo? Well, it turns out it's actually broken. What does Nvidia do about it? Nothing. And admits nothing. They make no announcements, no apologies, and certainly don't attempt any sort of reimbursement for those people who bought a 6800 wanting to use the PureVideo function.
http://www.anandtech.com/show/1506/2
2004? Seriously? Is GPU video decoding a myth? GPU accelerated flash?

To be fair, Nvidia had the vision, but not the technology at the time. However, ATI did no better at this, at the time.

My turn: Richard Huddy was once very enthusiastic about tessellation, saying:
RH: The Rocksteady stuff is interesting from a number of perspectives. For starters, it's a DirectX 9 title so it doesn't push hardware terribly hard, and it certainly doesn't push any of the new capabilities. I won't deny it's a perfectly good game but in terms of using new hardware and using it hard - it doesn't - so it's not insanely exciting to us...
http://www.nvnews.net/vbulletin/showthread.php?t=134449

What happened? Oh, Nvidia cheated on tessellation by doing it correctly. What? Developers overuse tessellation? Then GPU companies had better whip their engineers hard to handle it. No?

http://scalibq.wordpress.com/2010/10/25/amdrichard-huddy-need-to-lie-about-tessellation/

Nvidia bumpgate. At first, Nvidia denies that there is any problem at all. Then blames TSMC, insufficient cooling designs and even customers usage patterns. In the end, Nvidia is found to have been at fault and is forced to pay for repairs.
http://semiaccurate.com/2010/07/11/why-nvidias-chips-are-defective/

And still don't admit that it was their fault.
http://classactiontimes.com/entry/Nvidia-settles-Bumpgate-class-action-lawsuit
Please read the definition of settlement. I am not trolling you, but settling a case means the case is dropped, not the defendant being found guilty. When it comes to lawsuits, we talk about proof and evidence, not belief. Other than Charlie's one-sided explanation of how it may have happened, none of us here actually tested the event by reconstructing the environment. That is, all we know about the case and what may have caused the problem comes from one source, Charlie Demerjian. Did I mention that Charlie hates Nvidia?

What is interesting about bumpgate is that the trick that is used to resurrect the card works as a universal fix for many electronics, now known as "baking." Coincidentally, many AMD cards die the same way and can be restored using the same trick.

In short, the case is closed without anyone guilty. Sorry, that is a settlement and those who decided to settle the case know this well. In fact, one clause of the settlement is that they agreed that there is no wrongdoing or liability on behalf of Nvidia.

In my own understanding, of what bumpgate really was:

customer: "My computer broke and it is under warranty, who is going to fix it?"
HP/Apple/Dell: "We will, our warranty covers it." Meanwhile, forwarding claims to Nvidia for the broken GPU.
Nvidia replied: "My chip works fine. It died because it was operating in an environment that is off the spec."
Since Nvidia rejected those claims, HP/Apple/Dell have to bear the cost of repair.
After warranty period.
customer: "My computer broke, who is going to fix it?"
Nvidia/HP/Apple/Dell: "Not me."
customer: "WTF?"
meanwhile:
HP/Apple/Dell: "Nvidia, share the cost!"
Nvidia: "Not my fault."
Class lawsuit against Nvidia.
Nvidia: "Easy bros and bras, I was joking. Here is my share, the 200 million dollars."
happy HP/Apple/Dell.
Customer: "WTF? 150 bucks compensation for my 1500 bucks laptop?"
On the surface, a proper repair requires a new GPU, which is far more expensive than the "candle/bake/heatgun" fix.
apoppin got this one.
I'm sure there is more than this. These are just some of the more memorable Nvidia moments.
Most of these issues, which happened over a span of 7 years, have been chewed and rechewed and rechewed, then picked up and rechewed and rechewed again.
 
Last edited:

KCfromNC

Senior member
Mar 17, 2007
208
0
76
Come again? Suspicious? Too far. You need to knock that off, my friend. (Actually, no you don't. It serves my arguments when you do stuff like this.) Put up a link to ANY source showing a focus group member, myself or others, thinking the tessellation of flat surfaces, or anywhere really, is "suspicious".

How does that sneaker rubber taste? Or are you a wingtip type of guy? :D

Wow, someone's getting awfully defensive here... One might even say emotional.

So you're saying it's perfectly reasonable to over-tessellate flat surfaces. OK, fair enough, I guess everyone is entitled to their own opinion on the matter.
 
Last edited:

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
Please read the definition of settlement. I am not trolling you, but settling a case means the case is dropped, not defendant being guilty.

My, my, for this not being their fault, they sure set aside a lot of money for write-downs in their results. So, it wasn't the PC manufacturers' fault, nor the users', and now, according to you, not NVIDIA's either. Those chips must have just spontaneously become faulty and cooked themselves.

You know, you keep citing the result of the lawsuit as some holy truth, but we all know it's not, especially in civil suits. In a perfect world it would be, but in the real world, it's also about who has lawyers with better persuasion skills in influencing the jury, deeper pockets to drag out the proceedings until the other side runs out of money, or the luck to catch a judge who's not savvy enough about all the technical details.
 

Mistwalker

Senior member
Feb 9, 2007
343
0
71
Actually the tessellated ocean HAS been explained...by Seero.
In short, Crysis 1 has the same "invisible" ocean as Crysis 2.
In Crysis 2 it gets tessellated, but that doesn't waste a lot of computing power, as tessellation can, just like SM3.0, be "instanced"...the calculations copied.
It still doesn't explain why the ocean needed tessellating, and what Seero proposed was something he himself admitted was just a theory that needed testing and verification. The sites that ran the reports found that what was done did waste enough computing power to affect performance--that's factual, reproducible information.

Like I said, there are theories, partial and potential explanations, but we have no solid data regarding the technical necessity of what was done, or the extent of Nvidia's role in it.

Not that it will stop AMD fans from denouncing Nvidia over it, or the Nvidia fans from dismissing the whole thing out of hand as "completely baseless." The fact remains it's not nearly as open-and-shut as you're portraying it. :(