
Testing nvidia cheat claims

bs56

Junior Member
I was wondering if anyone has ideas for how to verify, one way or the other, the claims that Nvidia cheated? Especially the claim that the cheating could invalidate benchmarks from real games, like Serious Sam.

After seeing the speculation that Serious Sam 1 is showing image problems because Nvidia did the rail optimization for Serious Sam II, I was wondering if the following would work:

Record a few new time demos in Serious Sam II. Get benchmark numbers from an old driver that wouldn't have the culling optimizations. Compare the newly recorded time demos against the results for an existing SSII benchmark. Since these are older, non-bug/non-cheating drivers, the results should be similar, probably within +/- 1-2 fps I'd guess.

Now move to the new drivers and run the same test. If Nvidia has put in hand-optimized code that affects normal game benchmarks beyond 3DMark03, then we may see that the time demo shipping with Sam is much faster than the newly recorded time demos. If both the newly recorded demos and the original demo run at the same speed (but faster than on the previous driver), then Nvidia just has a good optimization for normal games.

Does this sound like a reasonable approach?

Any other ideas?
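To make the comparison concrete, here's a rough sketch of how you could flag a demo-specific optimization once you have the numbers. All the fps figures below are hypothetical, just for illustration: the idea is that a genuine, general optimization speeds every demo up by roughly the same factor, while a hand-tuned path for the shipped demo shows up as an outlier.

```python
# Hypothetical benchmark numbers (fps) -- illustrative only, not real results.
# "shipped" is the timedemo bundled with the game; "custom1"/"custom2" are
# newly recorded demos that a driver could not have been hand-tuned for.
old_driver = {"shipped": 60.0, "custom1": 59.0, "custom2": 61.0}
new_driver = {"shipped": 85.0, "custom1": 64.0, "custom2": 63.0}

def suspicious_speedup(old, new, tolerance=0.10):
    """Return demos whose speedup is well above the median speedup.

    A general optimization should lift all demos by a similar factor;
    a demo-specific hack stands out as an outlier.
    """
    speedups = {demo: new[demo] / old[demo] for demo in old}
    ordered = sorted(speedups.values())
    median = ordered[len(ordered) // 2]
    return [demo for demo, s in speedups.items()
            if s > median * (1 + tolerance)]

print(suspicious_speedup(old_driver, new_driver))  # -> ['shipped']
```

With the numbers above, the shipped demo gains ~42% while the custom demos gain only ~3-8%, so it gets flagged. In practice you'd want several runs per demo to average out normal run-to-run variance first.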
 
I can't see the hardware sites' graphics card reviews having any credibility anymore unless they address the 3DMark issue -- and even possible cheating on the 5 or 6 'usual suspects' scripted benchmarks they run, like UT3 (botmatch, flyby), etc.

The 5900 was winning benchmarks with flying colors and looked faster than the 9800, but when ExtremeTech ran their own set of benchmarks (3D GameGauge, 8 games), the 9800 came out 12% faster (1024x960, 4X FSAA and 8X AF), and the 9800 was faster in 5 of the 8 game tests. I believe ExtremeTech said it was 3DGG's disparity with 3DMark that prompted them to investigate.

ATI and Nvidia can optimize for specific games, but at least the entire game will run faster. This still has the possible effect of making a card look faster than it really is overall: if Anandtech were to use the 4 or 5 games Nvidia optimized for (and ATI didn't), Nvidia's card would look much better in those games than it runs comparatively in general games.

Hardware sites are going to have to script their own benchmarks, and perhaps use more and varied games.
 
Since when is optimizing hardware for a game considered cheating? Who cares HOW they get a game to run faster on their hardware as long as it runs faster.

That brings up an interesting idea though... what if games came with their own drivers for your hardware... so, like, Doom 3 could come with GeForce 4 drivers, GeForce FX drivers, and ATI drivers... and that driver would only be used for that game... and then Half Life 2 would come with its own GF4, GFFX, and ATI drivers.
It would be a new way of doing things... a generic driver for Windows and all 2D stuff... and then application-specific drivers for each application.
 
Originally posted by: Jeff7181
Since when is optimizing hardware for a game considered cheating? Who cares HOW they get a game to run faster on their hardware as long as it runs faster.

That brings up an interesting idea though... what if games came with their own drivers for your hardware... so like, Doom 3 could come with GeForce 4 drivers, GeForce FX drivers, and ATI drivers... and that driver would only be used for that game... and then Half Life 2 would come with it's own GF4, GFFX, and ATI drivers.
Would be a new way of doing things... a generic driver for Windows and all 2D stuff... and then application specific drivers for each application.
So just make them look crappy to run faster? You can already do that by making the game run at 16-bit and 640x480.
Bad idea. Throwing out industry standardization of game coding would be the biggest mistake ever. Game developers would be off the hook for any problems caused by bad coding, and video card manufacturers would be struggling to fix game developer issues because there'd be no spec.

Ugh, what a nightmare.


 
Originally posted by: Killrose
Originally posted by: Jeff7181
Since when is optimizing hardware for a game considered cheating? Who cares HOW they get a game to run faster on their hardware as long as it runs faster.

That brings up an interesting idea though... what if games came with their own drivers for your hardware... so like, Doom 3 could come with GeForce 4 drivers, GeForce FX drivers, and ATI drivers... and that driver would only be used for that game... and then Half Life 2 would come with it's own GF4, GFFX, and ATI drivers.
Would be a new way of doing things... a generic driver for Windows and all 2D stuff... and then application specific drivers for each application.
So just make them look crappy to run faster? You can already do that by making the game run 16bit and 640x480
Bad idea. To throw-out industry standardization of game coding would be the biggest mistake ever. Game developers would be off the hook with any problems caused by bad coding, video card manufactures would be struggling to fix game developer issues because there is no spec.

Ugh, what a nightmare.

i can take steroids, so who cares how i'm getting faster right?

 
i can take steroids, so who cares how i'm getting faster right?
Yes... exactly. If the game is faster, who cares how they did it?

If it's at the expense of image quality, then obviously that's bad. But if they optimized their hardware drivers to work well with certain game engines... in my opinion, that's a plus for those who play games that use those engines.
 
If they get faster by means other than lowering image quality, then I'm all for it.
But reducing image quality to get faster speeds is unforgivable and shouldn't be allowed.
 
Originally posted by: Jeff7181
i can take steroids, so who cares how i'm getting faster right?
Yes... exactly. If the game is faster, who cares how they did it?

What if they're ONLY optimizing for the games that are commonly benchmarked, and virtually ignoring all other games?
99.9% of reviews wouldn't notice, and that applies to AT as well from what I've seen.
Anyone want to bet that both ATi and nVidia have spent hundreds of hours more on UT2003 than they have on any two other games (besides Q3A) combined?
Regardless of the relative popularity of the games.
 
Originally posted by: Jeff7181
Since when is optimizing hardware for a game considered cheating? Who cares HOW they get a game to run faster on their hardware as long as it runs faster.

That brings up an interesting idea though... what if games came with their own drivers for your hardware... so like, Doom 3 could come with GeForce 4 drivers, GeForce FX drivers, and ATI drivers... and that driver would only be used for that game... and then Half Life 2 would come with it's own GF4, GFFX, and ATI drivers.
Would be a new way of doing things... a generic driver for Windows and all 2D stuff... and then application specific drivers for each application.

that is not actually a bad idea. uh oh, I'd better go downstairs. tornado warning, and it's getting a weird greenish colour outside.
 
Originally posted by: Schadenfroh
Originally posted by: Jeff7181
Since when is optimizing hardware for a game considered cheating? Who cares HOW they get a game to run faster on their hardware as long as it runs faster.

That brings up an interesting idea though... what if games came with their own drivers for your hardware... so like, Doom 3 could come with GeForce 4 drivers, GeForce FX drivers, and ATI drivers... and that driver would only be used for that game... and then Half Life 2 would come with it's own GF4, GFFX, and ATI drivers.
Would be a new way of doing things... a generic driver for Windows and all 2D stuff... and then application specific drivers for each application.

that is not actually a bad idea. uhuh, might better go downstairs. tornado warning and its getting a wierd greenish colour outside.

Re-reading the other responses... it would be up to the game developers to develop drivers specific to their games. In other words... the video card makers would make their video cards and drivers "programmable" ... if you get what I'm saying.
 
Originally posted by: Jeff7181
Re-reading the other responses... it would be up to the game developers to develop drivers specific to their games. In other words... the video card makers would make their video cards and drivers "programmable" ... if you get what I'm saying.
That's a good idea, but it's impractical. It'd require very close developer relations, and something like that wouldn't work. Developers could be accused by, say, ATI of working more closely with, and thus optimizing better for, nVidia's cards. So then ATI would do the same with another developer, and then nVidia would do it again, and after a while the gaming market would be split into two camps. Which isn't good for us at all.
 
Does anyone else here think all this talk of nvidia cheating is getting bloody boring? All we hear is the same things over and over again. It hasn't even been proven if nvidia has done anything wrong. I think some of the people here who have been talking constantly about it need to get out abit more.
 
Says the person who spells the words "a bit" like the motherboard manufacturer, abit. lol.
Just Joking.
Seriously though. If nvidia has in fact been cheating on benchmarks, then we have a right to know, don't we?
I mean, a lot of us have been recommending various GF cards over the last couple of months, and if we now find out that these supposed cards we've recommended are a bag of sh1t, image-quality wise, then a lot of us are going to be really pi$$ed off.
If these rumors are in fact true, then I can see nvidia losing a lot of enthusiast support.
 
Originally posted by: BoomAM
If they get faster by means other than lowering image quality, then im all for it.
But reducing image quality to get faster speeds is unforgivable and shouldnt be allowed.

I could have sworn the reviews said that image quality from nvidia was on par with ATI now.
Where are you getting all of this "crappy image quality" from? Can you post a link?
I'm not dismissing your claim, I would just like some kind of reliable proof from an accredited hardware reviewer.

Thanks,

Keys
 
Keysplayer, I didn't say that the image quality was worse.
I said that if they WERE to lower the image quality to get faster speeds, then it shouldn't be allowed.
Sorry for the confusion, I should have been more clear.
 
Originally posted by: BoomAM
Says the person who spells the words "a bit", like the motherboard manufactuer, abit. lol.
Just Joking.
Seriously though. If nvidia has infact been cheating for benchmarks, then we should have a right to know really, shouldnt we?
I mean alot of us have reccommending various GF cards over the last couple of months, and if we now find out that these supposid cards that we`ve reccommended are a bag of sh1t, image quality wise, then alot of us are goind to be really pi$$ed off.
If these rumors are infact true, then i can see nvidia losing alot of enthusiast support.

My FX5200 has great image quality with the new 44.03 drivers; however, I can't comment on the FX5800 or FX5900, etc., because I don't have any.
 
I've gotta admit, I find the 5200 an interesting product. I was considering buying one in a few weeks to put in the second PC I'm building from my old PC parts, then I realised that I had an old GF3 Ti200 on my shelf that OCed to Ti500 speeds, so I scrapped that idea. lol
 
Hi guys,

Just adding my 2 cents' worth. Most of us rely on benchmarks to give us an idea of a card's potential. IF nVidia optimized their driver specifically to gain extra 3DMarks, then it's cheating. NO buts... NO ifs... nada! Optimizing a driver to enable a card to run more smoothly with higher framerates in a particular game (e.g. DOOM III) is NOT cheating. I'm sure we all appreciate our cards' manufacturers punching out new drivers, optimized for games but not for benchmarks. (Especially if it's true, based on an article I read, that nVidia's driver, 44.03 I think, cut corners to bench higher. That's a definite NO NO.) Don't get me wrong, I used to own a GF2 MX400 and a 4200 Ti; both were, and still are, excellent cards. I've since gotten a 9700P and 9800P and have not looked back.
 
Originally posted by: Glitchny
Originally posted by: YBS1
In reply to keysplayr,

Not that this is relevant to discussion of nV's cheating in 3D Mark but....

9800

FX 5900

yikes, I hadn't seen the comparison before looking at those... and it's definitely not on par

Maybe I'm missing something but those pictures look almost identical to me! :|

 
Originally posted by: BoomAM
Ive gotta admit, i find the 5200 an interesting product, i was considering buying one in a few weeks to put in the second pc im building from my old pc parts, then i realised that i had a old GF3Ti200 on my shelf that OCed to Ti500 seppds, so i scrapped that idea. lol

The FX5200 is OK, but it does have some problems, and with the 44.03 drivers its game performance is only just a bit higher than my old GF4 MX440 (of course the FX5200 is much faster in 3DMark 2001 and 2003, but you don't play those). Some games actually run slower on the FX5200 than on my MX440; Serious Sam 2 and RTCW are two good examples. My XFX FX5200 will be in a cupboard soon anyway. I'll be using an XFX GF4 Ti4200 instead, which will give my 2nd PC a big speed boost, at the cost of DX9 support of course.

I recently sold my old Visiontek GF3. It was a great video card, much faster than my FX5200 and MX440. I made a big mistake selling it! 🙁
 
Look at the 'jaggies' on the horizontal window borders in the first row, just above the plants. The 6xAA of the 9800 does a much better job and smooths them out completely.
 