Is Nvidia at it again with faking scores?

funboy6942

Lifer
Nov 13, 2001
15,362
416
126
I believe it was with the 5xxx series that Nvidia got busted using an optimized driver that rendered only what you saw on screen, and in doing so no other card could come close to it in terms of scores.

But I mention this because I own an ATI X1900GT and an 8800GT OC, and just testing them in, say, 3DMark 2005, there are spots that have me going hmmmmm.

One spot is in the very beginning, during the pan shot coming back from the water tower (you know, the one that falls on the guy at the end). As it pans back along the line of fire, my 8800GT OC was showing 40+ FPS, but anyone with an eye can tell that is BS, because it is very jittery, as if it were doing less than 20; not a smooth frame rate at all.

The other one that caught my eye was the flying butterfly light. When it was going through the trees I was seeing over 30 FPS, but it too was very jittery in spots, and then when the camera backed up it would do over 80 FPS.

Those are the two spots that caught my eye and made me decide to test my old X1900GT on the new system I built. BTW, it's a DFI LANParty CF mobo, an E6600 at stock, and 3 GB of DDR667 @ 800, running the stock sound card.

The interesting thing is, in the same jittery spots my old card showed the same "look" but reported what I felt were the true numbers. The pan back during the fight showed 21 FPS, and it looked like it; the 8800GT was reporting 40 but looked the same as my X1900GT.

The butterfly light, same deal: it got slow in the same spots but showed a much lower FPS than the 8800GT. The 8800GT claimed it was faster, but the slow, picture-frame motion told a different story from the FPS being shown on screen. And my X1900GT backed up what I was seeing: it showed low FPS, and it looked like those FPS on screen.

There were other parts throughout the tests in 2005, and even 2001, that got me thinking that the FPS being reported was not what I was seeing with my own eyes. And then seeing the X1900GT show what looked to me like the same visual jitteriness at the same points (I mean, they looked identical), there was no way to say the GT was slower; its slideshow was just as slow as the 8800GT's, it just reported what I have to say was the correct FPS it was getting.

I'm not a fan of either, just of my money, and I hate being duped by something that reports it's a kick-ass card when it may be cheating to show those figures. Let's face it, above about 40 FPS your eye can't tell the difference, so if the 8800GT is over that, the driver could put out whatever numbers it liked if it were detecting what you were running and boosting the FPS counter; you'd have no way of saying "BS, that's not 150 FPS, it's 55."

Then you've got about every review out there with the author going, "I cannot believe a card like this, running slower and not using near as much power as the rest, can go so fast; it boggles the mind."

I have to admit I got caught up in the reviews I read on it, going "wow, a $250 card smoking an 8800GTS 640MB that is clocked much higher and sucks up a ton more power." But after what I saw in my own tests, I've begun to go "WTF, something isn't right."

I have an ATI HD 3870 coming soon as well, to see whether the same spots, running the same driver as my old X1900GT, "look" the same way they did on the GT and the 8800.

I would also like to know if anyone else can do the same tests. If so, I will run it again and post where to look for slowdown versus the FPS being shown on screen. If the drivers are cheating, and when it is actually doing 45 FPS in spots the drivers make it show 100+, of course it is going to post higher scores. Or maybe it is doing what the old 5xxx series did and not rendering everything in the scene, just what you will see, to boost the numbers again; they have just gotten better at hiding that kind of code from the average Joe. But I know what I have been seeing between them. I have owned a ton of old hardware, enough to notice slowdown when I see it and go back into the options to maximize smooth gameplay, and in those same spots between the two cards, one is lying as far as I can see.

Spots to look at, if you have the means to test between an 8800GT and, say, an 8800GTS 640, a 2900XT, or even a 3870, would be, just for starters, the opening parts of 3DMark 2005. Right before the hatch is opened, pay close attention to how smooth everything is versus the FPS being shown. A dead giveaway is when the chain gun is being fired and they pan down the firing line from the water tower back: watch to be sure it runs smooth, not a dit-dit-dit semi-smooth. If it is running at 40+ FPS there shouldn't be any dit-dit-dit in the visuals.

Also look at the butterfly light part for any moments where it doesn't look smooth, note the FPS it is showing, then test with another card and look for the same kind of visual jitter. The human eye only picks up on this jitter when the FPS starts to dip to around 20. Anything much above that looks like one smooth, fluid motion. So if the cards are showing 24+ FPS, there shouldn't be any noticeable jitter at all.
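For what it's worth, here is why an average-FPS counter can read 40+ while the motion still stutters: the average hides individual long frames. A minimal sketch in Python, using made-up frame times rather than anything measured from 3DMark:

# Hypothetical frame times (ms) for a stretch of a benchmark pan:
# most frames are quick, but every tenth frame takes 80 ms, which is
# the visible "dit dit dit" even though the average looks healthy.
frame_times_ms = [15, 15, 15, 15, 15, 15, 15, 15, 15, 80] * 4

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)   # ~46.5 FPS
worst_frame_ms = max(frame_times_ms)                            # 80 ms hitch
sorted_times = sorted(frame_times_ms)
p99_ms = sorted_times[int(0.99 * (len(sorted_times) - 1))]      # 99th-percentile frame time

print(f"average FPS: {avg_fps:.1f}")
print(f"worst frame: {worst_frame_ms} ms (anything over ~33 ms can show as a hitch)")
print(f"99th-percentile frame time: {p99_ms} ms")

So a counter that only shows the average can honestly report 40+ FPS while the pan still looks like a slideshow in spots; the counter by itself isn't proof of cheating either way.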

Anyway, that's it. We'll see when the new HD 3870 shows up whether it shows fluid motion during the parts where I saw the "problems," and I'll report back. If anyone else tries to test the card against another, PLEASE report what you see.

Link from 3DMark back in 2003 finding Nvidia cheating.
 

funboy6942

Lifer
Nov 13, 2001
15,362
416
126
Looks as though I wasn't the only one who noticed that something just didn't look right.

And if this does turn out to be true, it is so much bullshit to have the maker of the card make its drivers cheat to boost scores in order to boost sales.

If they are cheating by reporting wrong FPS, or cutting image quality in order to do so, it shouldn't be up to them to make the drivers do that; it should be up to game programmers and end users to adjust their systems accordingly. They shouldn't do it on their end so it looks as though their stuff is the shit, killing all others no matter what, to boost their image and make people think their card is better because it gets better scores, when it is cheating to do so, while, say, ATI is giving the end user a true score achieved without driver optimizations or cheats embedded to make it go faster.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
When the HD 3000/2000 series does better in 3DMark than in actual games, while the 8800GT does well in both, I have a hard time swallowing that.

That's not to say it can't be done again, but I dunno.
 

funboy6942

Lifer
Nov 13, 2001
15,362
416
126
Well, see, the site in the link above found a problem with the Crysis demo and found a driver cheat. What's to say the same kind of "cheat" isn't enabled in 3DMark, and in games if they have a way to do it, making their cards look like they are kicking ass in both because their drivers cheat versus what ATI puts out and how it renders the games? And just sitting back writing this has me wondering about all the games with the opening logo that says "Plays best on Nvidia graphics". I wonder why :p
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Hold your horses!!! I'm pretty sure there are parts in 3DMark 03, 05, and 06 that look choppy no matter what card is used, since 3DMark's animation runs at a set speed no matter how many FPS are being displayed. I remember a spot in the troll lair level in 3DMark 03 that looks choppy even as I get 70-200+ FPS. I've run a 9800 Pro, X800 Pro, 6800GT, 7800GTX, X1800XT, 7800GT SLI, X1900XT, 8800GTX, and my current 8800GT SLI rig... guess what? The same spot still does not look smooth. Driver cheats? I think not. 3DMark suckage? I think yes.

/end thread
 

ja1484

Platinum Member
Dec 31, 2007
2,438
2
0
3dMark didn't matter back then, and it still doesn't matter now. I couldn't care less if they're crapping around with it.

We've all seen the image quality comparisons and apples-to-apples game benchmarks. It's not like the community is under some illusion about what they're getting with either card.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: funboy42
Spots to look at, if you have the means to test between an 8800GT and, say, an 8800GTS 640, a 2900XT, or even a 3870, would be, just for starters, the opening parts of 3DMark 2005. Right before the hatch is opened, pay close attention to how smooth everything is versus the FPS being shown. A dead giveaway is when the chain gun is being fired and they pan down the firing line from the water tower back: watch to be sure it runs smooth, not a dit-dit-dit semi-smooth. If it is running at 40+ FPS there shouldn't be any dit-dit-dit in the visuals.

maybe something is wrong with your nvidia setup
-or perhaps you are blinded by fanboyism with a flame-bait topic title - "Is Nvidia at it again with faking scores?" :p

i'd say emphatically, No!

Several of us already did these tests:

In House HD2900XT vs. 8800GTS 640

*one thing* - run FRAPS and check it again ... don't depend on the FPS counter that is shown in 3DMarkXX ;)
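For anyone who wants to cross-check the two counters, here is a rough sketch of summarizing an external frame-time log, such as the frametimes CSV FRAPS can write during a benchmark run. The exact column layout is an assumption, so check your own log's header; the file name is just an example.

import csv

# Summarize a FRAPS-style frame-time log: each row is assumed to hold a
# frame number and a cumulative timestamp in milliseconds since the start
# of the run (verify against your own log's header before trusting this).
def summarize(path):
    stamps = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                      # skip the header row
        for row in reader:
            stamps.append(float(row[1]))  # cumulative time in ms
    # Per-frame durations are the gaps between consecutive timestamps.
    durations = [b - a for a, b in zip(stamps, stamps[1:])]
    avg_fps = 1000.0 * len(durations) / (stamps[-1] - stamps[0])
    worst_ms = max(durations)
    print(f"frames: {len(durations)}  average FPS: {avg_fps:.1f}  worst frame: {worst_ms:.1f} ms")

summarize("frametimes.csv")   # hypothetical file name for the exported log

If the external log and the on-screen 3DMark counter broadly agree, the counter isn't lying; and if the worst frame times are huge, that explains visible stutter even at a decent average.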
 

funboy6942

Lifer
Nov 13, 2001
15,362
416
126
Originally posted by: ja1484

3dMark didn't matter back then, and it still doesn't matter now. I couldn't care less if they're crapping around with it.

But my point now would be that they have been busted doing something with a game, Crysis, to make it run faster and get a better score when tested as a benchmark. What's to say they haven't done something to all benchmarks, and games for that matter, to make it look as if their card is faster and better?

What I am trying to figure out is whether they are doing this to everything. If so, it's lying and deceiving, and they're not as far out ahead of ATI as everyone is led to believe, if ATI's drivers, without any cheats or "optimizing," give you a more feature-rich, eye-candy experience because no corners were cut to make it run faster, get a better score, sell more cards, and make more money.

I don't know about anyone else, but I like my card to do its best job as-is and leave the hacks and optimizations to the producers and programmers of the games I play. I don't want to miss out on a few tidbits because the card manufacturer wants to sell more products by lying about scores at the cost of eye candy I should be seeing, that I AM PAYING FOR when I buy a new game.

If everyone is going to say "who gives a shit," then why buy Crysis for the superb graphics if the card manufacturer is going to start disabling features of the game so it runs faster, making their product look kick-ass? If that's the case, buy a 7900GTX, cut the eye candy and resolution down yourself, and save a hundred plus.

I know I have a hard time writing out my point, so if anyone understands where I am going with this and why I want to look into it, and can put it in terms that come across a bit easier, that would be great.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
What I am trying to figure out is whether they are doing this to everything. If so, it's lying and deceiving, and they're not as far out ahead of ATI as everyone is led to believe, if ATI's drivers, without any cheats or "optimizing," give you a more feature-rich, eye-candy experience because no corners were cut to make it run faster, get a better score, sell more cards, and make more money.

People much smarter than you or I haven't been able to catch them :p

I think something is wrong with your setup or Nvidia driver - but try running FRAPS instead of watching the 3DMark counter - and report back
:confused:

 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I stopped caring what a card gets in 3DMock a long time ago; I only look at game benchmarks from reliable and competent sources. Nevertheless, I wouldn't be surprised if both sides were guilty of overly aggressive "optimizations."
 

Sheninat0r

Senior member
Jun 8, 2007
515
1
81
3DMark lost all meaning to me when the R600 wrecked the G80 scores-wise, but ended up going much slower in games.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: funboy42
I believe it was with the 5xxx series that Nvidia got busted using an optimized driver that rendered only what you saw on screen, and in doing so no other card could come close to it in terms of scores.

And that is where I stopped reading. I am sorry, but how in the blazes is "not rendering stuff that is not on the screen" faking scores?
It's improving performance by eliminating waste, rather than by a brute-force increase in calculation capacity.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: taltamir
Originally posted by: funboy42
I believe it was with the 5xxx series that Nvidia got busted using an optimized driver that rendered only what you saw on screen, and in doing so no other card could come close to it in terms of scores.

And that is where I stopped reading. I am sorry, but how in the blazes is "not rendering stuff that is not on the screen" faking scores?
It's improving performance by eliminating waste, rather than by a brute-force increase in calculation capacity.

If you read up on the whole cheating debacle, it was found that Nvidia wasn't just eliminating waste. From what I can remember they did the following:
* Eliminate entire parts of the scene that the predefined camera path did not look at. You would get severe rendering errors if the user controlled the camera and looked around freely.
* Only clear the buffer between frames on predefined parts of the scene, based on the fixed camera. Again, a free-roaming camera would reveal rendering errors.
* Replace pixel shader code with partial-precision code that ran faster at the expense of image quality
These weren't just optimizations. They were cheats used to inflate the score, and moreover, cheats based on predefined camera movement would never work in an actual interactive game, which made the score even more meaningless.
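To make the difference concrete, here is a toy sketch (hypothetical object names, not actual driver or Futuremark code) of legitimate culling versus the path-based skipping described above: real culling tests each object against whatever the camera is currently looking at, while the cheat skips objects from a table keyed to the benchmark's fixed camera path, which falls apart the moment a user moves the camera freely.

import math

# Toy illustration only -- not real driver or 3DMark code.

def in_view(camera_pos, camera_dir, fov_deg, obj_pos):
    """Legit culling: skip an object only if the *current* camera can't see it.
    camera_dir is assumed to be a unit vector."""
    to_obj = tuple(o - c for o, c in zip(obj_pos, camera_pos))
    dist = math.sqrt(sum(d * d for d in to_obj)) or 1e-9
    cos_angle = sum(d * v for d, v in zip(to_obj, camera_dir)) / dist
    return cos_angle >= math.cos(math.radians(fov_deg / 2))

# The "cheat": a table of objects to skip at given benchmark timestamps,
# precomputed offline from the fixed camera path the benchmark always uses.
PRECOMPUTED_SKIP = {0: {"water_tower"}, 1: {"water_tower", "far_trees"}}

def should_draw(obj_name, obj_pos, camera_pos, camera_dir, t, cheat=False):
    if cheat:
        # Ignores where the camera actually points; breaks with a free camera.
        return obj_name not in PRECOMPUTED_SKIP.get(t, set())
    return in_view(camera_pos, camera_dir, fov_deg=90, obj_pos=obj_pos)

# A user turns the camera around at t=1: the cheat still skips the water
# tower even though it is now dead centre in view, so the scene renders wrong.
print(should_draw("water_tower", (0, 0, 10), (0, 0, 0), (0, 0, 1), t=1, cheat=True))   # False
print(should_draw("water_tower", (0, 0, 10), (0, 0, 0), (0, 0, 1), t=1, cheat=False))  # True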
 

ja1484

Platinum Member
Dec 31, 2007
2,438
2
0
Originally posted by: funboy42
But my point now would be that they have been busted doing something with a game, Crysis, to make it run faster and get a better score when tested as a benchmark. What's to say they haven't done something to all benchmarks, and games for that matter, to make it look as if their card is faster and better?

I think you're missing my point, which is:

If it runs faster and the image quality is comparable to the competition, who gives a shit what they did?

What I am trying to figure out is whether they are doing this to everything. If so, it's lying and deceiving, and they're not as far out ahead of ATI as everyone is led to believe, if ATI's drivers, without any cheats or "optimizing," give you a more feature-rich, eye-candy experience because no corners were cut to make it run faster, get a better score, sell more cards, and make more money.

....Other miscellaneous ranting of questionable sanity....


Look, that's all well and good, but show me some evidence. Where does the image quality suffer? I want to see comparison screens. Every review I've seen (many of which use custom benchmark timedemos produced by the reviewer, not the built-in ones shipped with the game) has shown image quality to be, for all practical purposes, identical, and the 8800GT ahead in performance.

If you have some contrarian screens or benchmarks, then by all means let's see them, but if you don't, exactly what are you ranting about?

 

formulav8

Diamond Member
Sep 18, 2000
7,004
522
126
I know learned people couldn't care less about 3DMark. But the fact is that many people DO care. Even some normal Joes have heard of 3DMark.

Why would a company waste time cheating at 3DMark? Because it WILL have an impact on which video card many people buy. When they see this or that card with a 3DMark score higher than the other company's, they assume it is a faster card in gaming. So cheating in 3DMark could 'possibly' impact video card sales, and that is why a maker would over-optimize/cheat or whatever you want to call it.

And besides, if it were meaningless to most people, reviewers wouldn't post 3DMark results in their card reviews, and GPU makers definitely wouldn't waste resources 'over-optimizing' for it. Even when you see so-called "leaked" first results for a video card, it is often the 3DMark score that gets posted.

And besides, cheating is cheating. At least to me it is :)



Jason
 

smaky

Member
Jan 1, 2005
119
0
0
If Nvidia is cheating, why have I not seen an Nvidia card score in the top 5 of the 3DMark06 Hall of Fame in the last three or more months?

Yeah, I only check once every few weeks, but out of the top 10 scores, 9 always seemed to be the ATI 2900 in CrossFire.

edit: I just have to add this little emote :p make your own conclusions about the fact
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Funny thing is how the R600/RV670 cards score much higher in 3DMark than their G80/G92 counterparts.

The situation is totally reversed in real-world performance.

 

Buck Armstrong

Platinum Member
Dec 17, 2004
2,015
1
0
Wait...people still use 3DMark? And they actually use it to decide which card to buy?!

Let's assume for a second that they are cheating to gain higher scores in some irrelevant synthetic application... what am I supposed to do, go out and buy an ATI card? Please. I have better things to do with my computer than spend hours trying to get each game to work with broken drivers, on a card with a huge brick sh*thouse of a HSF squatting on top of it.

I realize this might be unpopular, but let's just be honest here: they had the 9700 and 9800, and that was it. And those cards were the exception, not the rule. They sucked before those two cards, and they've sucked since those two cards.

That's right, I said it. :p
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: Buck Armstrong
Wait...people still use 3DMark? And they actually use it to decide which card to buy?!

Let's assume for a second that they are cheating to gain higher scores in some irrelevant synthetic application... what am I supposed to do, go out and buy an ATI card? Please. I have better things to do with my computer than spend hours trying to get each game to work with broken drivers, on a card with a huge brick sh*thouse of a HSF squatting on top of it.

I realize this might be unpopular, but let's just be honest here: they had the 9700 and 9800, and that was it. And those cards were the exception, not the rule. They sucked before those two cards, and they've sucked since those two cards.

That's right, I said it. :p

Yep, you said it. "It," however, is incorrect. Contrary to your slightly bent beliefs, ATI has had several very good graphics cards since the 9700/9800, superior in some ways and inferior in others, but good products all along.

And, I highly doubt that any knowledgeable computer user would base their graphics card purchase on 3DMark06 these days. The ones that don't know any better might, but most rely on actual gaming benchmarks.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: munky
Originally posted by: taltamir
Originally posted by: funboy42
I believe it was with the 5xxx series that Nvidia got busted using an optimized driver that rendered only what you saw on screen, and in doing so no other card could come close to it in terms of scores.

And that is where I stopped reading. I am sorry, but how in the blazes is "not rendering stuff that is not on the screen" faking scores?
It's improving performance by eliminating waste, rather than by a brute-force increase in calculation capacity.

If you read up on the whole cheating debacle, it was found that Nvidia wasn't just eliminating waste. From what I can remember they did the following:
* Eliminate entire parts of the scene that the predefined camera path did not look at. You would get severe rendering errors if the user controlled the camera and looked around freely.
* Only clear the buffer between frames on predefined parts of the scene, based on the fixed camera. Again, a free-roaming camera would reveal rendering errors.
* Replace pixel shader code with partial-precision code that ran faster at the expense of image quality
These weren't just optimizations. They were cheats used to inflate the score, and moreover, cheats based on predefined camera movement would never work in an actual interactive game, which made the score even more meaningless.

What you are saying is COMPLETELY different from what he said. I guess he just worded it abysmally.

Originally posted by: Syntax Error
Originally posted by: Scoop
Who cares about 3dmark results?

Stupid people! :laugh:

I love you syntax error. QFT



I do, however, agree that no matter how stupid it is, a company shouldn't cheat on its scores... then again, I expect NOTHING from the slimes at Nvidia, ATI, AMD, and Intel.
It's a bunch of evil bastards at each of them.

@OP: if both cards look the same but report different FPS, then perhaps that "look" you are seeing is not FPS lag but something else entirely.
 

Quiksilver

Diamond Member
Jul 3, 2005
4,725
0
71
Adding Fuel To This Fire

Image quality of the 3870 compared to the 8800GT looks ten times better, but what I don't know is whether it was done with a) a demo, b) older drivers, and c) different altitudes, or if any of it has changed since.