Is Nvidia at it again with faking scores?


superbooga

Senior member
Jun 16, 2001
333
0
0
Originally posted by: Quiksilver
Adding Fuel To This Fire

Image quality of the 3870 compared to the 8800GT looks 10 times better, but since it was done with a) a demo, b) older drivers, and c) different altitudes, I don't know if any of it has changed.

Why are we posting about things that are months old now?

The reflection problem is a bug, not a hack or optimization. It's been fixed in more recent drivers. This is OLD NEWS.

Also, the difference you see in the screenshot is due to different altitudes. The 8800GT shot is hazy because you're looking through a cloud. The shot was taken about 30 frames before the ATI shot.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: superbooga
Originally posted by: Quiksilver
Adding Fuel To This Fire

Image quality of the 3870 compared to the 8800GT looks 10 times better, but since it was done with a) a demo, b) older drivers, and c) different altitudes, I don't know if any of it has changed.

Why are we posting about things that are months old now?

The reflection problem is a bug, not a hack or optimization. It's been fixed in more recent drivers. This is OLD NEWS.

Also, the difference you see in the screenshot is due to different altitudes. The 8800GT shot is hazy because you're looking through a cloud. The shot was taken about 30 frames before the ATI shot.

Correct. The article claiming the cheating is itself cheating by posting pictures taken from slightly different frames, so that one shot is seen through a cloud just as the camera passes through it (not far enough into the cloud for the view to be obviously blocked and easily recognizable as a cloud by someone who didn't play the game or isn't paying attention).
 

Quiksilver

Diamond Member
Jul 3, 2005
4,726
0
71
Originally posted by: taltamir
Originally posted by: superbooga
Originally posted by: Quiksilver
Adding Fuel To This Fire

Image quality of the 3870 compared to the 8800GT looks 10 times better, but since it was done with a) a demo, b) older drivers, and c) different altitudes, I don't know if any of it has changed.

Why are we posting about things that are months old now?

The reflection problem is a bug, not a hack or optimization. It's been fixed in more recent drivers. This is OLD NEWS.

Also, the difference you see in the screenshot is due to different altitudes. The 8800GT shot is hazy because you're looking through a cloud. The shot was taken about 30 frames before the ATI shot.

Correct. The article claiming the cheating is itself cheating by posting pictures taken from slightly different frames, so that one shot is seen through a cloud just as the camera passes through it (not far enough into the cloud for the view to be obviously blocked and easily recognizable as a cloud by someone who didn't play the game or isn't paying attention).

I know that; I even mentioned that it was taken at a different altitude and with old drivers.

However, I posted it because it might cause some people to compare the two cards from an image-quality perspective rather than just asking "who has the best performance".
 

neothe0ne

Member
Feb 26, 2006
197
0
0
Originally posted by: Buck Armstrong
I realize this might be unpopular, but let's just be honest here: they had the 9700 and 9800, and that was it. And those cards were the exception, not the rule. They sucked before those two cards, and they've sucked since those two cards.

That's right, I said it. :p

X800. That's right, you're wrong.
 

nemesismk2

Diamond Member
Sep 29, 2001
4,810
5
76
www.ultimatehardware.net
Originally posted by: Cookie Monster
Funny thing is how R600/RV670 cards score much higher in 3DMark than the G80/G92 counterparts.

The situation is totally reversed in real world performance.

That MUST mean that Nvidia is cheating with real-world performance but not with 3DMark. I knew Nvidia couldn't be trusted! ;)
 

funboy6942

Lifer
Nov 13, 2001
15,290
389
126
Originally posted by: ja1484
Originally posted by: funboy42
But my point now is that they've been busted doing something with a game, Crysis, to make it run faster and get a better score when tested as a benchmark. What's to say they haven't done something to all benchmarks, and games for that matter, to make it look as if their card is faster and better?

I think you're missing my point, which is:

If it runs faster and the image quality is comparable to the competition, who gives a shit what they did?

That's what I am trying to figure out: whether they are doing this to everything. If so, it's lying and deceiving, and they're not as far ahead of ATI as everyone is led to believe, when ATI's drivers, without any cheats or "optimizing", give you a more feature-rich, eye-candy experience because no corners were cut to make it run faster, get a better score, sell more cards, and make more money.

....Other miscellaneous ranting of questionable sanity....


Look, that's all well and good, but show me some evidence. Where does the image quality suffer? I want to see comparison screens. Every review I've seen (many of which have custom benchmark timedemos produced by the reviewer, not built in ones shipped with the game) has shown image quality to be, for all practical purposes, identical and the 8800GT ahead in performance.

If you have some contrarian screens or benchmarks, then by all means let's see them, but if you don't, exactly what are you ranting about?

OK

Here they've been busted cheating in a GAME DEMO. Now what's to say they haven't done more across the board, in all the games, to cheat at the expense of graphics quality?
And again, this isn't about them doing it to make the games run better for you. They do this to inflate their scores so that people like us, when we buy new graphics hardware, go "DAMN! Look what it got in 3DMark and game xxx vs. ATI, it kicks ass," when all of it was a lie to the consumer, because they "optimized" their drivers to render less and make the card look faster.

That is my beef here, people: not just in 3DMark, but they are doing it in games and got busted. Is the 8800GT as fast as we are led to believe? Again, I'm not a fan of either ATI or Nvidia, just of my money, and I hate being deceived or lied to when I make my decisions because they want to look good in the spotlight, inflate scores, and sell more cards because of it, all at the cost of the visual enjoyment I paid for when I bought the game. I am sure the developers of the games didn't ask to have their visuals messed with; they wouldn't have bothered writing all that code if a GRAPHICS maker was going to find a way around it to make their card faster, or seem faster, just to sell more of them. It's comments like "who gives a shit what they do" that let them do whatever they want, because people don't care; as long as they are fed BS and think what they read is true, just give it to them. I mean, hell, I went and bought the card based on all the reviews I read, which was their plan if they really are cheating in more than just Crysis. And I don't need FRAPS to see slowdown; I don't need a program to tell me FPS, I can see it when it gets low enough, since I know what I'm looking for.

Again, I will see when the new card shows up. The "problems" I saw in 3DMark were not driver or setup issues. It is a fresh install, and I have been building computers for over 10 years, some of them for big businesses back up north where I lived, such as Verizon Wireless. I may not write well, or get my point across when writing, but I do know what the hell I am doing. And again, I used 3DMark as my EXAMPLE of what I found between the two cards I own: I see the same damn slowdown in the same spots, looking the same to the eye, but one card shows ~20 FPS while the other, showing the same kind of slowdown, reports over 40 FPS when you can clearly see it's not doing that. That 8800GT OC should have spanked my X1900GT and not shown any slowdown at all at 1024x768, stock settings, if it is as badass as I was led to believe.
 

funboy6942

Lifer
Nov 13, 2001
15,290
389
126
Originally posted by: keysplayr2003
Originally posted by: Buck Armstrong
Wait...people still use 3DMark? And they actually use it to decide which card to buy?!

Let's assume for a second that they are cheating to gain higher scores in some irrelevant synthetic application... what am I supposed to do, go out and buy an ATI card? Please. I have better things to do with my computer than spend hours trying to get each game to work with broken drivers, on a card with a huge brick sh*thouse of a HSF squatting on top of it.

I realize this might be unpopular, but let's just be honest here: they had the 9700 and 9800, and that was it. And those cards were the exception, not the rule. They sucked before those two cards, and they've sucked since those two cards.

That's right, I said it. :p

Yep, you said it. "IT," however, is incorrect. Contrary to your slightly bent beliefs, ATI has had several very good graphics cards since the 9700/9800s, superior in some ways and inferior in others. But all along, they have had good products.

And, I highly doubt that any knowledgeable computer user would base their graphics card purchase on 3DMark06 these days. The ones that don't know any better might, but most rely on actual gaming benchmarks.

And I don't. I was looking at the scores for all the games I own in all the reviews before I bought an 8800GT over the HD3870 in the first place, because I'm a fan of my money, not of any graphics card maker. And call me silly, but 3DMark05 and my NFS:MW game didn't look nearly as nice running on my 8800GT vs. my X1900GT. My beef is what I said before: at stock speeds, I run the test after an install to make sure my PC is running right and to look for graphical errors. It's a way to avoid having to play a game and hunt for anything out of the ordinary. That's how I noticed the slowdowns between the two cards running 3DMark: I was paying very close attention looking for graphical errors, and was like "WTF, 40+ FPS? That shit is a slideshow, homie please" :p
Then I stuck my old card back into my new system and saw the same slowdown, with it reporting a much different FPS, which is about what I was estimating by eye.
 

funboy6942

Lifer
Nov 13, 2001
15,290
389
126
OH YEAH, and the Crysis thing wasn't a bug, because changing the name of the demo to anything but Crysis.exe made the 8800GT run slower and look "normal".

If you read up on the problem: if it was a bug, renaming the demo from its normal name wouldn't have mattered, and the bug would still have been there. It changed because the driver was tricked. The "optimizations" didn't kick in because the driver didn't think it was running the Crysis.exe demo but something else, so it had to do its job properly instead of cheating to get more FPS and sell more cards.

A bug is a bug is a bug. Changing the name of a demo won't change the way it should be rendered unless the driver knew it was running that demo and had to do something to kick the FPS up, something you might not notice right off hand, or would shrug off with "everything up close looks good, why do I care what the background looks like, wee, I'm at 30 FPS now even if it looks like shit; at least I have more FPS, because that's better than it looking kick-ass and really enjoying the game and the visual eye candy." Also, my point is this: if they did this to Crysis, and it was only caught because someone noticed it didn't look right and was really looking for a difference between the cards, what's to say they didn't do this to ALL the other demos, or games, but hid it better so you couldn't see with your eye that something else was taking place to increase a score or FPS? Just because you can't see it doesn't make it right, and IT IS still cheating to make more sales.

And again, if you're going to say "fuck it, I don't care," why did you buy the card at the expense of visual candy, if that's what they are doing to make it seem faster? At that point you should have kept your older card and run at a lower resolution with more eye candy on.
 

BassBomb

Diamond Member
Nov 25, 2005
8,396
1
81
Why is everyone getting pissed about a review that used BETA drivers, which had an issue that's already been fixed in newer drivers?

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: funboy42
OH YEAH, and the Crysis thing wasn't a bug, because changing the name of the demo to anything but Crysis.exe made the 8800GT run slower and look "normal".

If you read up on the problem: if it was a bug, renaming the demo from its normal name wouldn't have mattered, and the bug would still have been there. It changed because the driver was tricked. The "optimizations" didn't kick in because the driver didn't think it was running the Crysis.exe demo but something else, so it had to do its job properly instead of cheating to get more FPS and sell more cards.

A bug is a bug is a bug. Changing the name of a demo won't change the way it should be rendered unless the driver knew it was running that demo and had to do something to kick the FPS up, something you might not notice right off hand, or would shrug off with "everything up close looks good, why do I care what the background looks like, wee, I'm at 30 FPS now even if it looks like shit; at least I have more FPS, because that's better than it looking kick-ass and really enjoying the game and the visual eye candy." Also, my point is this: if they did this to Crysis, and it was only caught because someone noticed it didn't look right and was really looking for a difference between the cards, what's to say they didn't do this to ALL the other demos, or games, but hid it better so you couldn't see with your eye that something else was taking place to increase a score or FPS? Just because you can't see it doesn't make it right, and IT IS still cheating to make more sales.

And again, if you're going to say "fuck it, I don't care," why did you buy the card at the expense of visual candy, if that's what they are doing to make it seem faster? At that point you should have kept your older card and run at a lower resolution with more eye candy on.

So how does your GT look in Crysis today? How does it perform? Anything out of the ordinary? Do you notice things not being rendered? I don't. But please enlighten me. Tell me what I am missing. I never had to rename Crysis.exe, but maybe the driver I am using already contained the bug fix. Or maybe the patch I used for Crysis already fixed it. Again, tell me how it wasn't a bug? In detail. Do you know of anyone who had to change the name of the exe in anything other than the demo? Key word, "DEMO" ?????

So, I don't get it guys. Nvidia offers great cards that have great performance and great image quality, and you still cry cheat? How come?


EDIT: Note the date on all this ancient "evidence" and dig a little deeper. There was an issue with IQ on a beta driver; it was fixed, and performance stayed the same or went up.

BeHardware Link

"Note that there was some polemic regarding a bug in Nvidia's 169.04 driver related to deformed reflections on the water surfaces, which weren't updated often enough. Because the reflection was calculated less often, performance improved, at least in certain conditions such as 1920x1200 with 4x AA in DirectX 10 mode (+30%). In the beginning we thought this was a bug in the game itself; however, it turned out that renaming the executable so that the Nvidia driver could no longer detect it corrected the problem. For this reason, this bug seemed a bit suspicious.

We contacted Nvidia about this and they confirmed the presence of the bug and the related gain in performance it could add. Nvidia provided us with a driver that no longer has this problem (169.05), which of course we used here. They also gave precise information on the bug in question so that there would be no further confusion on the subject."
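For what it's worth, this kind of per-application behaviour is normally the driver keying a profile off the executable name, which is exactly why renaming the .exe changes the result. A rough sketch of the idea in C++ (hypothetical names and structure, not NVIDIA's actual driver code):

#include <iostream>
#include <map>
#include <string>

// Hypothetical per-game toggles a driver might key off the executable name.
// Purely illustrative; the names and layout here are made up.
struct AppProfile {
    bool reduceWaterReflectionUpdates;  // the Crysis water behaviour described above
    bool sliWorkarounds;
};

AppProfile profileFor(const std::string& exeName) {
    static const std::map<std::string, AppProfile> profiles = {
        {"crysis.exe", {true, true}},
    };
    auto it = profiles.find(exeName);
    // A renamed executable falls through to the generic defaults, which is
    // why both the rendering and the performance change with the file name.
    return it != profiles.end() ? it->second : AppProfile{false, false};
}

int main() {
    std::cout << profileFor("crysis.exe").reduceWaterReflectionUpdates << "\n";   // 1
    std::cout << profileFor("renamed.exe").reduceWaterReflectionUpdates << "\n";  // 0
}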
 

ja1484

Platinum Member
Dec 31, 2007
2,438
2
0
Originally posted by: Quiksilver
Adding Fuel To This Fire

Image quality of the 3870 compared to the 8800GT looks 10 times better, but since it was done with a) a demo, b) older drivers, and c) different altitudes, I don't know if any of it has changed.


Just...shut up... seriously.

Do not roll in here, posting 1 screenshot from some no-name hardware site that looks like they ran the game on two different gamma settings. At that link above you can find these nice shots from HardOCP:

[five HardOCP comparison screenshots linked here]

If anything, Nvidia might have the edge in Crysis. Look at the fourth screenshot from the top in this list. Look at the rocks sticking out of the water in the upper left of the screenshot. Look at their reflection. The 8800GT has a noticeably more precise rendering than the 2900XT.

Finally we can look at ATI versus NVIDIA image quality to see if there are any differences. We are going to run the 2900 XT and 8800 GT at "Very High" setting in DX10 to see if there are any differences with all the features enabled.

Out of all the screenshots the only one we noticed anything different on is the fourth picture comparing depth of field. If you look at the rock in the upper left of the image it has a shadow reflecting off of the water. There is a difference in this shadow's quality; on the Radeon HD 2900 XT it appears like depth of field is working on it to make it appear out of focus, but on the 8800 GT this shadow is much sharper in detail. We are unsure which is "correct," but it is a small image quality difference that is rather hard to notice when you are actually playing the game.


Bottom line: As far as I'm concerned, nobody's cheating. Every image quality comparo I've seen puts image quality on even footing.


Originally posted by: funboy42

OK

Here they've been busted cheating in a GAME DEMO. Now what's to say they haven't done more across the board, in all the games, to cheat at the expense of graphics quality?
And again, this isn't about them doing it to make the games run better for you. They do this to inflate their scores so that people like us, when we buy new graphics hardware, go "DAMN! Look what it got in 3DMark and game xxx vs. ATI, it kicks ass," when all of it was a lie to the consumer, because they "optimized" their drivers to render less and make the card look faster.

That is my beef here, people: not just in 3DMark, but they are doing it in games and got busted. Is the 8800GT as fast as we are led to believe? Again, I'm not a fan of either ATI or Nvidia, just of my money, and I hate being deceived or lied to when I make my decisions because they want to look good in the spotlight, inflate scores, and sell more cards because of it, all at the cost of the visual enjoyment I paid for when I bought the game. I am sure the developers of the games didn't ask to have their visuals messed with; they wouldn't have bothered writing all that code if a GRAPHICS maker was going to find a way around it to make their card faster, or seem faster, just to sell more of them. It's comments like "who gives a shit what they do" that let them do whatever they want, because people don't care; as long as they are fed BS and think what they read is true, just give it to them. I mean, hell, I went and bought the card based on all the reviews I read, which was their plan if they really are cheating in more than just Crysis. And I don't need FRAPS to see slowdown; I don't need a program to tell me FPS, I can see it when it gets low enough, since I know what I'm looking for.

Again, I will see when the new card shows up. The "problems" I saw in 3DMark were not driver or setup issues. It is a fresh install, and I have been building computers for over 10 years, some of them for big businesses back up north where I lived, such as Verizon Wireless. I may not write well, or get my point across when writing, but I do know what the hell I am doing. And again, I used 3DMark as my EXAMPLE of what I found between the two cards I own: I see the same damn slowdown in the same spots, looking the same to the eye, but one card shows ~20 FPS while the other, showing the same kind of slowdown, reports over 40 FPS when you can clearly see it's not doing that. That 8800GT OC should have spanked my X1900GT and not shown any slowdown at all at 1024x768, stock settings, if it is as badass as I was led to believe.


Did you have to work at being retarded, or did it come naturally? The items in the link you posted have already been addressed and fixed. It's 3 months old, and two driver releases old. Look at the HardOCP review I linked and its associated image quality comparisons.

I answered all these criticisms, and you just restated your previous post with the same flawed points.

You seem incapable of understanding that if image quality is the same and performance is better, nothing is cheating. In that instance, it's called "a better way of doing things". That's how technological advancements occur, you twit.

It's like you claiming that my car is cheating in a race because it has a more powerful engine. The problem isn't my car's engine, it's your feeble fucking brain.


 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Not to add more fuel to the fire, but HardOCP is the last source I would trust as far as image quality is concerned. They blatantly stated that the Nvidia 7-series cards did not have any IQ issues at the default quality, when many people, including myself, witnessed the ugly shimmering that resulted from their texture filtering optimizations.
 

xj0hnx

Diamond Member
Dec 18, 2007
9,262
3
76
My 3DMark06 score went from 5966 with...

AMD 3500+
BFG 8800GT

to 11831 with...

Q6600
same BFG 8800GT


3DMark06 at least, and I believe the same holds true for 05, is too dependent on the CPU to be a good VGA benchmark. I am happy with the fact that I can play Crysis at playable framerates on high; I hope nVidia cheats more.
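For what it's worth, the CPU sensitivity is baked into the scoring formula. As I recall Futuremark's whitepaper, the overall score is a weighted harmonic mean of the graphics and CPU sub-scores, so a quad-core swap moves the total a lot even with the same card. A rough sketch with made-up sub-scores (treat the constants as approximate):

#include <cstdio>

// Approximate 3DMark06 weighting, recalled from Futuremark's whitepaper;
// the exact constants may differ, so treat this as illustrative only.
// Overall = 2.5 / ((1.7/graphics + 0.3/cpu) / 2), a weighted harmonic mean.
double overall(double graphicsScore, double cpuScore) {
    return 2.5 / ((1.7 / graphicsScore + 0.3 / cpuScore) / 2.0);
}

int main() {
    const double gpu = 4500.0;  // hypothetical 8800 GT graphics sub-score
    std::printf("single-core CPU (~700):  %.0f\n", overall(gpu, 700.0));   // roughly 6200
    std::printf("quad-core CPU (~3200):   %.0f\n", overall(gpu, 3200.0));  // roughly 10600
}

Same graphics sub-score in both cases; only the CPU number changes, and the total nearly doubles, much like the jump above.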
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: funboy42
Originally posted by: ja1484
Originally posted by: funboy42
But my point now is that they've been busted doing something with a game, Crysis, to make it run faster and get a better score when tested as a benchmark. What's to say they haven't done something to all benchmarks, and games for that matter, to make it look as if their card is faster and better?

I think you're missing my point, which is:

If it runs faster and the image quality is comparable to the competition, who gives a shit what they did?

That's what I am trying to figure out: whether they are doing this to everything. If so, it's lying and deceiving, and they're not as far ahead of ATI as everyone is led to believe, when ATI's drivers, without any cheats or "optimizing", give you a more feature-rich, eye-candy experience because no corners were cut to make it run faster, get a better score, sell more cards, and make more money.

....Other miscellaneous ranting of questionable sanity....


Look, that's all well and good, but show me some evidence. Where does the image quality suffer? I want to see comparison screens. Every review I've seen (many of which have custom benchmark timedemos produced by the reviewer, not built in ones shipped with the game) has shown image quality to be, for all practical purposes, identical and the 8800GT ahead in performance.

If you have some contrarian screens or benchmarks, then by all means let's see them, but if you don't, exactly what are you ranting about?

OK

Here they've been busted cheating in a GAME DEMO. Now what's to say they haven't done more across the board, in all the games, to cheat at the expense of graphics quality?
And again, this isn't about them doing it to make the games run better for you. They do this to inflate their scores so that people like us, when we buy new graphics hardware, go "DAMN! Look what it got in 3DMark and game xxx vs. ATI, it kicks ass," when all of it was a lie to the consumer, because they "optimized" their drivers to render less and make the card look faster.

That is my beef here, people: not just in 3DMark, but they are doing it in games and got busted. Is the 8800GT as fast as we are led to believe? Again, I'm not a fan of either ATI or Nvidia, just of my money, and I hate being deceived or lied to when I make my decisions because they want to look good in the spotlight, inflate scores, and sell more cards because of it, all at the cost of the visual enjoyment I paid for when I bought the game. I am sure the developers of the games didn't ask to have their visuals messed with; they wouldn't have bothered writing all that code if a GRAPHICS maker was going to find a way around it to make their card faster, or seem faster, just to sell more of them. It's comments like "who gives a shit what they do" that let them do whatever they want, because people don't care; as long as they are fed BS and think what they read is true, just give it to them. I mean, hell, I went and bought the card based on all the reviews I read, which was their plan if they really are cheating in more than just Crysis. And I don't need FRAPS to see slowdown; I don't need a program to tell me FPS, I can see it when it gets low enough, since I know what I'm looking for.

Again, I will see when the new card shows up. The "problems" I saw in 3DMark were not driver or setup issues. It is a fresh install, and I have been building computers for over 10 years, some of them for big businesses back up north where I lived, such as Verizon Wireless. I may not write well, or get my point across when writing, but I do know what the hell I am doing. And again, I used 3DMark as my EXAMPLE of what I found between the two cards I own: I see the same damn slowdown in the same spots, looking the same to the eye, but one card shows ~20 FPS while the other, showing the same kind of slowdown, reports over 40 FPS when you can clearly see it's not doing that. That 8800GT OC should have spanked my X1900GT and not shown any slowdown at all at 1024x768, stock settings, if it is as badass as I was led to believe.

it certainly *can* be your setup ... NO ONE else sees it :p
-there are no CURRENT trustworthy reviews supporting your views
or ... i am more inclined to believe - it is what you want to see

Did you even bother to run FRAPS? to confirm that 3DMark05's FPS counter is SCREWED UP?
:roll:

you appear to WANT to just diss nvidia
- you know - Like a Fanboy
:music:
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Another thing I should mention is that I've personally witnessed cases where the framerate stays at a constant 60 fps, and yet in motion it appears jerky, like it was in the 20's. I work on my own 3D apps and games, and a certain combination of code and settings will cause this to happen. In my case, it's related to whether vsync is on, and how you finish the rendering pass before swapping the front and back buffers. I've only noticed this recently when I switched to an 8800GT; I've never seen it on ATI cards. But that's not to say that Nvidia is cheating, because these are my own apps, and I actually measure the framerate inside the app. A minor change in the code resolves the issue, so I don't attribute this behavior to any cheating.
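To illustrate the kind of thing I mean, here is a minimal GLFW/OpenGL sketch of the symptom (illustrative only, not my actual app). The loop counts buffer swaps per second, so it happily reports a steady 60 fps, yet with the sync call left out the driver is free to queue several frames and present them unevenly, so motion can still look jerky; uncommenting the glFinish() is the kind of "minor change" that evens out the pacing.

#include <GLFW/glfw3.h>
#include <cstdio>

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow* win = glfwCreateWindow(800, 600, "pacing test", nullptr, nullptr);
    if (!win) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(win);
    glfwSwapInterval(1);                    // vsync on

    int frames = 0;
    double last = glfwGetTime();
    while (!glfwWindowShouldClose(win)) {
        glClear(GL_COLOR_BUFFER_BIT);       // ...draw the scene here...

        // glFinish();                      // draining the GPU here evens out frame
                                            // pacing, at the cost of CPU/GPU overlap
        glfwSwapBuffers(win);
        glfwPollEvents();

        ++frames;                           // counting swaps reports a steady "60 fps"
        if (glfwGetTime() - last >= 1.0) {  // whether or not the motion looks smooth
            std::printf("%d fps\n", frames);
            frames = 0;
            last = glfwGetTime();
        }
    }
    glfwTerminate();
    return 0;
}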
 

hooflung

Golden Member
Dec 31, 2004
1,190
1
0
Originally posted by: Cookie Monster
Funny thing is how R600/RV670 cards score much higher in 3DMark than the G80/G92 counterparts.

The situation is totally reversed in real world performance.

That is an easy one to answer, and it's not so funny. It's actually just a numbers game. On paper the AMD technology is just better in nearly every possible way. The problem with the benchmark programs is that they are very GPU-neutral and conform to the DX API very closely.

In games where the developer is using its own engine, or a custom licensed engine, there are features that couldn't be achieved using pure API features. The developer then has to figure out how each card can execute something, and in many cases Nvidia works a bit more closely with game companies. Nvidia is very much a brute-force technology. It's not fancy, it's not even groundbreaking, but it works and usually works well. ATI is very elegant and groundbreaking. They are revisionists as well. They just get less developer support, especially with the R6xx shader language.

The problem is that ATI needs developers to adopt their way of doing things to wring out that performance. Many times a game on ATI follows the standard DX path, where on Nvidia it would be following an NV-specific path. The fact that ATI can even compete with Nvidia while adhering to the plain DX spec half the time and getting no love from developers is amazing in itself.
 

Quiksilver

Diamond Member
Jul 3, 2005
4,726
0
71
Originally posted by: ja1484
Just...shut up... seriously.

Do not roll in here, posting 1 screenshot from some no-name hardware site that looks like they ran the game on two different gamma settings. At that link above you can find these nice shots from HardOCP:

[five HardOCP comparison screenshots linked here]

If anything, Nvidia might have the edge in Crysis. Look at the fourth screenshot from the top in this list. Look at the rocks sticking out of the water in the upper left of the screenshot. Look at their reflection. The 8800GT has a noticeably more precise rendering than the 2900XT.

Finally we can look at ATI versus NVIDIA image quality to see if there are any differences. We are going to run the 2900 XT and 8800 GT at "Very High" setting in DX10 to see if there are any differences with all the features enabled.

Out of all the screenshots the only one we noticed anything different on is the fourth picture comparing depth of field. If you look at the rock in the upper left of the image it has a shadow reflecting off of the water. There is a difference in this shadow's quality; on the Radeon HD 2900 XT it appears like depth of field is working on it to make it appear out of focus, but on the 8800 GT this shadow is much sharper in detail. We are unsure which is "correct," but it is a small image quality difference that is rather hard to notice when you are actually playing the game.


Bottom line: As far as I'm concerned, nobody's cheating. Every image quality comparo I've seen puts image quality on even footing.

Did you have to work at being retarded, or did it come naturally? The items in the link you posted have already been addressed and fixed. It's 3 months old, and two driver releases old. Look at the HardOCP review I linked and its associated image quality comparisons.

I answered all these criticisms, and you just restated your previous post with the same flawed points.

You seem incapable of understanding that if image quality is the same and performance is better, nothing is cheating. In that instance, it's called "a better way of doing things". That's how technological advancements occur, you twit.

It's like you claiming that my car is cheating in a race because it has a more powerful engine. The problem isn't my car's engine, it's your feeble fucking brain.
Talk about not reading my posts, after the one where I stated that I posted it just so others might bother to compare the two. And backing up your view with last-gen cards running old drivers doesn't prove anything to me, because it has nothing to do with the card series I was talking about... Oh, and so much for "don't flame others".
 

BFG10K

Lifer
Aug 14, 2000
22,674
2,824
126
With the 8xxx series I have not seen any evidence of nVidia cheating. I mean why would they? They have the best AF and AA in the business plus they've been top-dog since November 2006.

As it pans back along the line of fire, my 8800GT OC was showing 40+ FPS, but anyone with an eye can tell that is BS, because it is very jittery, as if it is doing less than 20; not a smooth frame rate at all.
I don't think it's the driver that reports the framerate but rather the application. What you describe could be an issue with 3DMark's framerate counter or 3DMark's renderer in general.

Looks as though I wasn't the only one who took notice that something just didn't look right.
The Crytek water issue was found to be a bug due to application-specific optimizations (yeah, I don't like them either, but both vendors do them now).

Basically it had something to do with Crysis only calculating the water surface when absolutely necessary; this interfered with SLI, so nVidia made certain assumptions to work around the issue, assumptions which later turned out not to be completely robust.

The issue was later fixed and performance went up too.
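On the counter question: a quick sanity check is to log individual frame times yourself instead of trusting an average. A self-contained sketch (simulated workload, not FRAPS or 3DMark code) showing how an average of roughly 43 fps can coexist with 200 ms hitches that look exactly like the slideshow being described:

#include <algorithm>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    using frame_clock = std::chrono::steady_clock;
    std::vector<double> frameMs;

    auto prev = frame_clock::now();
    for (int frame = 0; frame < 300; ++frame) {
        // Simulated render: mostly 20 ms frames with a 200 ms hitch every 60th frame.
        std::this_thread::sleep_for(std::chrono::milliseconds(frame % 60 == 0 ? 200 : 20));

        auto now = frame_clock::now();
        frameMs.push_back(std::chrono::duration<double, std::milli>(now - prev).count());
        prev = now;
    }

    double total = 0.0;
    for (double ms : frameMs) total += ms;               // sum of all frame times
    double avgFps = 1000.0 * frameMs.size() / total;     // what an average-fps counter shows
    double worst = *std::max_element(frameMs.begin(), frameMs.end());

    std::printf("average: %.1f fps, worst frame: %.1f ms\n", avgFps, worst);
    return 0;
}

The average alone looks fine; the worst-frame number is what actually corresponds to the visible stutter.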
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: funboy42
OH YEAH, and the Crysis thing wasn't a bug, because changing the name of the demo to anything but Crysis.exe made the 8800GT run slower and look "normal".

If you read up on the problem: if it was a bug, renaming the demo from its normal name wouldn't have mattered, and the bug would still have been there. It changed because the driver was tricked. The "optimizations" didn't kick in because the driver didn't think it was running the Crysis.exe demo but something else, so it had to do its job properly instead of cheating to get more FPS and sell more cards.

A bug is a bug is a bug. Changing the name of a demo won't change the way it should be rendered unless the driver knew it was running that demo and had to do something to kick the FPS up, something you might not notice right off hand, or would shrug off with "everything up close looks good, why do I care what the background looks like, wee, I'm at 30 FPS now even if it looks like shit; at least I have more FPS, because that's better than it looking kick-ass and really enjoying the game and the visual eye candy." Also, my point is this: if they did this to Crysis, and it was only caught because someone noticed it didn't look right and was really looking for a difference between the cards, what's to say they didn't do this to ALL the other demos, or games, but hid it better so you couldn't see with your eye that something else was taking place to increase a score or FPS? Just because you can't see it doesn't make it right, and IT IS still cheating to make more sales.

And again, if you're going to say "fuck it, I don't care," why did you buy the card at the expense of visual candy, if that's what they are doing to make it seem faster? At that point you should have kept your older card and run at a lower resolution with more eye candy on.

Or maybe it was a bug in the game-specific optimization. The drivers ARE loading game-specific optimizations... that improves performance (on SLI especially)...
Maybe it was on purpose, but it was there in one beta release and gone in the next; it never made it into any final release... but it COULD have been on purpose...
 
May 8, 2007
86
0
0
You guys all need a tinfoil hat. This tells us NOTHING about the 3D rendering and EVERYTHING about the exceptions that the drivers make when operating in various runtime environments. I imagine it's a lot easier to label games/runtime environments by the .exe they NORMALLY run through, as opposed to some other means. Granted, it may not be as robust, but it's not something I tend to flip a nut about.

The drivers show an improvement overall from the originals REGARDLESS of the naming of the file, which speaks to improvements made in general to the drivers for gaming. The other artifacts that we see only with the crysis.exe file are simply optimizations made specifically for that game. It's a BETA driver, so I imagine they'll try to see how performance enhancements affect image quality and optimize the hardware.

The idea of refreshing the reflection angle seems fairly benign to me because it's simply dealing with the sampling rate of the human eye. For anyone who is an engineer and can go into detail about the finer points of image reconstruction, please do. I'm not going to pretend to be an armchair engineer like some of you here.



PS: Overall, I find the GT 512MB tests in the article to be silly because the GT is only run in the beta-driver environment. Moreover, it is apparent that one of the graphs is doubled, as the GTS 640MB beta-driver test is used twice (both have the exact same numbers and title, but their captions are radically different, hmmmm).