
GeForce 6800 Ultra / ForceWare 60.72 cheatin' rulez again!

Since there are no cards available at retail yet, judging IQ is a bit premature. Not to mention most IQ these days is entirely a matter of opinion.
 
You must be blind if you can't see that ATI is very similar to the raster image, while Nvidia is farther away from both the raster image and ATI. You really can't see the difference?

What's the point of contacting ATI/Nvidia? They are in the business of making money.

And your link doesn't work.
 
Originally posted by: Schadenfroh
Like I said, when you have to apply special colouring effects and look under a microscope to tell a difference, it's just because you are a fanATIc. Above all, Nvidia DOES NOT NEED to cheat with the NV40; it's got enough power.
 
The point of PS3.0 isn't just image quality (fp32). It has to do with having longer shader programs, instancing (the same object only has to be processed once in the case of repeating objects), and giving developers the freedom to write branching and loops (if, else, etc). What does this all do? The same thing DX8, DX9 and every other major update to DX does: it makes it easier on the card to draw the same scene. And if it is easier to draw the same scene, then games can have more complex scenes with more textures, more shaders, or even AA or AF turned on.
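To make the instancing part concrete, here is a minimal sketch of how SM3.0-era hardware instancing is driven through the D3D9 API. The function and parameter names are illustrative, and it assumes the vertex declaration, index buffer, and the two vertex streams (the shared mesh in stream 0, per-instance transforms in stream 1) were bound elsewhere:

```cpp
#include <d3d9.h>

// Illustrative sketch of D3D9 hardware instancing (an SM3.0-era feature).
// Assumes 'dev' is a valid IDirect3DDevice9* with streams already bound.
void DrawInstanced(IDirect3DDevice9* dev, UINT numInstances,
                   UINT numVertices, UINT numTriangles)
{
    // Stream 0: the shared geometry, repeated numInstances times.
    dev->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numInstances);
    // Stream 1: one element of per-instance data (e.g. a transform) per copy.
    dev->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);

    // One draw call submits every copy of the object.
    dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0,
                              numVertices, 0, numTriangles);

    // Restore the default (non-instanced) stream frequencies.
    dev->SetStreamSourceFreq(0, 1);
    dev->SetStreamSourceFreq(1, 1);
}
```

That single draw call is the "processed once" saving: the CPU submits the repeating object one time instead of once per copy.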

As far as your first link is concerned, ATI's card is more like the reference design in MOST situations. How about the other planes in the scene? How about the one on the left in the back? Nvidia has areas on that plane that are more like the reference design than ATI's. There are situations, though, where both cards draw something "different" from the reference design (ex: right above the right of the gun in the 2nd test). Why would anyone care about all of this? If it is close enough that I enjoy both pictures equally, and knowing that it is IMPOSSIBLE to get the exact same picture as the reference design, then who the hell cares!? Maybe Nvidia's looks BETTER than ATI's AND the reference design??? I know it is close enough that I wouldn't ever complain owning the 6800 in comparison to ATI's 9800 series.

I see this situation as being similar to processors: if they both produce the same results in every sense that I can measure, then it just comes down to who does it faster.
 
Originally posted by: TimisoaraKill
Originally posted by: CaiNaM
unfortunately you have no clue what you are talking about, and your "followup" post only reinforces that fact.
Some more stuff m8 to reinforce your Nvidia trust:
http://www.theinquirer.net/?article=15502
-----
did you even read either article? the first was FUD, and the second was nothing more than reporting (referencing) the first... it has no content whatsoever. so what, exactly, is the point in linking to the inq article?
 
Originally posted by: CaiNaM
did you even read either article? the first was FUD, and the second was nothing more than reporting (referencing) the first... it has no content whatsoever. so what, exactly, is the point in linking to the inq article?

"We said that what we saw when we were in Geneva with Nvidia was a clear difference between the code that was running the PS 3.0 path and a referred system that was running PS 2.0 code.

ATI says that CryTek's representative told it that what Nvidia showed us in Geneva was a 2.0/3.0 path versus a 1.1 path."
 
Futuremark deemed the 60.72 drivers compliant, and this dude at Driver Heaven is saying they're wrong?

FM is an idiot. Just look at their software: how is it supposed to test real-world game performance when it just tests the video card? Real-world games use the CPU too.

I was going to say that you should know better than to post something like this in this forum, especially with the word "cheat" in the title. But then I looked at the RIG in your sig. It all makes sense now.
Don't stereotype.

MOST of the replies chose the top picture as having the better IQ, and one even outright said the top picture was of an ATI Radeon with 6xAA!! But the top picture is actually the GF 6800 U.
Nvidia's IQ, especially when not moving, looks similar to if not better than the 9800's with certain options enabled (the 6800 only). Even with cheats; but cheating isn't the right way to go. This isn't about the IQ looking crappy, it is about trying to sell something that doesn't do what it's supposed to.

The 6800 is too new a product to start ripping on for hacks; even I, an ATI fanatic, realise this. Once the card is released for sale to the public, and several drivers have been available for it, then you can start pointing the IQ finger, but not till then, IMHO.
This may be correct, but this cheating gives them a good first impression with those who don't know they are cheating, as well as with people just using these benchmarks for comparison with ATI's new card instead of benching again with newer drivers. Some sites are that lazy.

What I really hate about ATI fans is that they will find any excuse to slag off Nvidia products. However, when you ask them what chipset their motherboard is using, they usually go very quiet, because they are using an nForce2. Isn't that a conflict of interest?
That is a different story. I use an nForce because they kick ass, although I could always go for a KT600. Graphics is another market.

Both images differ from the reference image. What defines "cheating"?
It's always gonna be a little different, but Nvidia is using this totally different technique.

1. A section of code that relies on pre-calculated knowledge (ex.: algorithmic detection, application detection, shader detection, etc.) to work.

2. A section of code that, for all possible inputs, purposefully alters one or more outputs to incorrect values.
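Point 2 is what the screenshot comparisons in this thread are trying to catch. As a rough illustration (a hypothetical helper, not the tool DriverHeaven actually used), a per-pixel comparison of a card's output against the reference image might look like this:

```cpp
#include <cstddef>
#include <cstdint>

// Fraction of pixels whose RGB values differ from the reference by more
// than 'tolerance'. Both buffers are assumed to be tightly packed 8-bit
// RGB screenshots of identical dimensions.
double DiffFraction(const uint8_t* img, const uint8_t* ref,
                    std::size_t width, std::size_t height, int tolerance)
{
    std::size_t differing = 0;
    const std::size_t pixels = width * height;
    for (std::size_t i = 0; i < pixels; ++i) {
        const int dr = int(img[3 * i + 0]) - int(ref[3 * i + 0]);
        const int dg = int(img[3 * i + 1]) - int(ref[3 * i + 1]);
        const int db = int(img[3 * i + 2]) - int(ref[3 * i + 2]);
        // Count a pixel as "different" only beyond a small tolerance,
        // since no hardware matches the reference rasterizer exactly.
        if (dr * dr + dg * dg + db * db > tolerance * tolerance)
            ++differing;
    }
    return pixels ? double(differing) / double(pixels) : 0.0;
}
```

A small nonzero result is expected from any card; the argument in this thread is over how large a difference counts as "incorrect."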

I think UT/UT2k4 would be the best tests for mipmap detail, as you can clearly see when there are issues, especially on maps with grate-type textures.
UT2k4, with its big, open environments. I am about to post an anomaly with 16x AF in a bit.
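For anyone who wants to check mipmap behaviour directly, the usual trick is colored mipmaps: fill each mip level of a texture with a distinct solid color so the LOD transitions (and any filtering shortcuts) become plainly visible in-game. A rough D3D9 sketch, using a hypothetical helper and assuming a lockable 32-bit ARGB texture (e.g. created in the managed pool):

```cpp
#include <d3d9.h>

// Tint every mip level of a texture a different solid color so that
// mip transitions stand out when the texture is rendered in-game.
void ColorizeMipLevels(IDirect3DTexture9* tex)
{
    static const D3DCOLOR colors[6] = {
        0xFFFF0000, 0xFF00FF00, 0xFF0000FF,   // red, green, blue,
        0xFFFFFF00, 0xFFFF00FF, 0xFF00FFFF    // yellow, magenta, cyan
    };
    const DWORD levels = tex->GetLevelCount();
    for (DWORD level = 0; level < levels; ++level) {
        D3DSURFACE_DESC desc;
        tex->GetLevelDesc(level, &desc);
        D3DLOCKED_RECT lr;
        if (FAILED(tex->LockRect(level, &lr, NULL, 0)))
            continue;                          // not lockable; skip
        const D3DCOLOR c = colors[level % 6];
        for (UINT y = 0; y < desc.Height; ++y) {
            D3DCOLOR* row = (D3DCOLOR*)((BYTE*)lr.pBits + y * lr.Pitch);
            for (UINT x = 0; x < desc.Width; ++x)
                row[x] = c;                    // flood the level with one color
        }
        tex->UnlockRect(level);
    }
}
```

With each level a different color, a driver that shifts LOD bias or blends levels differently than requested shows up immediately as bands moving in the scene.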

Well, let me tell you a thing or two. I owned a Ti4600 when I made my last video card investment. I first opted for the 5900NU like you, and was pretty firm in my purchase since I had never owned an ATI card before. Then I went to some websites to inform myself and found out that Nvidia's drivers reduce trilinear filtering to bilinear, that IQ is reduced by default for more FPS, that Nvidia's 4xAA looks like ATI's 2xAA, and so on. After learning all this, I went with ATI.
If I buy a new car with 180 HP advertised by the manufacturer, that is what I expect to have, not a car that normally has 150 HP but reaches 180 HP if you take out the seats, the hood and the bumpers to make it lighter.
The point is that people should know about this; not everybody likes being scammed. You might not be interested, but some people like me are.
Some things seem a bit exaggerated, but I think he's got a point.

what driverheaven did is create a massive prejudice against the 6800 when they don't even know what is going on yet. the responsible thing to do would have been to contact FM, NV & ATI (their image also differs from the reference image, after all) FIRST, before publishing that crap.
That would have been the nicer thing to do, instead of causing all this controversy, but he does believe that it isn't a bug.

Maybe Nvidia's looks BETTER than ATI's AND the reference design???
If the cards render the way they are supposed to, it should look like the reference design.
 
Since Futuremark approved the 60.72 driver, it's tough to say that nVidia is cheating. As for the Max Payne issue, it could just be a bug in the driver with the way it handles that lightmap in the centre of the screen.

If he had tried the anti-cheat utilities and found the images changed when he did, that would be serious evidence of nVidia cheating. But because he didn't, there's simply too little evidence to make a claim either way.
 
Originally posted by: TimisoaraKill
Originally posted by: CaiNaM

did you even read either article? the first was FUD, and the second was nothing more than reporting (referencing) the first... it has no content whatsoever. so what, exactly, is the point in linking to the inq article?

"We said that what we saw when we were in Geneva with Nvidia was a clear difference between the code that was running the PS 3.0 path and a referred system that was running PS 2.0 code.

ATI says that CryTek's representative told it that what Nvidia showed us in Geneva was a 2.0/3.0 path versus a 1.1 path."

umm.. ok.. and we all figured that out last week, and it was confirmed... your point?
 
I really think something needs clearing up here. CHEATING is not having a 4-pixel difference and distortion at the outer edges of the screen. Futuremark is not crap, either. If you want to stress your CPU at the same time, then run it at a lower resolution so it becomes CPU-bound, duh! Secondly, this whole thread started out biased towards ATI. Seriously, does anyone here care about a 4-pixel difference? Come on. I'll bet someone could find something wrong with ATI's picture too if they wanted to.

This isn't about the IQ looking crappy, it is about trying to sell something that doesn't do what it's supposed to.

What the hell is that supposed to mean, VIAN? All Nvidia and ATI cards do what they're supposed to do, but there are certain things that go wrong.

I think people shouldn't be so quick to jump on and insult Nvidia in particular, or ATI either. Both companies, if you step back and look at it, are doing very well.

Cheating...give me a break

-Kevin
 
The reason people are jumping on this is because Nvidia did it BEFORE. Futuremark even spoke out, scolding Nvidia for their cheating ways. I have no doubt that Nvidia would do it again.
 
quote:
I really think something needs clearing up here. CHEATING is not having a 4-pixel difference and distortion at the outer edges of the screen. [...] Cheating... give me a break.

-Kevin
This is not a four-pixel difference. You can see that the IQ is lowered; the image differs from the normal image by too much for it to just be an error. Unfortunately you can't do that with this test; try 2001SE, it's much better.

It means that they are trying to sell something that performs at this framerate when in fact it does so with lowered IQ. Any little thing can do wonders. Just look for benches of the performance settings in the drivers, where you can choose high performance, performance, quality and high quality. There is a big framerate difference going from high quality down to performance or even high performance, some 10-20 fps, but the IQ difference is sometimes barely noticeable unless you go all the way to high performance.
 
Originally posted by: VIAN
... and no more idiotic than someone who doesn't back it up.
lol.. back what up? there's nothing to back up.. there was nothing intelligent in your post for me to "disprove". but hey, if ya really wanna.. jeez vian, are you just a glutton for punishment? hmm.. where to start.

how about first and foremost: this is a beta card with beta drivers - which you obviously do not have access to, so all you can do is play lemming, follow along, and let someone else think for you. just another bandwagon for you to jump on.

and then there's what.. a dozen other sites putting the 6800 to the torture test, yet they don't see any issues with IQ (other than the far cry bug)? i suppose they're all dumbarses who simply cannot see what DH (who didn't even have a card to review) so "easily" finds?

Originally posted by: VIAN
This is not a four-pixel difference. You can see that the IQ is lowered; the image differs from the normal image by too much for it to just be an error. Unfortunately you can't do that with this test; try 2001SE, it's much better.
hmmm.. ok, show me the 2k1SE screenshot. what? you don't have a 6800? oh yea, you're talkin out your arse again.

Originally posted by: VIAN
It means that they are trying to sell something that performs at this framerate when in fact it does so with lowered IQ. Any little thing can do wonders.
what lowered iq? the author of the dh article flat out said the rendered output was fine.. however the rasterized image (i don't play games in "rasterized", so it's hard for me to imagine) showed something was going on "in the background".. well.. wtf? yea, that makes sense.. he goes on to say it's not what he sees that matters, it's how it gets there.. umm.. okay...

Originally posted by: VIAN
Just look for benches of the performance settings in the drivers, where you can choose high performance, performance, quality and high quality. There is a big framerate difference going from high quality down to performance or even high performance, some 10-20 fps, but the IQ difference is sometimes barely noticeable unless you go all the way to high performance.
and what does that have to do with this? the differences they point out are nowhere near the degree to which the driver settings would affect not only image quality but performance. did you even bother to read WHAT the "differences" they were complaining about actually were?

as for more on the article, there are numerous reasons it's nothing more than FUD.. and there are many posts on that already; no need to rehash it here. another reason you should actually inform yourself on a subject before jumping in.. at any rate, i think dave baumann @ b3d had the most objective statement regarding this whole farce:

I think we've been over this old ground enough before.

IMO much of what's displayed doesn't appear to be much beyond differences in how the texturing is handled. Let's wait to see if there is any reply from anyone else.
_________________
'Wavey' Dave
Beyond3D
"The drinks are on the roof"


in other words, let's wait for more info and the full story before passing judgement. i seriously wonder why, with such impressive numbers, nv would want to "cheat" just to get a couple extra fps, and it would take more than the speculation that's been presented so far for me to jump on the "nv is cheating" bandwagon.

hmm.. going back to re-read the article.. has anyone noticed it's been heavily edited from its original form?
 
Originally posted by: VIAN
[...] Just look for benches of the performance settings in the drivers, where you can choose high performance, performance, quality and high quality. There is a big framerate difference going from high quality down to performance or even high performance, some 10-20 fps, but the IQ difference is sometimes barely noticeable unless you go all the way to high performance.

There is a huge difference between each of the four settings; if you can't see it, you're on crack.
 
Originally posted by: Acanthus
There is a huge difference between each of the four settings; if you can't see it, you're on crack.
? what are you talking about? IQ is not the issue here, lol 😉

i like this quote in the DH article:

When you're playing away at 100+ fps, I have to be honest and say that it's not a noticeable change in IQ over the Radeon. You'd be hard pushed to say which image is optimised with the mipmap square. To me though, it's not a matter of what I can see so much as the fact that this changing of textures is happening behind the user's back.

no.. it's not a witch hunt :roll:

i guess that would also mean that, as long as what's happening "behind the user's back" is "correct", the IQ could be crap, but that would be a-ok 😛

the DH article mentions the reference image was from the dx sdk.. which reminds me of something interesting i read on the hocp forums: "the reference software rendering device that the dx sdk provides developers will never exactly match any hardware" and "d3d specs are also not exactly or completely defined in all areas. some areas are still left up to hardware manufacturers' implementation and can vary wildly (AA filters, anisotropic methods, etc)." does anyone familiar with the dx sdk have any thoughts on that?
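For context, the "rast" reference image people keep mentioning comes from that software reference rasterizer (D3DDEVTYPE_REF), not from any video card. A minimal sketch of creating it, assuming the window handle and present parameters are prepared elsewhere (illustrative only, not DriverHeaven's actual setup):

```cpp
#include <d3d9.h>

// Create the DX SDK's software reference rasterizer instead of the HAL
// (hardware) device. Its output is deterministic but very slow, which is
// why it is only used as a reference, never for gaming.
IDirect3DDevice9* CreateReferenceDevice(IDirect3D9* d3d, HWND hWnd,
                                        D3DPRESENT_PARAMETERS* pp)
{
    IDirect3DDevice9* dev = NULL;
    HRESULT hr = d3d->CreateDevice(
        D3DADAPTER_DEFAULT,
        D3DDEVTYPE_REF,                      // software reference, not D3DDEVTYPE_HAL
        hWnd,
        D3DCREATE_SOFTWARE_VERTEXPROCESSING, // REF has no hardware T&L
        pp, &dev);
    return SUCCEEDED(hr) ? dev : NULL;
}
```

Since it is a pure software implementation, rendering the same frame through REF multiple times should produce identical results, which also bears on the "would the results be identical" question further down.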
 
Originally posted by: CaiNaM
? what are you talking about? IQ is not the issue here, lol 😉 [...]

I was just referring to what VIAN said earlier.
 
There is a huge difference between each of the four settings; if you can't see it, you're on crack.
It's more like:

The ATI is at most 25% different.

Nvidia is 50-75% different.
 
I do not fully understand what the rast image is. What is it, and why does it serve as a reference? I ask because if you look at the images from Max Payne, you will notice that both the Nvidia images and the ATI images have a lens flare effect that the rast image does not. If the rast image is truly the 100% correct way to render these frames, then why doesn't it contain the lens flare? Also, if one were to render the same frame with the rast method multiple times, would the results be identical, or is this simply one instance of this frame?

Edit: typo... Does anyone else find the text in the message window to be a bit small and more difficult to read than that of the final post?
 