Originally posted by: Schadenfroh
I said that when you have to apply special colouring effects and look under a microscope to tell a difference, you're just being a fanATIc. Above all, nvidia DOES NOT NEED to cheat with the NV40; it's got enough power.
Originally posted by: CaiNaM
unfortunately you have no clue what you are talking about, and your "followup" post only reinforces that fact.
did you even read either article? the first was FUD, and the second was nothing more than reporting (referencing) the first... it has no content whatsoever. so what, exactly, is the point in linking to the inq article?
Originally posted by: TimisoaraKill
Some more stuff m8 to reinforce your Nvidia trust:
Originally posted by: CaiNaM
unfortunately you have no clue what you are talking about, and your "followup" post only reinforces that fact.
http://www.theinquirer.net/?article=15502
-----
heh.. thanks 🙂
Originally posted by: quikah
fixed Cainam's Link
Originally posted by: CaiNaM
did you even read either article? the first was FUD, and the second was nothing more than reporting (referencing) the first... it has no content whatsoever. so what, exactly, is the point in linking to the inq article?
Futuremark deemed the 60.72 drivers compliant, and this dude at Driver Heaven is saying they're wrong?
Don't place stereotypes.
I was going to say that you should know better than to post something like this in this forum, especially with the word "cheat" in the title. But then I looked at the RIG in your sig. It all makes sense now.
Nvidia's IQ, especially when not moving, looks similar if not better with certain options enabled than the 9800 - 6800 only. Even with cheats; but cheating isn't the right way to go. This isn't about IQ looking crappy, it is about trying to sell something that doesn't do what it's supposed to.
MOST of the replies chose the top picture as the better IQ, and one even downright said the top picture was of an ATI Radeon at 6xAA!!! But the top picture is actually the GF 6800 U.
This may be correct, but this cheating gives them a good first impression to those who don't know they are cheating, as well as to people just using these benchmarks for comparison with ATI's new card instead of benching again with newer drivers. Some sites are that lazy.
The 6800 is too new of a product to start ripping on hacks; even me, an ATI fanatic, realises this. Once the card is released for sale to the public, and several drivers have been available for it, then you can start pointing the IQ finger, but not till then, IMHO.
That is a different story. I use an Nforce because they kick ass. Although, I could always go for a KT600. Graphics is another market.
What I really hate about ati fans is that they will find any excuse to slag off nvidia products. However, when you ask them what chipset their motherboard is using, they usually go very quiet, because they are using an nforce2. Isn't that a conflict of interest?
It's always gonna be a little different, but Nvidia is using this totally different technique.
Both images differ from the reference image. What defines "cheating"?
UT2k4 with its big environments. I am about to post an anomaly with 16x AF in a bit.
I think UT/UT2k4 would be the best tests for mipmap detail, as you can clearly see when there are issues, especially on maps with grate-type textures.
Some things seem a bit exaggerated, but I think he's got an idea.
Well, let me tell you a thing or two. i owned a TI4600 when i did my last videocard investment. i first opted for the 5900NU like you, and was kinda firm in my purchase since i had never owned any ATI card before. then i went to some websites to inform myself and found out that all Nvidia drivers are reduced to bilinear filtering instead of trilinear, that IQ is reduced by default for more FPS, that Nvidia 4x AA looks like ATI 2x AA, etc... after having knowledge about all this stuff i went to ATI.
If i buy a new car with 180 HP advertised by the manufacturer, this is what i expect to have; not a car that normally has 150 HP, but will do 180 HP if you take out the seats, the hood and the bumpers to make it lighter.
The point is that people should know about this; not everybody likes to be scammed... now, you might not be interested, but some people like me are.
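As an aside, the bilinear-vs-trilinear distinction being complained about here is easy to sketch in code. This is purely illustrative (a toy 1-D greyscale "texture", not anything resembling actual driver or hardware code): bilinear-style sampling reads from a single mip level, while trilinear blends the two nearest levels, which is why forcing bilinear produces visible seams where the mip level changes.

```python
# Illustrative only: a 1-D greyscale "texture" and a toy mip chain.
# Real GPUs filter 2-D texels; this just shows why snapping to one
# mip level (forced bilinear) differs from blending two (trilinear).

def build_mips(texture):
    """Each mip level averages adjacent texel pairs of the level above.
    Assumes the texture length is a power of two."""
    mips = [texture]
    while len(mips[-1]) > 1:
        prev = mips[-1]
        mips.append([(prev[i] + prev[i + 1]) / 2 for i in range(0, len(prev), 2)])
    return mips

def point_sample(level, u):
    """Nearest-texel lookup at normalized coordinate u in [0, 1)."""
    return level[min(int(u * len(level)), len(level) - 1)]

def sample_snap(mips, u, lod):
    """Bilinear-style: use only the single nearest mip level."""
    return point_sample(mips[round(lod)], u)

def sample_trilinear(mips, u, lod):
    """Trilinear: sample the two surrounding mip levels and blend."""
    lo = int(lod)
    hi = min(lo + 1, len(mips) - 1)
    frac = lod - lo
    return point_sample(mips[lo], u) * (1 - frac) + point_sample(mips[hi], u) * frac

tex = [0.0, 1.0, 0.0, 1.0]   # toy high-contrast texture
mips = build_mips(tex)       # [[0.0, 1.0, 0.0, 1.0], [0.5, 0.5], [0.5]]
print(sample_snap(mips, 0.3, 0.5))       # reads one mip level only
print(sample_trilinear(mips, 0.3, 0.5))  # blends levels 0 and 1
```

As the lod value varies, the trilinear result changes smoothly while the snapped result jumps at level boundaries; that jump is the mipmap banding people say is easy to spot in UT2k4.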
That would have been the nicer thing to do, instead of causing all this controversy, but he does believe that it isn't a bug.
what driverheaven did is create a massive prejudice against the 6800 when they don't even know what is going on yet. the responsible thing to do would have been to contact FM, NV & ATI (their image also differs from the reference image, after all) FIRST before publishing that crap.
If the cards render the way they are supposed to, it should look like the reference design.
Maybe Nvidia's looks BETTER than ATI's AND the reference design???
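For anyone who wants to argue "4 pixel difference" with numbers instead of squinting at screenshots, a per-pixel diff against the reference render is trivial to script. A rough sketch, assuming the captures are loaded as HxWx3 RGB arrays (the toy arrays below just stand in for real screenshots):

```python
import numpy as np

def diff_vs_reference(image, reference, tolerance=0):
    """Return (number of pixels that differ, worst per-channel delta).

    image, reference: HxWx3 uint8 RGB arrays of equal size.
    tolerance: largest per-channel delta still counted as a match.
    """
    delta = np.abs(image.astype(int) - reference.astype(int))
    mismatch = (delta > tolerance).any(axis=2)  # True where any channel is off
    return int(mismatch.sum()), int(delta.max())

# Toy 4x4 stand-ins for real screenshots.
reference = np.zeros((4, 4, 3), dtype=np.uint8)
card_output = reference.copy()
card_output[0, 0] = [10, 0, 0]    # barely off: +10 in red
card_output[3, 3] = [0, 0, 255]   # grossly off: full blue

print(diff_vs_reference(card_output, reference))      # (2, 255)
print(diff_vs_reference(card_output, reference, 10))  # (1, 255)
```

Since no hardware is expected to match a software reference exactly, a nonzero tolerance is the honest way to run such a comparison; a zero-tolerance diff will flag every card.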
Originally posted by: TimisoaraKill
Originally posted by: CaiNaM
did you even read either article? the first was FUD, and the second was nothing more than reporting (referencing) the first... it has not content whatsoever. so what, exactly is the point in linking to the inq article?
"We said that what we saw when we were in Geneva with Nvidia was a clear difference between the code that was running the PS 3.0 path and a referred system that was running PS 2.0 code.
ATI says that CryTek's representative told it that what Nvidia showed us in Geneva was a 2.0/3.0 path versus a 1.1 path."
This isn't about IQ looking crappy, it is about trying to sell something that doesn't do what it's supposed to.
This is not a four pixel difference. You can see that the IQ is lowered, which differs from the normal image by too much to just be an error. Unfortunately you can't do that with this test. Try 2001SE. It's much better.
I really think something needs clearing up here. CHEATING is not having a 4 pixel difference and distortion in the outer edges of the screen. Futuremark is not crap either. If you want to stress your CPU at the same time then run it at a lower resolution so it becomes CPU bound, duh! Secondly, this whole thread started out biased towards ATI. Seriously, does anyone here care about a 4 pixel difference? come on. I'll bet you someone could find something wrong with ATI's picture too if they wanted to.
quote:
This isn't about IQ looking crappy, it is about trying to sell something that doesn't do what it's supposed to.
What the hell is that supposed to mean, VIAN? All nvidia and ATI cards do what they're supposed to do, but there are certain things that go wrong.
I think people shouldn't be so quick to jump on and insult Nvidia in particular, or ATI either. Both companies, if you step back and look at it, are doing very well.
Cheating...give me a break
-Kevin
lol.. back what up? there's nothing to back up.. there was nothing intelligent in your post for me to 'disprove'. but hey, if ya really wanna.. jeez vian, are you just a glutton for punishment? hmm.. where to start.
Originally posted by: VIAN
... and no more idotic than someone who doesn't back it up.
hmmm.. ok, show me the 2k1SE screenshot. what? you don't have a 6800? oh yea, you're talkin out your arse again.
Originally posted by: VIAN
This is not a four pixel difference. you can see that the IQ is lowered, which is different than the normal image by too much to just be an error. Unfortunately you can't do that with this test. Try 2001SE. It's much better.
what lowered iq? the author of the dh article flat out said the rendered output was fine.. however the rasterized image (i don't play games in "rasterized", so it's hard for me to imagine) showed something was going on "in the background".. well.. wtf? yea, that makes sense.. he goes on to say it's not what he sees that matters, it's how it gets there.. umm.. okay...
Originally posted by: VIAN
It means that they are trying to sell something that does this at this framerate when in fact it does it with lowered IQ. Any little thing can do wonders.
and that has what to do with this? the differences they point out are nowhere near the degree the driver settings would affect not only image quality, but performance. did you even bother to read WHAT the "differences" were they were complaining about?
Originally posted by: VIAN
Just look for benches on the performance settings in the drivers, where you can choose high performance, performance, quality and high quality. There is a big framerate difference lowering from high quality to performance or even to high performance. Some 10-20fps. But the IQ difference is barely noticeable sometimes - unless you go all the way to high performance.
Originally posted by: VIAN
I really think something needs clearing up here. CHEATING is not having a 4 pixel difference and distortion in the outer edges of the screen. Futuremark is not crap either. If you want to stress your CPU at the same time then run it at a lower resolution so it becomes CPU bound, duh! Secondly, this whole thread started out biased towards ATI. Seriously, does anyone here care about a 4 pixel difference? come on. I'll bet you someone could find something wrong with ATI's picture too if they wanted to.
quote:
This isn't about IQ looking crappy, it is about trying to sell something that doesn't do what it's supposed to.
What the hell is that supposed to mean, VIAN? All nvidia and ATI cards do what they're supposed to do, but there are certain things that go wrong.
I think people shouldn't be so quick to jump on and insult Nvidia in particular, or ATI either. Both companies, if you step back and look at it, are doing very well.
Cheating...give me a break
-Kevin
This is not a four pixel difference. You can see that the IQ is lowered, which differs from the normal image by too much to just be an error. Unfortunately you can't do that with this test. Try 2001SE. It's much better.
It means that they are trying to sell something that does this at this framerate when in fact it does it with lowered IQ. Any little thing can do wonders. Just look for benches on the performance settings in the drivers, where you can choose high performance, performance, quality and high quality. There is a big framerate difference lowering from high quality to performance or even to high performance. Some 10-20fps. But the IQ difference is barely noticeable sometimes - unless you go all the way to high performance.
? what are you talking about? IQ is not the issue here, lol 😉
Originally posted by: Acanthus
There is a huge difference between each of the four settings; if you can't see it, you're on crack.
Originally posted by: CaiNaM
? what are you talking about? IQ is not the issue here, lol 😉
Originally posted by: Acanthus
There is a huge difference between each of the four settings; if you can't see it, you're on crack.
i like this quote in the DH article:
When you're playing away at 100+ fps, I have to be honest and say that it's not a noticeable change in IQ over the Radeon. You'd be hard pushed to say which image is optimised with the mipmap square. To me though, it's not a matter of what I can see so much as the fact that this changing of textures is happening behind the user's back.
no.. it's not a witch hunt :roll:
i guess that would also mean that, as long as what's happening "behind the user's back" is "correct", the IQ could be crap, but that would be a-ok 😛
the DH article mentions the reference image was from the dx sdk.. which reminds me of something interesting i read on the hocp forums, "the reference software rendering device that the dx sdk provides developers will never exactly match any hardware" and, " d3d specs are also not exactly or completely defined in all areas. some areas are still left up to hardware manufacturers implementation and can vary wildly (AA filters, anisotropic methods, etc)." does anyone familiar with the dx sdk have any thoughts on that?
It's more like:
There is a huge difference between each of the four settings; if you can't see it, you're on crack.
the "wink" was for sarcasm 🙂
Originally posted by: Acanthus