I would like to clear up ATI/Nvidia IQ stuff

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
http://www.digit-life.com/articles2/gffx/nv40-rx800-4-p1.html

Just one example. It compares a 6800 Ultra and an X800 (some variant). Obviously, the IQ difference is minimal. Sometimes ATI pulls ahead, sometimes Nvidia does. Either way, it's always a slight difference; the two companies just use different methods to produce their IQ.

Just thought I'd post this because there are so many "ATI PWNZORZ NVIDIA AT IQ!!!!!1!1@2" posts.
One quote I read said, "At the lowest settings (optimizations on), ATI pulls ahead. Without them, Nvidia pulls ahead."

http://www.pcstats.com/articleview.cfm?articleid=1667&page=3

Screenshot comparisons (very convenient), and they say the IQ gap between the X800 and 6800 is very small.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
07.05.2004? I'd think both companies have improved since then.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Seems like a good review; however, they do seem to bash ATI a lot over a minor problem in its drivers.

They both look good to me, with the nod going to the non-optimized 6800, then the 9800 overall, then the X800 and the 6800 with optimizations on.

-Kevin
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: ronnn
07.05.2004? I'd think both companies have improved since then.

D'oh... I missed that, lol.

Well, it would have been a good review a couple of years ago.

-Kevin
 

Pabster

Lifer
Apr 15, 2001
16,986
1
0
This issue really won't come up again unless (until?) the R520 comes out; if it falls behind the 7800GTX, the ATi fanboys will start up with the "but it has better IQ!!!" arguments.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Originally posted by: Gamingphreek
Originally posted by: ronnn
07.05.2004? I'd think both companies have improved since then.

D'oh... I missed that, lol.

Well, it would have been a good review a couple of years ago.

-Kevin

Yes, they're old, but it's not like the X800 or 6800 cards are outdated, so it's still pretty reliable... unless drivers really change things.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
This post is great and all, but the current topic of debate is the 7800's IQ.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: hans030390
Originally posted by: Gamingphreek
Originally posted by: ronnn
07.05.2004? I'd think both companies have improved since then.

D'oh... I missed that, lol.

Well, it would have been a good review a couple of years ago.

-Kevin

Yes, they're old, but it's not like the X800 or 6800 cards are outdated, so it's still pretty reliable... unless drivers really change things.

The drivers were the whole reason this "debacle" came about.

-Kevin
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: Gamingphreek
Originally posted by: hans030390
Originally posted by: Gamingphreek
Originally posted by: ronnn
07.05.2004? I'd think both companies have improved since then.

D'oh... I missed that, lol.

Well, it would have been a good review a couple of years ago.

-Kevin

Yes, they're old, but it's not like the X800 or 6800 cards are outdated, so it's still pretty reliable... unless drivers really change things.

The drivers were the whole reason this "debacle" came about.

-Kevin

Didn't VIAN also have a thread flaming NVIDIA's IQ? Or was it the other way around? Or someone else? I know there was some big thread about something like this.
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
The reason people criticize the IQ of Nvidia 6800/7800 cards is mainly the shimmering texture effect. The articles you mentioned don't discuss it, and shimmering isn't something you can see in screenshots anyway. But lots of rational people will tell you it's a real problem. Many others will say it's so subtle that you won't notice it in-game unless you're really looking for it.

My response is that you may not be specifically aware of the shimmering, but it still contributes to the overall IQ you perceive. People who are used to gaming on an Nvidia card just get used to the shimmering and it becomes normal for them, but if you switched them over to an X800 they would notice better perceived image quality, since there would no longer be any shimmering.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
We have already determined that undersampling isn't the culprit. From what I understand, with the latest drivers it is just a problem with the MIP map bands. It is not enough to warrant ANYONE switching cards, nor is it enough to sway anyone's decision on cards. It is merely a slight nuisance, if you can actually see it.

-Kevin
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: hans030390
Originally posted by: TheSnowman
Originally posted by: hans030390
Just thought I'd post this because there are so many "ATI PWNZORZ NVIDIA AT IQ!!!!!1!1@2" posts.

Who is saying that?

http://forums.anandtech.com/messageview...atid=31&threadid=1679086&enterthread=y

One example; others are buried in threads seven pages long.

Heh, a few nutjobs without even a few hundred posts trying to start flame wars isn't really much reason to get up in arms. Having used both my GeForce and Radeon extensively, I have to give the nod for image quality to ATI, but it is a slight difference by all reasonable accounts, and any claim of ATI owning nVidia in image quality is just absurd.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Heh, a few nutjobs without even a few hundred posts trying to start flame wars isn't really much reason to get up in arms. Having used both my GeForce and Radeon extensively, I have to give the nod for image quality to ATI, but it is a slight difference by all reasonable accounts, and any claim of ATI owning nVidia in image quality is just absurd.

QFT! I agree that ATI should get the nod for IQ (though when TrAA is enabled on the 7 series, you could give Nvidia the nod). Excellent post, Snowman.

-Kevin
 

Snakexor

Golden Member
Feb 23, 2005
1,316
16
81
I wasn't trying to start a flame war; I was just asking questions and looking for answers. It's people like you who start the flames and turn it into something it is not...

hans, get over your worthless SM3.0 6600GT and buy a real video card...
 

Pabster

Lifer
Apr 15, 2001
16,986
1
0
Originally posted by: Snakexor
hans, get over your worthless SM3.0 6600GT and buy a real video card...

Ouch. When the flames fail, I guess personal attacks are the modus operandi?
 

VIAN

Diamond Member
Aug 22, 2003
6,575
1
0
I think these websites give really bad examples, especially PCStats. Those pics aren't valid for showing the artifacts. Digit-Life does a decent job, though, and they actually tell you something, even if the pics don't look that different.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
Originally posted by: Snakexor
I wasn't trying to start a flame war; I was just asking questions and looking for answers. It's people like you who start the flames and turn it into something it is not...

hans, get over your worthless SM3.0 6600GT and buy a real video card...

Yes, it's too bad I can play any game out right now very well at high settings (RAM is the main bottleneck at the moment).

Look, don't start anything about SM3.0, because with your 200 posts it doesn't look like you'd put up a decent fight.

And how about this: I'm 15. How about you give me the money to buy a new card, eh? Idiot.

Anybody who claims Nvidia has inferior IQ to ATI (which, I'll admit, is very slightly true in most cases) is just looking for a flame war. Besides, if it has ANYTHING to do with shimmering, drivers will fix that, and FPS isn't reduced. It's been tested.

So is it really me who starts flame wars? Sometimes, mostly over Shader Model 3 stuff :D, but it's not a good thing when someone makes a post like yours.
 

Snakexor

Golden Member
Feb 23, 2005
1,316
16
81
I love it when 15-year-olds exaggerate the truth... any game on HIGH SETTINGS?? I hope your 12-inch monitor at 800x600 or 640x480 looks amazing with 16xAF and 8xAA.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: Snakexor
I love it when 15-year-olds exaggerate the truth... any game on HIGH SETTINGS?? I hope your 12-inch monitor at 800x600 or 640x480 looks amazing with 16xAF and 8xAA.

I have a 6800NU, and I can play BF2 at 1280x1024 on all high settings with 2xAA and 16xAF, smoothly. A 6600GT is maybe 2% behind, if that. Take your trolling elsewhere.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: xtknight
Originally posted by: Snakexor
I love it when 15-year-olds exaggerate the truth... any game on HIGH SETTINGS?? I hope your 12-inch monitor at 800x600 or 640x480 looks amazing with 16xAF and 8xAA.

I have a 6800NU, and I can play BF2 at 1280x1024 on all high settings with 2xAA and 16xAF, smoothly. A 6600GT is maybe 2% behind, if that. Take your trolling elsewhere.

That 6800NU has 12 pipes and a 256-bit memory bus; it will pull way ahead of the 6600GT when you turn the details up.

-Kevin
 

Blastman

Golden Member
Oct 21, 1999
1,758
0
76
Originally posted by: Gamingphreek
We have already determined that undersampling isn't the culprit.
Actually, the latest article by Hexus.net concludes it is caused by undersampling the proper MIPs.

Basically, NV has lower performance with AF (compared to the R420's) because they tied the texture samplers to the shader units, while the R420 has a separate texture address unit. When doing AF (with trilinear/bilinear filtering) you need to do a lot of sampling:

"Texture samplers in NV40 are bound to their containing fragment units, so you can't assign any idle samplers to the filtering task being performed by one of the four fragment quads. Idle quad that's not texturing? Idle texture samplers, sadly. With a usual 16x anisotropic filter taking 256 texel samples per pixel, you can see how, without a separate dedicated array of units that can texture, finding ways to limit the performance of a tied-to-shader-hardware configuration is prudent.

"They've apparently moved the sampling ratio to favour texel samples from MIPs further down the chain, undersampling the closest MIP, which results in oversampling the next MIP in the chain..."

Shifting the MIP:nextMIP texel sampling ratio is therefore causing the shimmering, as a product of sampling the most relevant texels less. The image appears to shimmer because the hardware gets the texel sampling mix wrong, caused by the driver telling it to sample in the wrong place.

By sampling the farther MIPs (smaller textures) more, I'm guessing it's faster (less load on the texture address unit), but it's also not as accurate, so you get IQ degradation in the form of shimmering. But it's a way to unload the shader tied to the texture address unit, to get more speed.
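To picture what that ratio shift does, here's a minimal sketch of trilinear MIP blending in Python. This is my own toy illustration of the mechanism the article describes, not the actual driver or hardware logic; the function name and the bias value are made up for the example:

```python
# Toy model of trilinear MIP blending, to illustrate the quoted mechanism.

def trilinear_weights(lod, bias=0.0):
    """Return (near_mip, far_mip, near_weight, far_weight) for a given LOD.

    lod  -- the 'true' level of detail from texture-space derivatives
    bias -- hypothetical driver-side shift toward the farther (smaller) MIP
    """
    shifted = lod + bias        # pushing the LOD up favors smaller MIPs
    near_mip = int(shifted)     # the closest (most detailed) MIP in the blend
    far_mip = near_mip + 1      # the next MIP down the chain
    frac = shifted - near_mip   # blend fraction between the two levels
    return near_mip, far_mip, 1.0 - frac, frac

# No bias: a LOD of 1.3 blends MIP 1 at ~70% and MIP 2 at ~30%.
print(trilinear_weights(1.3))       # -> (1, 2, ~0.7, ~0.3)

# A hypothetical +0.4 bias shifts weight onto the farther MIP, so the most
# relevant texels (MIP 1) are undersampled -- the mix the article blames
# for the shimmering.
print(trilinear_weights(1.3, 0.4))  # -> (1, 2, ~0.3, ~0.7)
```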

From that one benchmark, he says the difference between the two drivers is less than 5% at HQ. But what we really need to know is the difference between 77.77-Q (where the current benches were run) and 78.03-HQ, where the shimmering is almost completely gone (which looks like about 7-9% when I put a ruler to the graph).

Also, the performance hit is going to depend on the game, and even on the level being rendered. Hexus had originally posted two HL2 benches, and the second one took a larger hit going to HQ. He thinks he might have messed something up, so he pulled the second bench. Rys says he's currently looking at some other benches and may post them.