
FiringSquad does image quality comparisons...again


josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Gstanfor
Keep reading Dweeb.

And I'll happily put my Philips monitor up against your Sony anytime. The Diamondtron tube is superior to the Trinitron and I'll guarantee the electronics in the Philips are way above the crap that Sony loves consumers to pay through the nose for. In short, your monitor may have the "prestige" (at least where clueless consumers are concerned) name, but mine has the prestige components...

:confused: What does this have to do with the differences in image quality between ATI and Nvidia?

You need not post anything more since we already know your assumption concerning Nvidia's shimmering:
Originally posted by: Gstanfor
I do doubt the video card has anything whatsoever to do with it...
Needless to say, this has yet to be proved.

If you are going to post again, could you stay on track, without derailing into your own vendetta about how your monitor supersedes BFG10K's, and tell us how the shimmering effects apparent on Nvidia's hardware under certain driver settings do NOT correspond to a problem with Nvidia's hardware or software? Do not attempt to recirculate your prior falsities blaming game developers, OpenGL extensions, or any other unrelated glitch. I'm curious as to what you believe the explanations for these issues are. What's next? Is the amount of moisture in a room going to leave a fixated trace of water on a person's retina, so that the shimmering becomes a problem with the human eye?
 

CP5670

Diamond Member
Jun 24, 2004
5,668
768
126
I am another CRT user (Mitsu 2070SB, if it matters) who quickly found the shimmering and other AF artifacts on Quality mode annoying when I started using 7 series cards, and still notice them easily on another computer with a 19" LCD. It's distracting enough that it looks worse to me than having no AF at all. The HQ/LOD clamp combination fixes up things nicely though and I find the HQ on Nvidia cards to result in slightly less shimmering than ATI's HQ setting, although the mipmap transitions at some angles are still present.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: CP5670
I am another CRT user (Mitsu 2070SB, if it matters) who quickly found the shimmering and other AF artifacts on Quality mode annoying when I started using 7 series cards, and still notice them easily on another computer with a 19" LCD. It's distracting enough that it looks worse to me than having no AF at all. The HQ/LOD clamp combination fixes up things nicely though and I find the HQ on Nvidia cards to result in slightly less shimmering than ATI's HQ setting, although the mipmap transitions at some angles are still present.

You didn't see the shimmering when you had the same monitor but an older video card right? Didn't shimmering become widespread starting in the NVIDIA 6 series?

Edit: edited my edited post to reflect the included quote to match your quote edit. We two edit too much. ;)
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: xtknight
Didn't shimmering become widespread starting in the NVIDIA 6 series?

Comparing my 6800NU and 7600GT, my 6800 shimmers much less in Q mode (I actually use Q with my 6800, unlike my 7600GT), but has about the same amount of shimmering in HQ (i.e. basically none)

The 6 series doesn't really shimmer much. It really started to be a problem for some people starting with the 7 series.
 

CP5670

Diamond Member
Jun 24, 2004
5,668
768
126
Originally posted by: xtknight
You didn't see the shimmering when you had the same monitor but an older video card right? Didn't shimmering become widespread starting in the NVIDIA 6 series?

Edit: edited my edited post to reflect the included quote to match your quote edit. We two edit too much. ;)

LOL I do tend to edit my posts too much, probably averaging two or three times per post. :D

I actually didn't find the shimmering all that noticeable when I had a 6 series card (on the same monitor). There were a couple of games where I turned on HQ, but otherwise Q looked quite acceptable. The AF problems really became a lot worse on the 7 series models.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Yeah, shimmering wasn't too bad when I owned an eVGA 6800NU @ HQ, but my eVGA 7600GT @ HQ was pretty bad, relatively speaking. Most of the time it didn't bother me, but there were a couple games where it was distracting. On quality mode, it was consistently distracting.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
I can't imagine the shimmering getting any worse than what i see on my 6600GT, so i hope y'all are joking ;)

I nearly fell off my perch when i went from a 9800 pro (which i always used HQ), to a 6600GT (which i have always used with HQ)...

Fiddling with that LOD shizzle helps, but why should i need to do that? Surely Nvidia HQ is supposed to be ~equal to ATI HQ? And why are nvidia Q/P/HP exactly the same in terms of performance?

/dug rant.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
No, my 6800GT doesn't have anywhere near the shimmering my 7800 did. I replaced the 7800 with an X1800XT, and no more shimmering (though in certain areas there is some subtle shimmering, it's not nearly as obvious as it was on the 7800)
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: dug777
I can't imagine the shimmering getting any worse than what i see on my 6600GT, so i hope y'all are joking ;)

The IQ on 6 series cards really isn't bad... I haven't heard anyone besides you say that they were bothered by shimmering on 6 series cards :confused: IQ was pretty equal between the xX00 series and the 6 series -- and the xX00 series should have pretty much equal IQ to the 9xxx series, since they're based off the same core.

see HERE, "As far as AF quality goes, it's hard to notice substantial differences in quality between the 6800 and X800. We've circled two areas where we could notice small differences in quality, and in those locations you can see just a touch more detail with the X800. Remember, this is at 4x magnification ? you would likely not see the difference at normal size, much less when actually moving the camera around while playing the game. This is becoming disappointing. In game after game, we're unable to find substantial differences in anisotropic filtering quality between the Radeon X800 and GeForce 6800. World of Warcraft is no exception ? aside from slight variances is color due to the screens being captured at different times, the shots look almost identical. Take away the 4x magnification and you'd never spot a difference.

It looks like anisotropic filtering is a wash. We're able to empirically prove that the filtering algorithms are different by looking at colored mipmap levels, but at either 8X or 16X levels, the resulting images look nearly identical. We have to scrutinize a still screenshot under 4x magnification to find even the smallest differences"
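Side note on the method the review mentions: "colored mipmap levels" just means replacing each mip level of a texture with a flat, distinct color, so whichever level (or blend of levels) the hardware samples becomes directly visible on screen. A minimal sketch of building such a debug mip chain, assuming Pillow is available; the function name and color choices are mine, not from the article:

```python
from PIL import Image

# One distinct tint per mip level, so the sampled level shows up as a visible color band.
LEVEL_COLORS = [(255, 0, 0), (0, 255, 0), (0, 0, 255),
                (255, 255, 0), (255, 0, 255), (0, 255, 255)]

def colored_mip_chain(base_size=256):
    """Build a mip pyramid where every level is a solid color.

    Uploading these images as a texture's mip levels makes the renderer's
    mip selection and transitions directly visible while playing.
    """
    chain = []
    size, level = base_size, 0
    while size >= 1:
        color = LEVEL_COLORS[min(level, len(LEVEL_COLORS) - 1)]
        chain.append(Image.new("RGB", (size, size), color))
        size //= 2
        level += 1
    return chain

# Write the levels out so they can be packed into a test texture.
for lvl, img in enumerate(colored_mip_chain()):
    img.save(f"mip_level_{lvl}.png")
```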
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: schneiderguy
Originally posted by: dug777
I can't imagine the shimmering getting any worse than what i see on my 6600GT, so i hope y'all are joking ;)

The IQ on 6 series cards really isn't bad... I haven't heard anyone besides you say that they were bothered by shimmering on 6 series cards :confused: IQ was pretty equal between the xX00 series and the 6 series -- and the xX00 series should have pretty much equal IQ to the 9xxx series, since they're based off the same core.

see HERE, "As far as AF quality goes, it's hard to notice substantial differences in quality between the 6800 and X800. We've circled two areas where we could notice small differences in quality, and in those locations you can see just a touch more detail with the X800. Remember, this is at 4x magnification ? you would likely not see the difference at normal size, much less when actually moving the camera around while playing the game. This is becoming disappointing. In game after game, we're unable to find substantial differences in anisotropic filtering quality between the Radeon X800 and GeForce 6800. World of Warcraft is no exception ? aside from slight variances is color due to the screens being captured at different times, the shots look almost identical. Take away the 4x magnification and you'd never spot a difference.

It looks like anisotropic filtering is a wash. We're able to empirically prove that the filtering algorithms are different by looking at colored mipmap levels, but at either 8X or 16X levels, the resulting images look nearly identical. We have to scrutinize a still screenshot under 4x magnification to find even the smallest differences"

I agree. I had both an X800 Pro and a 6800GT when they were first released, and I chose the 6800GT over the X800 at the time because the IQ was pretty much equal (they were both kind of crap for texture filtering, though I had fewer problems with mipmap transitions on the GT than on the X800); however, the GT was a much better performer - at a lower price.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
For the umpteenth time in this thread: once you disable the optimizations in Quality mode and enable the LOD bias clamp, shimmering will be minimal on G7x. The same applies to nv4x - in fact, the LOD bias clamp was introduced into nvidia's drivers because of nv40's early shimmering problems.

As far as filtering quality in general goes, we have sadly been on a downhill slide in that department ever since GF4 - everything from that point on has reduced IQ in some way or another. In ATi's case the rot started with R200 and they have started to address the issue with their new AF mode in r5xx (but not before they triggered the IQ war that led to the situation we see today).

We can only hope that r600 and G80/nv50 start to put some IQ back in again.
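For what it's worth, here is a back-of-the-envelope sketch of the mip-selection math the LOD clamp acts on (only an illustration of the concept, not how the driver implements it; the function name and the numbers are mine): a negative LOD bias pushes sampling toward a sharper, under-filtered mip level, which is exactly what shows up as shimmering once the camera moves, and clamping the bias at zero keeps the selection at or above the properly filtered level.

```python
import math

def mip_level(texel_footprint, lod_bias, clamp_negative_bias=False):
    """Illustrative mip-level selection (simplified; real GPUs do this per pixel).

    texel_footprint: roughly how many texels the pixel covers along its longest axis.
    lod_bias: application/driver LOD bias; negative values sharpen but cause aliasing.
    clamp_negative_bias: emulates the driver's LOD clamp (bias floored at 0).
    """
    if clamp_negative_bias:
        lod_bias = max(0.0, lod_bias)
    # Standard log2(footprint) -> level mapping; level 0 is the base texture.
    return max(0.0, math.log2(max(texel_footprint, 1e-6)) + lod_bias)

# A distant surface covering ~8 texels per pixel, rendered with a -1 LOD bias:
print(mip_level(8.0, -1.0))                            # 2.0 -> under-filtered, shimmers in motion
print(mip_level(8.0, -1.0, clamp_negative_bias=True))  # 3.0 -> properly filtered level
```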
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Gstanfor, what IQ did ATI take out that you would like to see them add in the r600?
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
In ATi's case the rot started with R200 and they have started to address the issue with their new AF mode in r5xx (but not before they triggered the IQ war that led to the situation we see today).
Nvidia's inferior AF is ATI's fault? :roll:

Are you getting dizzy yet?
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Gstanfor
In ATi's case the rot started with R200 and they have started to address the issue with their new AF mode in r5xx (but not before they triggered the IQ war that led to the situation we see today).

doh! too bad nvidia had neither the engineering skill nor initiative to do something other than follow ati down that same path... but wait, since ati has done something about it with r5xx, maybe nvidia can once again "follow the leader" and take the same direction of ati by improving texture filtering w/ g80...
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
I don't care who is following whom, just as long as Nvidia fixes the crap they call the 7 series. I really hope G80 is up to the task of fixing the IQ, not to mention bringing HDR+AA to the masses, and maybe they could add a new feature that ATI doesn't have.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: redbox
I don't care who is following whom, just as long as Nvidia fixes the crap they call the 7 series. I really hope G80 is up to the task of fixing the IQ, not to mention bringing HDR+AA to the masses, and maybe they could add a new feature that ATI doesn't have.

That will happen for sure.
Not to mention NV will have new AA algorithms as well.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: CaiNaM
Originally posted by: Gstanfor
In ATi's case the rot started with R200 and they have started to address the issue with their new AF mode in r5xx (but not before they triggered the IQ war that led to the situation we see today).

doh! too bad nvidia had neither the engineering skill nor initiative to do something other than follow ati down that same path... but wait, since ati has done something about it with r5xx, maybe nvidia can once again "follow the leader" and take the same direction of ati by improving texture filtering w/ g80...

nvidia will take the initiative once again with G80, you can rest assured of that.

The reason it didn't happen for G7x is because G7x is a refresh product (just like r4xx was a refresh product and ATi users had to wait for R5xx for the improved anisotropic filtering).
 

redbox

Golden Member
Nov 12, 2005
1,021
0
0
Originally posted by: Gstanfor
Originally posted by: CaiNaM
Originally posted by: Gstanfor
In ATi's case the rot started with R200 and they have started to address the issue with their new AF mode in r5xx (but not before they triggered the IQ war that led to the situation we see today).

doh! too bad nvidia had neither the engineering skill nor initiative to do something other than follow ati down that same path... but wait, since ati has done something about it with r5xx, maybe nvidia can once again "follow the leader" and take the same direction of ati by improving texture filtering w/ g80...

nvidia will take the initiative once again with G80, you can rest assured of that.

The reason it didn't happen for G7x is because G7x is a refresh product (just like r4xx was a refresh product and ATi users had to wait for R5xx for the improved anisotropic filtering).

So which brand is going to have a unified shader architecture?

 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: schneiderguy
Originally posted by: dug777
I can't imagine the shimmering getting any worse than what i see on my 6600GT, so i hope y'all are joking ;)

I nearly fell off my perch when i went from a 9800 pro (which i always used HQ), to a 6600GT (which i have always used with HQ)...

Fiddling with that LOD shizzle helps, but why should i need to do that? Surely Nvidia HQ is supposed to be ~equal to ATI HQ? And why are nvidia Q/P/HP exactly the same in terms of performance?

/dug rant.

The IQ on 6 series cards really isn't bad... I haven't heard anyone besides you say that they were bothered by shimmering on 6 series cards :confused:

O rly?

I've heard many people agree with me that they found the shimmering extremely noticeable with 6 series cards, and remember i went from a 9800 pro to a 6600GT, so i never saw an xX00 series card in action to compare it with...

Anyone who claims shimmering is not an issue on 6 series cards with the default 'quality' driver setting, without enabling HQ and that LOD clamp setting, needs to either get their eyes checked out, or stop shamelessly lying :roll:

I'm just giving my personal experience here mate, FWIW...

Oh & that crap you posted notioceably doesn't look at moving images, which is essentially the only placve where you can see shimmerinhg. So yeah, if you bought your card to look at screenshots, maybe the IQ was similar...but i bought mine to play games ;)
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
From an API (DX10) point of view all shaders are unified.

Whether the hardware shader units are unified or not is irrelevant and is merely an implementation detail.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
nvidia will take the initiative once again with G80, you can rest assured of that.

The reason it didn't happen for G7x is because G7x is a refresh product (just like r4xx was a refresh product and ATi users had to wait for R5xx for the improved anisotropic filtering).
(just like Nvidia users had to wait for G7x for the improved aniso............oh, wait...it wasn't much of an improvement......)

From what I hear they're just going to take ATI's initiative. HDR+AA is probably more likely since ATI decided to implement it the way video game developers would like it to be. And if they didn't incorporate angle independent AF, that would be yet another Nvidia card with inferior AF.

It seems that you think all of the 7 series GPUs and the X1k series are nothing but refreshes. This is false. Adding features and technology, and designing different GPUs that incorporate different architectures, is more than just a refresh.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Feeling confused, are we, Josh?

G7x is a refresh of nv40. R5xx is a new design and I never claimed it was a refresh of r3/4xx.

As for HDR+AA & revised AF, we've know since well before g70's lauch (and long, long before r5xx's launch) that G80 would have these features (among many others)...
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: dug777

Oh & that crap you posted notioceably doesn't look at moving images, which is essentially the only placve where you can see shimmerinhg. So yeah, if you bought your card to look at screenshots, maybe the IQ was similar...but i bought mine to play games ;)

Do you have any better "crap" that compares the IQ of R4xx and NV4x? A couple of other posters agreed with me that the NV40 and R4xx IQ is comparable, too.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: josh6079

It seems that you think all of the 7 series GPU's and X1k series are nothing but refreshes. This is false. Adding features, technology and designing different GPU's that incorporate different architectures are more than just refreshes.

G70 is heavily based off the NV40 architecture, so it could be considered a refresh :) All it adds is worse AF, gamma-corrected AA, and transparency AA.