FiringSquad does image quality comparisons...again


josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Cookie Monster
Originally posted by: BFG10K
Because Q nVidia AF most certainly does look different to standard ATi AF.

In what ways?

Put your driver settings on Q and tell me if the shimmering looks good.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
In what ways?
This is a joke, right? How many times has the shimmering issue been discussed already?

nVidia's Q AF shimmers like crazy and on top of this textures often wiggle as if they were flags blowing in the wind.

The default image quality is atrocious and unusable.

Here is a comparison of nVidia's AF differences. The top one is the default setting (i.e. Q with anisotropic & trilinear optimizations on) while the bottom one is HQ (all optimizations off, among other things).

Notice how poorly the mip-map transitions are filtered in the top shot and the lack of blending between them.

ATi's default quality is roughly equal to nVidia's high quality. Also, nVidia typically takes a 10%-20% performance hit to enable HQ, hence the skewed benchmarks we see when reviewers use Q.
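For what it's worth, the mip-transition complaint above comes down to trilinear vs. plain bilinear mip selection. A minimal sketch of the difference (illustrative Python only, not any vendor's driver code; the one-brightness-value-per-mip "texture" is a made-up toy):

```python
# Why disabling the trilinear optimization smooths mip-map transitions.
# Each "mip" here is reduced to a single average-brightness value.

def bilinear_mip(level_of_detail, mips):
    """'Optimized' path: snap to the nearest mip level.
    Produces a hard seam on screen where the chosen level changes."""
    return mips[round(level_of_detail)]

def trilinear_mip(level_of_detail, mips):
    """Full trilinear: linearly blend the two nearest mip levels,
    so the transition is gradual instead of a visible band."""
    lo = int(level_of_detail)
    hi = min(lo + 1, len(mips) - 1)
    frac = level_of_detail - lo
    return (1.0 - frac) * mips[lo] + frac * mips[hi]

# Toy mip chain: brightness halves at each level.
mips = [1.0, 0.5, 0.25]
print(bilinear_mip(1.4, mips))   # snaps to level 1
print(trilinear_mip(1.4, mips))  # blends levels 1 and 2
```

The hard snap is cheaper (one mip fetch instead of two), which is exactly the trade the Q-mode optimizations make.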
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Ensure the LOD bias clamp is enabled and all 3 optimizations are disabled under Quality and you'll find the shimmering is pretty much nonexistent.

Shimmering is a divine punishment from above for those too lazy or too arrogant to go in and set things correctly.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Wreckage
Originally posted by: Ackmed
Nice to see some videos, that clearly show the shimmering problem NV can have.
Of course ignoring the many people who have complained about shimmering on ATI cards. :roll:

NV shimmers much worse than ATi; that is a fact. I'm sorry you don't like it.


I've said it for over a year and got called a liar, a FUD spreader, and everything else you can think of.
Hey, if the shoe fits.....

Except it doesn't. People like you are the ones who claimed it, when all along I was right. You and your pals said it doesn't exist and isn't a problem, because real hardware sites never mention it. Now they do.

Now multiple review sites are finally acknowledging the problem. It's easy to see it in the videos, and it's much easier to see (I couldn't ignore it) on a 24" LCD right in front of you. As I've said many times.
Bla, bla, bla, we know your song; too bad it's on a broken record.

A broken record that happens to be right. I said well before HardOCP brought it to light that shimmering is much more noticeable on a large LCD. They have backed me on it now. Shimmering is a real issue.

Originally posted by: josh6079
Originally posted by: Ackmed
And yeah, they did gloss over shimmering. At least it's mentioned for a change though.

Now you're saying:

Nice to see some videos, that clearly show the shimmering problem NV can have.

Where are these videos? Nowhere did the OP's link, nor anyone else's post, provide them. I'm not saying you're wrong, but where did that come from?

Yeah, I'm changing what I said. Why? Because they edited the article, took out the pictures, and added videos, two days after the article was out and after my post that said they glossed over shimmering.

http://www.firingsquad.com/hardware/ati...ge_quality_showdown_august06/page5.asp

Originally posted by: Gstanfor
Ensure the LOD bias clamp is enabled and all 3 optimizations are disabled under Quality and you'll find the shimmering is pretty much nonexistent.

Shimmering is a divine punishment from above for those too lazy or too arrogant to go in and set things correctly.

This does not make it "pretty much nonexistent". It reduces it a lot, but it's still distracting and easy to see on certain hardware (again, larger LCDs). So now you're admitting that these options need to be changed to get close to ATi's IQ? I guess it's a good thing that reviewers don't listen to you, or NV's frames would be down in reviews.
 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
At least people have started to quote the appropriate part of the posts they're responding to. Thank you all! It's much easier to follow when a thread gets long and you're trying to catch up quickly.

This is totally off-topic and I feel like an idiot for saying this, but why do I keep thinking "FiringSquid" instead of "FiringSquad" every time the site is mentioned? Yumm.. :D
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Ackmed, I don't know why you think I "am admitting" these options need to be changed - I've spent what feels like forever here *telling* people to change these options if shimmering bothers them in Quality mode. No it won't completely eliminate shimmer with Quality, because of the different way the various modes access the texturing units. Your LCD problem is just that - your LCD problem and not one I suffer from :)

Now, before you go on the attack once more over your lovely wording "So now you're admiting", just remember who spelled out in their signature (not there now, I know, but it can be if you want it to be...) that they have NO affiliation to anyone, and therefore NOTHING to admit to and who has run from every opportunity to deny or acknowledge certain "focus groups"...
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
The image quality comparison is almost certainly a draw on all but one account. NVIDIA's AA routine, especially with Transparency AA activated, is better at drawing very fine lines, like those in the Half-Life 2 fence screenshot. Whether it's the thin left side of the fence or the small branches on the trees, NVIDIA shows you more, and even more importantly, what they show is in sharper contrast than what ATI's routines deliver.

NVIDIA's anisotropic filtering looks better in screenshots. You'll remember that you can see the cobblestone pattern in the Call of Duty 2 screens far beyond where the ATI image blurs them into flat ground.

 
spank

Apr 6, 2006
32
0
0
Originally posted by: Wreckage
The image quality comparison is almost certainly a draw on all but one account. NVIDIA's AA routine, especially with Transparency AA activated, is better at drawing very fine lines, like those in the Half-Life 2 fence screenshot. Whether it's the thin left side of the fence or the small branches on the trees, NVIDIA shows you more, and even more importantly, what they show is in sharper contrast than what ATI's routines deliver.

NVIDIA's anisotropic filtering looks better in screenshots. You'll remember that you can see the cobblestone pattern in the Call of Duty 2 screens far beyond where the ATI image blurs them into flat ground.

So you prefer to use your graphics card to watch static screenshots instead of playing games? Great for you.

Selective quoting ftw.

NVIDIA's optimizations create the shimmering effect we saw in the Battlefield 2 video, which can range from unnoticeable to distracting, depending on the game.

However, ATI generally has better AA smoothing once you bump things up to 6xAA. I'm going to call it a wash here.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Wreckage
The image quality comparison is almost certainly a draw on all but one account. NVIDIA's AA routine, especially with Transparency AA activated, is better at drawing very fine lines, like those in the Half-Life 2 fence screenshot. Whether it's the thin left side of the fence or the small branches on the trees, NVIDIA shows you more, and even more importantly, what they show is in sharper contrast than what ATI's routines deliver.

NVIDIA's anisotropic filtering looks better in screenshots. You'll remember that you can see the cobblestone pattern in the Call of Duty 2 screens far beyond where the ATI image blurs them into flat ground.

hmm.. regardless of whether one agrees with Brandon or not, how about you post the rest of his comment on this?

NVIDIA's anisotropic filtering looks better in screenshots. You'll remember that you can see the cobblestone pattern in the Call of Duty 2 screens far beyond where the ATI image blurs them into flat ground, but this comes at a steep price. NVIDIA's optimizations create the shimmering effect we saw in the Battlefield 2 video, which can range from unnoticeable to distracting, depending on the game, the scene, and how sensitive you are to it. Also, the optimizations (nvidia's) produce a sharper point of delineation where the card switches from low detail to high detail textures, creating clear steps of detail change relative to the smooth transition of ATI's upcoming hardware (or more accurately, all ati 1k series hardware). ATI's own optimizations aren't without fault either, as some users have reported shimmering with ATI's latest cards as well, but as you can see in the videos, it isn't nearly as pronounced, even in our scenario outlined with Battlefield 2.

Further, the article is flawed in that it doesn't even cover all the features or adjustments included in the respective hardware. Whether one happens to be an ATi or nVidia fan, the article is far from complete, omitting key features.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Shimmering is a divine punishment from above for those too lazy or too arrogant to go in and set things correctly.
Nvidia's defaults make it a "divine punishment", then. It's interesting how you see Nvidia as divine.................

Again, why is this review even worth anything if we already have a better one here in the forums from nitromullet?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: spank
Originally posted by: Wreckage
...
So you prefer to use your graphics card to watch static screenshots instead of playing games, great for you.

Selective quoting ftw.

NVIDIA?s optimizations create the shimmering effect we saw in the Battlefield 2 video, which can range from unnoticeable to distracting, depending on the game.
...
The great part about optimizations is that they are precisely that - optimizations - and they can be explicitly disabled on nvidia cards, without any cryptic "A.I." euphemisms either. The option is right there in front of you. Feel free to use it.
 
spank

Apr 6, 2006
32
0
0
Originally posted by: Gstanfor
The great part about optimizations is that they are precisely that - optimizations - and they can be explicitly disabled on nvidia cards, without any cryptic "A.I." euphemisms either. The option is right there in front of you. Feel free to use it.

If you are going to change the default filtering options, ATi just has to turn on their HQ AF, and nVidia loses the IQ match to the superior ATi filtering again.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I've never said r5xx doesn't have better AF filtering than nvidia (when HQ AF is on). As I've said before, we have ATi to thank for igniting the AF IQ war that led to nvidia abandoning their excellent AF implementation.

Everyone conveniently forgets it was ATi that first compromised trilinear filtering quality and ATi who first decided that bilinear could be used for AF texture stage calculations...
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
Ensure the LOD bias clamp is enabled and all 3 optimizations are disabled under Quality and you'll find the shimmering is pretty much nonexistent.
True but this doesn't help with wiggling textures; the only thing that helps that problem is to switch to HQ.

At default quality my X800XL has slight wiggling textures on rare occasions (where my 7900 GTX doesn't), but then my 7900 GTX still shimmers on rare occasions where my X800 XL doesn't, so I would call it a wash between the two.

nVidia definitely has superior AA though, no question there. Full screen super-sampling modes for single cards and the fact that Transparency AA is better than AAA are two such advantages for nVidia.

NVIDIA?s anisotropic filtering looks better in screenshots. You?ll remember that you can see the cobblestone pattern in the Call of Duty 2 screens far beyond where the ATI image blurs them into flat ground
Still screenshots are generally useless for showing AF or AA since movement during actual gameplay is what we're interested in. A -3.0 LOD for example looks great in screenshots but pokes your eyes out during gameplay.
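The LOD bias clamp and the -3.0 example above can be sketched in a few lines. This is an illustrative model only (a hypothetical helper, not actual driver code or a real driver API): a negative bias makes the sampler pick a more detailed mip than the pixel footprint calls for, which sharpens a still frame but aliases (shimmers) in motion, and the clamp simply refuses negative biases.

```python
# Toy model of the driver's "clamp negative LOD bias" toggle.

def effective_lod(base_lod, app_bias, clamp_negative_bias=True):
    """Return the mip LOD actually sampled.
    base_lod: the level the texture footprint calls for.
    app_bias: the bias requested by the game (e.g. -3.0 for 'sharpness').
    """
    bias = max(app_bias, 0.0) if clamp_negative_bias else app_bias
    return max(base_lod + bias, 0.0)  # LOD can never go below mip 0

# A game asking for a -3.0 bias on a pixel that wants mip level 2:
print(effective_lod(2.0, -3.0, clamp_negative_bias=True))   # 2.0 (clamped)
print(effective_lod(2.0, -3.0, clamp_negative_bias=False))  # 0.0 (sharpest mip, shimmer-prone)
```

With the clamp on, the footprint's own level wins; with it off, the biased sampler under-filters and the screenshot-pretty, gameplay-ugly effect appears.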
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
If you are seeing wiggling textures, then it's probably time to stop smoking/injecting whatever substance it is you are abusing...
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Gstanfor
Everyone conveniently forgets it was ATi that first compromised trilinear filtering quality and ATi who first decided that bilinear could be used for AF texture stage calculations...
Oh yeah! I think I saw a paper on that somewhere here this morning, but I wiped my a$$ with it since it didn't matter anymore. Seriously, if you're going to hang around in the yesterdays of the computer industry, why are you so argumentative with its forefront?

You're saying ATI improved their image quality. Good for them. I never thought I'd see you give them a compliment while trying to bash them, but only you could pull that off.
If you are seeing wiggling textures, then it's probably time to stop smoking/injecting whatever substance it is you are abusing...
Wow, I'm in awe at that intelligent and relevant comment. Grow-up and learn how to discuss something that is somewhat relevant to a thread. As far as drug hallucinations, didn't you try to convince everyone that the 7 series could use HDR+AA in Far Cry?

What other "divine (Nvidia) punishments" are you going to try and defend? How is it a compliment that Nvidia once had a lead in the AF department against ATI but decided to decrease its visual quality with more of the current cards?


 

lopri

Elite Member
Jul 27, 2002
13,314
690
126
Call of Duty 2 on GTX SLI was one of the best gaming experiences I've had. This game may lack the latest shader effects, but the quality of the textures was the highest I've ever seen. Now that I've sold my GTXs, my next purchase is set to be either an X1900XT or an X1950XT, depending on how prices pan out. I'm looking forward to playing this game again on a Radeon card!
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
If you are seeing wiggling textures, then it's probably time to stop smoking/injecting whatever substance it is you are abusing...
Or better yet how about you consult an eye doctor?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Oh, I do consult an eye doctor, every 18 months. My vision is just fine, thank you very much.

Perhaps your monitor is on the way out? That can make things wiggle on screen. Other than that, this would have to be the most outlandish claim I've ever seen you trot out (and that's saying something).
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
Oh, I do consult an eye doctor, every 18 months.
You should ask for a refund.

Perhaps your monitor is on the way out?
:roll:

That can make things wiggle on screen.
So can nVidia's Quality mode.

Other than that, this would have to be the most outlandish claim I've ever seen you trot out (and that's saying something).
Outlandish? I think not. Fire up something like Serious Sam TFE and check out the pillars on the outside temple under Quality mode.

If you can't see those pillars wiggling like flags as you move towards them then it's time for you to change eye doctors.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
So, do the textures giggle when they wiggle? Enquiring minds want to know...

Try turning the conformant texture clamp on, you clueless moron... This is just like NOLF1 - and that is a GAME ENGINE FAULT, not an nvidia quality mode or driver fault...
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,004
126
Try turning the conformant texture clamp on, you clueless moron
It's already on by default, Sherlock.

This is just like NOLF1 - and that is a GAME ENGINE FAULT, not an nvidia quality mode or driver fault...
So you're admitting there is texture wiggling then?

Not to mention that NOLF is Direct3D but conformant texture clamp is solely for OpenGL.

Your antics are comical.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
No, I'm not admitting anything. The situation is just like the NOLF situation (and the Far Cry shadow + early GF6 situation): the visual stuff-ups are caused by the DEVELOPER...
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Right, so that is why they're almost completely absent on another card? Or even the same card with a different setting?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
That's what happens when you do dumb things like looking for specific device IDs, capbits, or OGL extensions in your code and basing critical rendering decisions off of them. It's the reason why m$ is moving away from capbits in DX10 and why nvidia advises developers not to rely on device IDs.
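The device-ID-vs-capability point can be sketched like this (illustrative Python; the device IDs, function names, and extension string here are made-up examples, not anyone's real whitelist):

```python
# Gating a rendering path on a hard-coded device-ID list vs. on what
# the driver actually advertises. The whitelist silently breaks on
# every GPU released after it was written.

KNOWN_FAST_DEVICES = {0x0191, 0x0193}  # hypothetical whitelist

def supports_fbo_by_deviceid(device_id):
    """The fragile way: whitelist device IDs. Any unlisted (newer)
    card is wrongly denied the feature."""
    return device_id in KNOWN_FAST_DEVICES

def supports_fbo_by_extension(extension_string):
    """The robust way: ask the driver what it actually exposes
    (here, checking a space-separated OpenGL extension string)."""
    return "GL_EXT_framebuffer_object" in extension_string.split()

# A new, unlisted GPU that nonetheless advertises the extension:
exts = "GL_ARB_multitexture GL_EXT_framebuffer_object"
print(supports_fbo_by_deviceid(0x0400))   # False - whitelist misses it
print(supports_fbo_by_extension(exts))    # True - capability check works
```

That is the failure mode being described: a decision keyed to yesterday's hardware list instead of today's reported capabilities.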