FiringSquad does image quality comparisons...again

yacoub

Golden Member
May 24, 2005
1,991
14
81
Interesting. As I wrote in their comments:

I recently upgraded from an X800XL to a 7900GT. Fresh out of the box, the 7900GT on 91.33 drivers playing CS:Source's cs_assault map was absolutely HORRID. Nearly every texture with lines in it shimmered unbelievably badly when moving my view around. I'd never experienced that with my X800XL and was shocked and uncertain how to remedy it. I turned up AA and AF to little avail. Finally I was told to try locking a LOD Bias clamp (the actual wording varies, but just put a check next to the item under Direct3D and OpenGL that talks about clamping LOD or LOD Bias). That fixed it almost completely. Then upping AA and AF again completely removed what was left.

So straight out of the box? NVidia's shimmering was atrocious, ATI's was non-existent. After tweaking, NVidia's went away pretty much completely and the overall graphical quality was indeed 'sharper' and 'better', although certain colors are not as vibrant as they were with ATi.

So if you want an easier test for shimmering, load up cs_assault and walk around outside a bit. It's readily apparent and completely obvious without having to go out of your way to look for a specific spot of it somewhere.
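For anyone wondering what that clamp setting actually does: the mip level a texture sample uses is roughly log2 of how many texels fall under a pixel, plus a LOD bias, and a negative bias picks a sharper mip than the footprint warrants - which is exactly the detail that crawls and shimmers in motion. Here's a toy sketch of the idea (my own illustration, not NVIDIA's driver code; mipLevel() and the numbers are made up for demonstration):

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Toy illustration of mip selection with a LOD bias (not real driver code).
// A negative bias selects a sharper (lower-numbered) mip than the pixel
// footprint warrants, which under-filters the texture and shimmers in motion.
// The driver's "clamp" option effectively forces the bias to be >= 0.
double mipLevel(double texelsPerPixel, double lodBias, bool clampBias) {
    double bias = clampBias ? std::max(lodBias, 0.0) : lodBias;
    double lod  = std::log2(texelsPerPixel) + bias;
    return std::max(lod, 0.0);  // mip 0 is the full-resolution texture
}

int main() {
    double footprint = 8.0;  // a distant wall covering ~8 texels per screen pixel
    std::printf("no bias      -> mip %.2f\n", mipLevel(footprint,  0.0, false));
    std::printf("bias -1.5    -> mip %.2f\n", mipLevel(footprint, -1.5, false)); // too sharp, shimmers
    std::printf("bias clamped -> mip %.2f\n", mipLevel(footprint, -1.5, true));  // back to the right mip
    return 0;
}
```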
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
I liked this review. Well, maybe some people prefer higher resolutions without AA to lower resolutions with AA? Who knows.
 

Kromis

Diamond Member
Mar 2, 2006
5,214
1
81
To be quite frank, I can't tell small differences in image quality because it's the action that takes away my attention.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
I tend to agree with most of their AA findings (I've always found TrAA to look better than AAA) but I really feel they totally glossed over the shimmering issue and downplayed it.

nVidia's default quality is really quite atrocious and unusably bad.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Using a 7800GT @570/1300 right now. TrAA is better than AAA, although I am missing the vibrance and better AF that I had. :(
 

Raloth

Member
Jun 12, 2006
65
0
0
Very disappointing review for one reason. Why would anyone try to illustrate differences in image quality and then put up low-quality JPEG close-ups? I can't get a good feel for the image with such noticeable compression artifacts.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: yacoub
Interesting. As I wrote in their comments:

I recently upgraded from an X800XL to a 7900GT. Fresh out of the box, the 7900GT on 91.33 drivers playing CS:Source's cs_assault map was absolutely HORRID. Nearly every texture with lines in it shimmered unbelievably badly when moving my view around. I'd never experienced that with my X800XL and was shocked and uncertain how to remedy it. I turned up AA and AF to little avail. Finally I was told to try locking a LOD Bias clamp (the actual wording varies, but just put a check next to the item under Direct3D and OpenGL that talks about clamping LOD or LOD Bias). That fixed it almost completely. Then upping AA and AF again completely removed what was left.

So straight out of the box? NVidia's shimmering was atrocious, ATI's was non-existent. After tweaking, NVidia's went away pretty much completely and the overall graphical quality was indeed 'sharper' and 'better', although certain colors are not as vibrant as they were with ATi.

So if you want an easier test for shimmering, load up cs_assault and walk around outside a bit. It's readily apparent and completely obvious without having to go out of your way to look for a specific spot of it somewhere.

Agreed.
It was a real shock when I went from the 9800 Pro to a 6600GT.

Why anyone would use anything below HQ/BQ is beyond me ;)
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Testing IQ but ignoring HQ setting and HQAF? What a waste of an article. An extra page dedicated to that would go a long way.
 

Auric

Diamond Member
Oct 11, 1999
9,591
2
71
NVIDIA has always been known for subpar quality and/or taking an inordinate performance hit when the quality settings were turned up. Perhaps that was acceptable in ye olden days, but to me it is backwards. I care most about IQ, so I reckon the highest should be the default and only dumbed down when absolutely necessary (such as after having the card for a couple of years but wanting to play then-current games while holding out for the next upgrade). I am certainly not going to compare competing cards at low quality settings. Both NVIDIA and ATI seem to have pretty good hardware these days, so they should put an end to the software shens if it is intentional, or otherwise make some significant changes to the software side of their bidnesses to minimize the taint.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: BFG10K
I tend to agree with most of their AA findings (I've always found TrAA to look better than AAA) but I really feel they totally glossed over the shimmering issue and downplayed it.

nVidia's default quality is really quite atrocious and unusably bad.

And except for the dumbed-down AF quality (thanks, ATi, for starting an AF filtering war and trying to redefine AF as using bilinear instead of trilinear filtering - it's just what we all wanted...) it's all completely fixable and ends up better than ATi's quality (AF & HDR AA excepted).
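For anyone who hasn't followed that filtering argument: bilinear filtering reads only the nearest mip level, while trilinear blends the two adjacent mips, which is what hides the visible mip-transition bands. A rough sketch of the difference (sampleMip() is just a stand-in for a real per-mip fetch, not any actual API):

```cpp
#include <cmath>
#include <cstdio>

// Stand-in for a bilinear fetch from a given mip level of some texture.
double sampleMip(int mip) { return 1.0 / (1 << mip); }

// "Bilinear" mip handling: snap to the nearest mip level only.
double bilinearNearestMip(double lod) { return sampleMip((int)std::lround(lod)); }

// Trilinear: blend the two adjacent mips by the fractional LOD,
// so there is no hard band where the mip level switches.
double trilinear(double lod) {
    int    lo = (int)std::floor(lod);
    double f  = lod - lo;
    return (1.0 - f) * sampleMip(lo) + f * sampleMip(lo + 1);
}

int main() {
    for (int i = 0; i <= 4; ++i) {
        double lod = 1.6 + 0.2 * i;  // sweep across a mip transition
        std::printf("lod %.1f  nearest-mip %.3f  trilinear %.3f\n",
                    lod, bilinearNearestMip(lod), trilinear(lod));
    }
    return 0;
}
```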
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: RobertR1
Testing IQ but ignoring HQ setting and HQAF? What a waste of an article. An extra page dedicated to that would go a long way.


I agree.

I'm also confused as to why they show TrAA and AAA (which they should) in this article, yet never benchmark with them.

And yeah, they did gloss over shimmering. At least it's mentioned for a change, though.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: Gstanfor
Originally posted by: BFG10K
I tend to agree with most of their AA findings (I've always found TrAA to look better than AAA) but I really feel they totally glossed over the shimmering issue and downplayed it.

nVidia's default quality is really quite atrocious and unusably bad.

And except for the dumbed-down AF quality (thanks, ATi, for starting an AF filtering war and trying to redefine AF as using bilinear instead of trilinear filtering - it's just what we all wanted...) it's all completely fixable and ends up better than ATi's quality (AF & HDR AA excepted).

Well, they should bench at HQ then; you just have to read either thread that used the 84.43s, 91.33 betas and 91.33 WHQLs to see the impact HQ has over default Q...
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Framerate really does not worry me very much 99% of the time, Dug. Like I said in another thread, we've (those of us who have followed the "issue" anyway) known the performance impact of HQ vs Q since very soon after nv40 launched.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: 5150Joker
Originally posted by: josh6079
Using a 7800GT @570/1300 right now. TrAA is better than AAA, although I am missing the vibrance and better AF that I had. :(

Where's your XTX?


Beat me to it! Did something happen to it?
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: josh6079
Using a 7800GT @570/1300 right now. TrAA is better than AAA, although I am missing the vibrance and better AF that I had. :(

You can actually do a lot about the vibrance with the digital vibrance slider.

http://images.anandtech.com/reviews/vid...orceware91xx/adjustdesktopsettings.png

Digital Vibrance is kind of like hot sauce, though (a little bit can go a long way), so bump the slider up in very small increments to find the setting you like best. I think I settled on somewhere between 3-6 (definitely under 10). With DV I found the image to be much more pleasing than the default setting, which feels somewhat cold and drab to me in comparison to ATI's default.
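If it helps to picture what the slider is doing, my rough mental model (an assumption on my part, not NVIDIA's documented algorithm) is that DV pushes each channel away from the pixel's luma, i.e. a saturation-style boost. A toy sketch of that idea:

```cpp
#include <algorithm>
#include <cstdio>

struct RGB { double r, g, b; };  // channels in [0, 1]

// Toy vibrance-style boost: push each channel away from the pixel's luma.
// This is only my guess at the kind of operation DV performs, not NVIDIA's
// actual algorithm; 'amount' plays the role of the slider (e.g. 0.09 for ~9%).
RGB boostVibrance(RGB c, double amount) {
    double luma = 0.299 * c.r + 0.587 * c.g + 0.114 * c.b;
    auto push = [&](double ch) {
        return std::min(1.0, std::max(0.0, luma + (ch - luma) * (1.0 + amount)));
    };
    return { push(c.r), push(c.g), push(c.b) };
}

int main() {
    RGB drab = { 0.55, 0.45, 0.40 };       // a slightly washed-out tone
    RGB out  = boostVibrance(drab, 0.09);  // roughly a small slider bump
    std::printf("before: %.3f %.3f %.3f  after: %.3f %.3f %.3f\n",
                drab.r, drab.g, drab.b, out.r, out.g, out.b);
    return 0;
}
```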
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Originally posted by: nitromullet
Originally posted by: josh6079
Using a 7800GT @570/1300 right now. TrAA is better than AAA, although I am missing the vibrance and better AF that I had. :(

You can actually do a lot about the vibrance with the digital vibrance slider.

http://images.anandtech.com/reviews/vid...orceware91xx/adjustdesktopsettings.png

Digital Vibrance is kind of like hot sauce, though (a little bit can go a long way), so bump the slider up in very small increments to find the setting you like best. I think I settled on somewhere between 3-6 (definitely under 10). With DV I found the image to be much more pleasing than the default setting, which feels somewhat cold and drab to me in comparison to ATI's default.

Yeah, I think I'm at about 9% now; any more and it starts to look cheesy... too much is plain horrible.
 

Trevelyan

Diamond Member
Dec 10, 2000
4,077
0
71
Here seems to be the generalization that I found to be true:

Nvidia default:
-Faster
-Image quality suffers

ATI's default:
-Slower
-Image quality is noticeably better

Nvidia tweaked:
-Slower
-Image quality is better with AA

ATI tweaked:
-Faster
-Image quality is better with AF


For me, I notice the AF differences WAY more than the AA differences. Plus, ATI cards have always given me better performance, probably because I mainly play Source engine games, where ATI has an advantage.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
No surprises here.

NV for best AA.
ATI for best AF.

Overall, they are equal solutions in perf/IQ, so you should pick based on warranty (Nvidia), power consumption (Nvidia), multiGPU implementation (Nvidia), quiet operation (Nvidia), and driver quality/features/support (Nvidia).

Very easy analysis, and that's why I use Nvidia.
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: RobertR1
Testing IQ but ignoring HQ setting and HQAF? What a waste of an article. An extra page dedicated to that would go a long way.

I brought up a similar point over at FS. Brandon replied to me that they plan on a Part 2:


cainam (1) Aug 23, 2006 - 01:46 am
what really leaves me scratching my head is that HQAF is not used in an article regarding image quality. wth??

nvidia's texture filtering is horrid with regard to mipmap transitions, and if compared with ati's HQAF, it's also bland and fuzzy in comparison.

come on, FS, if you're going to compare image quality, let's not completely ignore a huge feature which improved IQ dramatically - even if it's only available from one manufacturer.

GX-Brandon (38) Aug 23, 2006 - 04:05 am
That's for Part 2...

GX-Brandon (38) Aug 23, 2006 - 05:38 am
Okay, now that the NDA has expired on the X1950 XTX we can confirm that the ATI card used was in fact the X1950. This article was actually supposed to go inside the X1950 XTX article, but the premise expanded once we decided to include video footage, so we felt it was worthy of its own dedicated article. Once you've read the article you can then proceed to the X1950 story to see the actual benchmarks we got at those settings.

In Part 2 we plan on opening the control panel up a bit more and really going further into the topic. Again, any feedback/suggestions you've got would be great to hear...

 

Auric

Diamond Member
Oct 11, 1999
9,591
2
71
Re: BF2 "hi-res" screenshots... someone please tell FS they must be retarded trying to do an IQ comparison at a distorted aspect ratio. As it is though, NVIDIA looks surprisingly better and it is difficult to believe that 4xAA is enabled on ATI. I wonder if the Stretch-O-Vision affects/interferes with each differently. Haven't seen the shimmer vid yet.

-
I took some screenshots at the same location with both 2xAA and 4xAA and it is clearly not enabled at any level in their ATI pic (20.png). Also, if they didn't want to be trebly teh dumbs they would have taken identical screenshots from a fixed position - namely vehicles, with the view centered on a suitable pixel or thereabouts.

The NVIDIA shimmer is rather bad. Surely that's not "The Way It's Meant To Be Played", and so should not be the driver default. I wonder if the hardware guys at these companies are disgustipated with the software guys over such shens?
 
Apr 6, 2006
32
0
0
Originally posted by: Crusader
Overall, they are equal solutions in perf/IQ, so you should pick based on warranty (Nvidia), power consumption (Nvidia), multiGPU implementation (Nvidia), quiet operation (Nvidia), and driver quality/features/support (Nvidia).

Very easy analysis, and that's why I use Nvidia.

warranty: ATI has PowerColor and VisionTek offering lifetime warranties.
power consumption: an X1950 XTX paired with an ATI chipset has lower power consumption than a comparable NVIDIA card on an nForce chipset.
multiGPU implementation: I will quote HardOCP here, who have been bad-mouthing CrossFire a lot:
I just wanted to take an aside and talk about CrossFire here a bit. CrossFire isn't exactly known as the friendliest multi-GPU platform to get working. We've had many bouts of trouble with CrossFire platforms here in the past resulting in it simply not working for us.
I have to say, with the current setup I used for testing shown here today, and the 5 games we tested in, I did not experience a bit of trouble setting up CrossFire.
quiet operation: The new X1950 XTX cooler isn't that loud, and with it dumping the heat outside the case you don't need extra case fans pulling the hot air out of the case.
driver quality/features/support: in what ways are NVIDIA's drivers better?

I would say that the fastest multi-card solution for DirectX games would be two X1950 XTXs, offering the lowest power draw and the best image quality available.
 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
Originally posted by: Crusader
No surprises here.

NV for best AA.
ATI for best AF.

Overall, they are equal solutions in perf/IQ, so you should pick based on warranty (Nvidia), power consumption (Nvidia), multiGPU implementation (Nvidia), quiet operation (Nvidia), and driver quality/features/support (Nvidia).

Very easy analysis, and that's why I use Nvidia.

Awesome! I'll take 3!
Wait... you're NOT selling these? WTF.