Anyone else notice this?

Therk

Senior member
Jul 15, 2005
I remember back nearly two years ago I upgraded from a 9600 XT to a 6800 GS. As soon as I launched the first game with the 6800 GS, I thought the image quality was quite poor compared to the ATI card. At first I figured it just seemed that way to me, but when I brought it up in a conversation, most people seemed to agree with me!

They also agreed that the ATI cards provided much sharper and 'more defined' image quality and that nvidia's anisotropic filtering was really lacking... I'm thinking this might be why nvidia always seems to come out on top in benchmarks, since lower filtering quality leaves more performance on the table. Did anyone else notice this? (Mainly concerning the 6xxx/7xxx series.)
 

Matt2

Diamond Member
Jul 28, 2001
It's a well-known fact that ATI had superior IQ compared to NVidia's offerings in the 6xxx/7xxx series.

That all changed with G80, however. Pending the IQ tests of R600, Nvidia has better IQ than anything you can buy from ATI at the moment.
 

Captante

Lifer
Oct 20, 2003
I had the same experience moving from my original 9800 Pro to a 6800 GT, especially in 2D, and my X1900 XTX improved on the GT as well. Going from the X1900 XTX to my current 8800 GTX, 2D IQ hasn't changed at all but 3D has gotten better, so it works both ways.
 

PingSpike

Lifer
Feb 25, 2004
Same deal with me: I moved from a 6800 NU to an X850 XT and it was much better looking. Nvidia finally seems to have gotten on board with the 8800 series, though.
 

brencat

Platinum Member
Feb 26, 2007
Therk, my brother games with an nv 6800 and I have the Radeon X800 XL. No contest... the image on my ATI card is way better. It's blatantly obvious in games like BF2. The best way to describe the difference is that the image is much softer with the ATI, whereas on his nv card the blowing grass, the tank barrels, artillery explosions, etc. look pixelated. Kind of like looking at a TV where the sharpness is turned up much too far. And I'm not even using AA either.
 

nitromullet

Diamond Member
Jan 7, 2004
Yep, ATI had better AF image quality and more vibrant colors (by default) with the X19xx than NV did with the 7xxx, but NV had better AA image quality with SSAA than ATI's AAA. The colors could be adjusted in the NV control panel with Digital Vibrance to make them 'pop' more like ATI's. With the 8-series NV has absolutely fantastic image quality overall, even better than the X19xx series now that the old SSAA modes have been re-introduced in the driver. When AMD/ATI's new cards launch, though, I'm pretty confident that they will catch up or even raise the bar again - there is a rumored 24x AA mode for single cards!
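
For what it's worth, the AF quality differences come entirely from the hardware/driver side: the application only asks for a degree of anisotropy through the API, and each vendor decides how to actually place the samples. A minimal D3D9 sketch of that request (just an illustration, assuming an already-created IDirect3DDevice9* named device):

#include <d3d9.h>

// Ask D3D9 for anisotropic filtering on texture stage 0.
// The app only picks the *degree* of anisotropy; sample placement
// and any angle-dependent shortcuts are the driver/hardware's call,
// which is why the same "16x AF" setting can look so different
// between vendors.
void EnableAF(IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps); // MaxAnisotropy is typically 16 on this era's parts

    device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, caps.MaxAnisotropy);
}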
 

Cookie Monster

Diamond Member
May 7, 2005
Originally posted by: nitromullet
Yep, ATI had better AF image quality and more vibrant colors (by default) with the X19xx than NV did with the 7xxx, but NV had better AA image quality with SSAA than ATI's AAA. The colors could be adjusted in the NV control panel with Digital Vibrance to make them 'pop' more like ATI's. With the 8-series NV has absolutely fantastic image quality overall, even better than the X19xx series now that the old SSAA modes have been re-introduced in the driver. When AMD/ATI's new cards launch, though, I'm pretty confident that they will catch up or even raise the bar again - there is a rumored 24x AA mode for single cards!

QFT.

Most people feel ATi had better IQ because the colours at 'default' were much more vibrant. But really, once you use DV, you can get that same ATi 'softness' in that sense.
 

Atty

Golden Member
Aug 19, 2006
I find the colors and image quality better with my 8800 than I did with my X1900, although that's pretty much to be expected from what others are saying.

The Digital Vibrance on nVidia is incredible though; it really helps bring out the colors on my TN panel. :)
 

Therk

Senior member
Jul 15, 2005
So what should I do with Digital Vibrance to get the same or a similar effect to the ATI cards? Putting it on Low seems to have improved things heaps.
 

BFG10K

Lifer
Aug 14, 2000
The G7x in particular had really poor out-of-the-box AF quality. Running under HQ made things a lot better, but my X800 XL still had less shader aliasing in many places.

G80 OTOH has the best AF implementation we've ever seen in consumer space.
 

Zenoth

Diamond Member
Jan 29, 2005
I moved to a Radeon X1800 XL over a GeForce 7 series card back in late 2005 (in December, I believe) for two reasons: the price difference at the time, and ATi's high-quality AF, which simply surprised me when I first saw test shots on known review sites comparing images from given games with and without HQ AF. I loved the IQ from ATi, so I decided to go with it, and I never regretted the purchase.

It's been almost a year and a half since then, and now my system is starting to age, but it still does the job.

When I first saw nVidia's newer AF in the GeForce 8 series I was blown away by the results on paper. I looked at the screenshot comparisons between their new AF implementation and the X1K's HQ AF, and it honestly blew me away in some situations. I have seen the GeForce 8 in action at my local store (the owner runs demonstration systems sometimes, though not always) and had some time to see the AF in Far Cry, AoE III, Prey, Half-Life 2 and CS: Source. Now I know that the GeForce 8's AF quality is a real credit to the engineers at nVidia; they've surpassed everything the GeForce 7 series could ever have hoped to reach. Without exaggerating, though, there were a few games where I couldn't find a single difference during gameplay. But games like Half-Life 2 and Far Cry showed the advantages clearly. I play those games at home often enough to remember how they look on my system, so I knew what I was seeing; I could already compare the IQ in my head.

Now the question is... will R600 bring even better AF technology/methods to the table, over the already better-than-decent HQ AF? Will there be something like HQ AF 2? Or even HD AF? I certainly hope so, because I personally couldn't care less about AA. It's all about textures and filtering for me. If I had to choose between playing a game with blurred textures everywhere or a game where all the edges are jagged, I'd go with the jagged edges any day. Give me a game with jagged edges but very well filtered, crisp, crystal-clear textures all the way out to the horizon and I'll be in heaven. I would even blindly ignore the jaggies with pleasure.
 

BFG10K

Lifer
Aug 14, 2000
Originally posted by: Zenoth
Now the question is... will R600 bring even better AF technology/methods to the table, over the already better-than-decent HQ AF?
I suspect they'll match the G80's implementation, but it'll be tough to beat because the method is almost perfect.

One way they could beat it is by taking more samples and offering a 32x setting or something. That would really cause a stir, I think. ;)
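
Some rough, hypothetical worst-case math on why more samples aren't free (this assumes a naive implementation that takes one trilinear probe per step of anisotropy, with none of the adaptive shortcuts real hardware uses):

#include <cstdio>

// Worst-case cost sketch: NxAF = up to N trilinear probes along the
// texture footprint's major axis; each trilinear probe blends
// 2 bilinear taps of 4 texels each. Real hardware cuts this down
// adaptively, so treat these as upper bounds, not measurements.
int main()
{
    for (int aniso = 2; aniso <= 32; aniso *= 2)
        std::printf("%2dx AF: up to %2d probes, %3d texel fetches per pixel\n",
                    aniso, aniso, aniso * 2 * 4);
    return 0;
}

By that naive count, a 32x mode would double the worst-case texture fetches of today's 16x, which is probably part of why nobody has shipped one yet.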
 

Auric

Diamond Member
Oct 11, 1999
Gee, I thought it was common knowledge that until lately ATI dominated IQ-wise.