Anyone feel that ATI/AMD still has better image quality than Nvidia?

sindows

Golden Member
Dec 11, 2005
1,193
0
0
I thought this was common knowledge 4-5 years ago, and that Nvidia had "caught up" with the 7x00 series, but IMO the disparity still exists, at least in my experience.

Granted, I'm not using the most up-to-date technology, but I'm playing around with my PC and I notice a huge difference in image quality between a 9550 and a 7600GS. Okay, the Radeon has nowhere near the power of the 7600, but the 3D image it produces seems much clearer/sharper/more colorful. It feels like the 7600 adds a blur to games by default. It's the same PC hooked up to the same monitor, and the games I've installed haven't had their settings changed, so it's pretty much an apples-to-apples comparison.

Am I doing something wrong with the 7600GS? I don't think it has to do with leftover drivers, because I follow the 1) uninstall, 2) boot into safe mode and run Driver Cleaner, 3) boot into regular mode and run Driver Cleaner again, and 4) reboot and install drivers routine.

Anyone feel the same? Opposite?
 

Quiksilver

Diamond Member
Jul 3, 2005
4,725
0
71
I don't know, but I wish someone would actually compare the two across a number of games and anisotropic filtering and anti-aliasing levels so we can finally put this silly question to rest, no matter how small the difference is.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
The 7 series is known to have poor-quality AF and texture shimmering. The "colorful" thing, though, sounds like just a matter of the color profile settings in the driver.

The 8 series, however, probably has the best IQ available now, with its excellent AF and multitude of AA options. I believe the 2900/3870 AF is identical to the HQ AF on the X1 series, which is good but does have a bit of room for improvement.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: sindows
I thought this was common knowledge 4-5 years ago, and that Nvidia had "caught up" with the 7x00 series, but IMO the disparity still exists, at least in my experience.

Granted, I'm not using the most up-to-date technology, but I'm playing around with my PC and I notice a huge difference in image quality between a 9550 and a 7600GS. Okay, the Radeon has nowhere near the power of the 7600, but the 3D image it produces seems much clearer/sharper/more colorful. It feels like the 7600 adds a blur to games by default. It's the same PC hooked up to the same monitor, and the games I've installed haven't had their settings changed, so it's pretty much an apples-to-apples comparison.

Am I doing something wrong with the 7600GS? I don't think it has to do with leftover drivers, because I follow the 1) uninstall, 2) boot into safe mode and run Driver Cleaner, 3) boot into regular mode and run Driver Cleaner again, and 4) reboot and install drivers routine.

Anyone feel the same? Opposite?

you are right about one thing ... you are not up-to-date

i had a g80 8800GTS 640-OC to compare with a r600 HD2900xt and they have the same [beautiful] IQ; in fact several of us did reviews this summer:

In House HD2900XT vs. 8800GTS 640

read the first 5 posts.

the earlier series - i had 6800GS - was not quite as good as my X1950p/512m, imo.
 

Blacklash

Member
Feb 22, 2007
181
0
0
Technically speaking, nVidia has had the best AF since the release of the 8 series. I was a loud critic of the nVidia 7 series AF and some of their optimizations that caused shimmer. I went from a 7800GT to an overclocked X1900XT. That situation turned when the 8800s hit.

I've used current products from both camps and 90% of the time I can't tell the difference. In some cases the nVidia 8 series AF will clean up distant textures at certain angles better than their competition.
 

LOUISSSSS

Diamond Member
Dec 5, 2005
8,771
57
91
ATi had the lead up through their X1950 series, besting anything Nvidia had at the time.

Currently I'm not sure whose image quality is better, but I believe they're more or less the same; I'd like to read a detailed comparison too.

I'm currently running an 8800GT, coming from an X1900XT, but would like to see a 3870-series card compared to the G92.

Originally posted by: ExarKun333
No.

If by image quality you mean slower frame rates, then yes.

Ignored.
That's a stupid statement.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
nVidia has the IQ crown with the 8xxx series, both with AF and AA.
 

qbfx

Senior member
Dec 26, 2007
240
0
0
Originally posted by: ExarKun333
No.

If by image quality you mean slower frame rates, then yes.

but if by image quality he means who's the top noob in AT, then yes - it's you!
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: qbfx
Originally posted by: ExarKun333
No.

If by image quality you mean slower frame rates, then yes.

but if by image quality he means who's the top noob in AT, then yes - it's you!

Actually, his statement could be accurate for reasons other than what you'd think at first glance:

Comparing single-GPU cards, there are probably a lot of games where ATi's best single-GPU card (the HD3870) would not be able to run at as high a resolution and/or AA settings as a number of NVIDIA cards.

Also, ATi's best dual-GPU card (the 3870X2) would be in the same position in any game that doesn't scale with Crossfire.

So, technically what he said could be accurate. (although if this was his intent, it wasn't very clearly stated)

As far as the OP goes:

Your thread topic is somewhat misleading. It implies you're referring to the current state of affairs, when you're actually referring to years-old video cards people likely aren't even buying anymore.

NVIDIA's AF is angle-independent (it performs filtering equally at all angles) while ATi's is not. The difference isn't large, but this makes it impossible for ATi to have better AF.
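For the curious, "angle-independent" here is about how the anisotropy ratio is derived from a pixel's texture-space footprint. A rough Python sketch of the standard formula (per the OpenGL EXT_texture_filter_anisotropic extension; actual vendor hardware paths differ, and the angle-dependent shortcuts themselves are not modeled here):

```python
import math

def aniso_samples(dudx, dvdx, dudy, dvdy, max_aniso=16):
    """Approximate the number of anisotropic filtering samples taken for
    one pixel, following the EXT_texture_filter_anisotropic formula:
    ratio of the longer to the shorter texture-space footprint axis,
    clamped to the driver's max AF level."""
    px = math.hypot(dudx, dvdx)  # footprint length along screen x
    py = math.hypot(dudy, dvdy)  # footprint length along screen y
    p_max, p_min = max(px, py), min(px, py)
    if p_min == 0:
        return max_aniso
    return min(math.ceil(p_max / p_min), max_aniso)

# A surface viewed at a grazing angle stretches the footprint along one
# axis, so it needs many samples:
print(aniso_samples(8.0, 0.0, 0.0, 1.0))  # 8
```

An angle-dependent implementation reduces that ratio for rotated (non-axis-aligned) footprints to save bandwidth, which is one source of the blur and shimmer described earlier in the thread.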

NVIDIA also has some AA modes that are considered better than what ATi can do at this point.

So the answer to your question would have to be "No".
 

qbfx

Senior member
Dec 26, 2007
240
0
0
Originally posted by: nRollo
Originally posted by: qbfx
Originally posted by: ExarKun333
No.

If by image quality you mean slower frame rates, then yes.

but if by image quality he means who's the top noob in AT, then yes - it's you!

Actually, his statement could be accurate for reasons other than what you'd think at first glance:

[...]

A statement like this I find lame and useless regardless of its author's intent. It's disinformation; I could just as well say "If by performance you mean slower floating-point operations, then yes, the Q6600 is better than a K6," right? :D
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: qbfx
Originally posted by: nRollo
Originally posted by: qbfx
Originally posted by: ExarKun333
No.

If by image quality you mean slower frame rates, then yes.

but if by image quality he means who's the top noob in AT, then yes - it's you!

Actually, his statement could be accurate for reasons other than what you'd think at first glance:

[...]

A statement like this I find lame and useless regardless of its author's intent. It's disinformation; I could just as well say "If by performance you mean slower floating-point operations, then yes, the Q6600 is better than a K6," right? :D

Whether it's lame or not, prove it wrong. If you can, so much the better. Simple.
 

qbfx

Senior member
Dec 26, 2007
240
0
0
Originally posted by: keysplayr2003
Originally posted by: qbfx
Originally posted by: nRollo
Originally posted by: qbfx
Originally posted by: ExarKun333
No.

If by image quality you mean slower frame rates, then yes.

but if by image quality he means who's the top noob in AT, then yes - it's you!

Actually, his statement could be accurate for reasons other than what you'd think at first glance:

[...]

A statement like this I find lame and useless regardless of its author's intent. It's disinformation; I could just as well say "If by performance you mean slower floating-point operations, then yes, the Q6600 is better than a K6," right? :D

Whether it's lame or not, prove it wrong. If you can, so much the better. Simple.

I don't support ATi/AMD over nVidia (or vice versa), and I haven't stated the opposite of his comment, so I don't think I have to prove anything. The thread isn't about frame rates anyway. I just find comments of this sort lame, that's it.
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Here's a somewhat interesting article with IQ comparisons between the 3870 and 8800GT.

It looks to me like the 3870 has a bit better definition/contrast in most of the pics, but in the in-game Crysis pic in the water, I think the 8800GT looks better. <shrug> I've never really tweaked the colors on my video cards; they always look fine to me, so I don't know if the images could be tweaked to look the same. Anyway, the pics are there, take them for whatever they're worth.
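If anyone wants to go beyond eyeballing the pics, screenshot comparisons can be made objective with a simple per-pixel difference. A minimal sketch (a toy over raw RGB tuples; a real comparison would load lossless screenshots captured at identical in-game settings):

```python
def mean_pixel_diff(img_a, img_b):
    """Mean absolute per-channel difference between two same-sized
    images, each given as a flat list of (r, g, b) tuples.
    0.0 means the frames are identical."""
    assert len(img_a) == len(img_b), "screenshots must match in size"
    total = sum(abs(ca - cb)
                for pa, pb in zip(img_a, img_b)
                for ca, cb in zip(pa, pb))
    return total / (3 * len(img_a))

# Two 2-pixel "screenshots" differing only in brightness:
a = [(100, 100, 100), (200, 200, 200)]
b = [(110, 110, 110), (210, 210, 210)]
print(mean_pixel_diff(a, b))  # 10.0
```

A roughly uniform offset across the whole frame points to driver color defaults (gamma, saturation) rather than any real AA/AF quality gap.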
 
Dec 30, 2004
12,553
2
76
Originally posted by: Quiksilver
I don't know, but I wish someone would actually compare the two with a number of games, anisotropic filtering and anti-aliasing levels so we can finally put this silly question to rest; no matter how small the difference is.

It won't ever be put to rest; these things change any time they change something in the drivers. A bit is a bit is a bit. A bit the driver changes is not.
 

JACKDRUID

Senior member
Nov 28, 2007
729
0
0
Originally posted by: SlowSpyder
Here's a somewhat interesting article with IQ comparisons between the 3870 and 8800GT.

It looks to me like the 3870 has a bit better definition/contrast in most of the pics, but in the in-game Crysis pic in the water, I think the 8800GT looks better. <shrug> I've never really tweaked the colors on my video cards; they always look fine to me, so I don't know if the images could be tweaked to look the same. Anyway, the pics are there, take them for whatever they're worth.

I'd agree... the 3870 has better contrast, but the 8800 looks much cleaner/sharper. If I were to rate them, I'd rate them the same.
 

ja1484

Platinum Member
Dec 31, 2007
2,438
2
0

How you "feel" is pretty irrelevant, because numerous IQ comparisons on reputable hardware sites have objectively shown that IQ is even these days.
 

Quiksilver

Diamond Member
Jul 3, 2005
4,725
0
71
Originally posted by: SlowSpyder
Here's a somewhat interesting article with IQ comparisons between the 3870 and 8800GT.

It looks to me like the 3870 has a bit better definition/contrast in most of the pics, but in the in-game Crysis pic in the water, I think the 8800GT looks better. <shrug> I've never really tweaked the colors on my video cards; they always look fine to me, so I don't know if the images could be tweaked to look the same. Anyway, the pics are there, take them for whatever they're worth.

Nice find. It does appear that ATI has better contrast than Nvidia, but as far as AA and AF go, I can barely tell the difference; the only real clue is the tree line in the Crysis images. Other than that it's exactly the same.
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: qbfx
Originally posted by: nRollo
Originally posted by: qbfx
Originally posted by: ExarKun333
No.

If by image quality you mean slower frame rates, then yes.

but if by image quality he means who's the top noob in AT, then yes - it's you!

Actually, his statement could be accurate for reasons other than what you'd think at first glance:

[...]

A statement like this I find lame and useless regardless of its author's intent. It's disinformation; I could just as well say "If by performance you mean slower floating-point operations, then yes, the Q6600 is better than a K6," right? :D

I agree, it's a trolling statement. It's clear as day to see. Of course the NV fans will claim that dual-GPU cards don't count now, when they did in the past. Hypocrisy as usual.

ATi has the fastest overall card out, deal with it.
 

qbfx

Senior member
Dec 26, 2007
240
0
0
Originally posted by: Ackmed
Originally posted by: qbfx
Originally posted by: nRollo
Originally posted by: qbfx
Originally posted by: ExarKun333
No.

If by image quality you mean slower frame rates, then yes.

but if by image quality he means who's the top noob in AT, then yes - it's you!

Actually, his statement could be accurate for reasons other than what you'd think at first glance:

[...]

A statement like this I find lame and useless regardless of its author's intent. It's disinformation; I could just as well say "If by performance you mean slower floating-point operations, then yes, the Q6600 is better than a K6," right? :D

I agree, it's a trolling statement. It's clear as day to see. Of course the NV fans will claim that dual-GPU cards don't count now, when they did in the past. Hypocrisy as usual.

ATi has the fastest overall card out, deal with it.

I guess some people need it spelled out more clearly to see it.
 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
Well, on topic: I *feel* an unexplainable difference in image quality between current-generation AMD and NV.

In 3D

AMD: Smooth, Thin, Soft
NV: Strong, Crude, Saturated

Overall I like NV's image quality better, although there is a subtle difference in AMD's presentation that attracts me.

For HD playback, on the other hand, I'd say AMD has the upper hand. NV somehow manages a perfect score in HQV testing, which I can only explain as cheating or something of the sort. Although I haven't compared G92 variants vs. RV670 yet. The aforementioned impression (HD playback, especially H.264) is from the 2600XT vs. 8600GT.
 

batmang

Diamond Member
Jul 16, 2003
3,020
1
81
I wouldn't know as of today, but from my experience in the past, ATi always had better image quality. I'm now on an all-out AMD/ATi machine and I love it very much. I've always felt games were smoother on ATi cards, if that matters at all.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
This is basically down to the difference in the default color settings between ATi and nVIDIA. Turn Digital Vibrance up a notch and you will see that change.
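Digital Vibrance is essentially a driver-side saturation boost. A toy approximation of the idea (this is not NVIDIA's actual algorithm, just an illustration of why default color settings alone can make one card's output look more "colorful"):

```python
def boost_saturation(rgb, amount=0.3):
    """Push each channel away from the pixel's gray level (its channel
    mean) by `amount` -- a crude stand-in for a driver vibrance slider.
    amount=0 leaves the pixel unchanged."""
    r, g, b = rgb
    gray = (r + g + b) / 3
    clamp = lambda v: max(0, min(255, round(v)))
    return tuple(clamp(gray + (c - gray) * (1 + amount)) for c in rgb)

print(boost_saturation((100, 150, 200), 0.3))  # (85, 150, 215)
```

The same frame rendered identically by both cards would still "pop" more on whichever driver ships a higher default.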

Then again, it's all down to personal preference. But technically speaking, nVIDIA has the best AF algorithm, and some of its AA modes are considered the best, e.g. the xS hybrid AA modes.