Testing Nvidia vs. AMD Image Quality

Page 6 - AnandTech Forums

SirPauly

Diamond Member
Apr 28, 2009
I have trouble enjoying titles with no AA and AF, and thankfully, there are tools to add or enhance IQ.
 

MrK6

Diamond Member
Aug 9, 2004
I didn't see it when I breezed through the thread, but did anyone mention AMD's superior desktop image quality? I spend most of my time on my computer doing work and NVIDIA's sub-par 2D IQ makes me wary to invest in any of their cards. It's improved over the years but I still find it lacking compared to AMD's, especially at 2560x1600 while viewing text.
 

Absolution75

Senior member
Dec 3, 2007
> I didn't see it when I breezed through the thread, but did anyone mention AMD's superior desktop image quality? I spend most of my time on my computer doing work and NVIDIA's sub-par 2D IQ makes me wary to invest in any of their cards. It's improved over the years but I still find it lacking compared to AMD's, especially at 2560x1600 while viewing text.

This is odd. You aren't connecting via analog VGA, are you?

I've never heard of or seen 2D image differences between the two companies.
 

peonyu

Platinum Member
Mar 12, 2003
> I didn't see it when I breezed through the thread, but did anyone mention AMD's superior desktop image quality? I spend most of my time on my computer doing work and NVIDIA's sub-par 2D IQ makes me wary to invest in any of their cards. It's improved over the years but I still find it lacking compared to AMD's, especially at 2560x1600 while viewing text.

You sure it's not a placebo effect? No offense, but I've heard people say this before, and I have used both companies' products myself and never noticed a difference.
 

RussianSensation

Elite Member
Sep 5, 2003
> This is odd. You aren't connecting via analog VGA, are you?
>
> I've never heard of or seen 2D image differences between the two companies.

I think it depends on how picky the person is and what video cards are being compared. I went from a GeForce 6600 to an 8800 GTS and didn't notice any difference. Then I went from the 8800 GTS to an HD 4890 and noticed how much worse the 8800 GTS was at 1920x1080 in 2D. After upgrading to a GTX 470, I couldn't notice any difference coming from the 4890 at all in terms of 2D clarity. On my LCD screen, I did notice that the GF6, GF8, and GTX 470 all had slightly lower contrast compared to my Radeons (all comparisons were done over the DVI output on the video card).
 

Liet

Golden Member
Jun 9, 2001
> Nah. Nick Stam wrote the blog ABOUT the other 4 tech site findings. And I don't think this is a marketing trick, Arkadrel. I think Nvidia just might be fed up with it.

Oh snap. I used to work with Nick Stam ages ago at PC Mag. The dude is SMART, in caps.
 

Idontcare

Elite Member
Oct 10, 1999
> I didn't see it when I breezed through the thread, but did anyone mention AMD's superior desktop image quality? I spend most of my time on my computer doing work and NVIDIA's sub-par 2D IQ makes me wary to invest in any of their cards. It's improved over the years but I still find it lacking compared to AMD's, especially at 2560x1600 while viewing text.

I use NV and mostly 2D desktop stuff (or whatever it's called for your basic apps in Win7 these days; not sure how Aero figures in there)...with dual screens (1920x1200).

Would an ATI card really deliver a different perception to my eyes? I don't see how it's possible; text is text, isn't it? What's there to screw up or do wrong?
 

notty22

Diamond Member
Jan 1, 2010
I went from an 8600 GT to an ATI 4770 and noticed no difference.
But when I went to the GTX 460, I did. I believe the cards react differently to Microsoft ClearType, which I had set up to suit the 4770. The 460 I recalibrated to what looked best to me.
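For what it's worth, ClearType tuning really is per-setup: the renderer samples glyph coverage at triple horizontal resolution and maps each group of three samples onto the R, G, B subpixels of one LCD pixel, and how visible the resulting color fringes are depends on the rest of the display chain. A toy Python sketch of the basic idea (illustrative coverage values only, not the actual Windows implementation):

```python
# Toy illustration of ClearType-style subpixel rendering: a glyph
# rasterized at 3x horizontal resolution is collapsed so that each
# group of three coverage samples drives the R, G, B subpixels of
# one physical LCD pixel.

def subpixel_render(coverage_3x):
    """coverage_3x: ink-coverage floats in [0, 1], length divisible by 3.
    Returns one (r, g, b) tuple per physical pixel for black text on
    a white background (0 = fully inked subpixel, 255 = blank)."""
    pixels = []
    for i in range(0, len(coverage_3x), 3):
        triple = coverage_3x[i:i + 3]
        # Invert: full ink coverage -> dark subpixel on a white page.
        pixels.append(tuple(round(255 * (1.0 - c)) for c in triple))
    return pixels

# A vertical stroke edge that lands partway through a physical pixel:
edge = [1.0, 1.0, 1.0,   1.0, 0.5, 0.0,   0.0, 0.0, 0.0]
print(subpixel_render(edge))  # -> [(0, 0, 0), (0, 128, 255), (255, 255, 255)]
```

The middle pixel carries a color fringe rather than plain gray, which is exactly the thing the ClearType tuner lets you rebalance, and why retuning after a hardware change can make text look right again.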
 

Xarick

Golden Member
May 17, 2006
Went from an 8800 GTS to a 5850 and my desktop definitely looks much nicer. Games, not so much, though.
 

MrK6

Diamond Member
Aug 9, 2004
> I use NV and mostly 2D desktop stuff (or whatever it's called for your basic apps in Win7 these days; not sure how Aero figures in there)...with dual screens (1920x1200).
>
> Would an ATI card really deliver a different perception to my eyes? I don't see how it's possible; text is text, isn't it? What's there to screw up or do wrong?

You would think, but I definitely noticed a difference going from my GTX 295 to the 5xxx series. It could be my monitor, because it depends on the graphics card to scale, but that shouldn't be a problem if I'm running native resolution, I would think. I actually had the chance to hook up a 4xx series GPU the other day; it totally slipped my mind, though.

> I went from an 8600 GT to an ATI 4770 and noticed no difference.
> But when I went to the GTX 460, I did. I believe the cards react differently to Microsoft ClearType, which I had set up to suit the 4770. The 460 I recalibrated to what looked best to me.

Yeah, I remember reading that changing ClearType settings can sometimes help.

> Went from an 8800 GTS to a 5850 and my desktop definitely looks much nicer. Games, not so much, though.

Good to see more input. I put this out there because I've noticed it, I know I've read reports about it, and I think this is a great thread to discuss it in.
 

RussianSensation

Elite Member
Sep 5, 2003
Agreed. :thumbsup:

You’re focused more on artistic content, while I'm focused more on accurate rendering.

:thumbsup:

*puts his rendering hat on*

If I saw that, in the more intense rendering modes, the "newer" video card with almost 2x as many shaders was providing lower framerates in games, I would be seriously disappointed myself; what was the point of the upgrade, I would ask myself.

From that angle, I understand your frustration when you just spent $300 and actually got a 'downgrade' at the same rendering quality. It would be the same for me if I upgraded in order to gain playability with Enthusiast shaders in Crysis, but ended up with a card that was actually slower with Gamer shaders than my previous card. :D

BTW, I saw you provided your input in this thread.

I think an ABT article comparing NV vs. AMD AA modes would be very helpful. For example, see how AnandTech arranged the HD 4890, 5870, 285, and 480 in a way that makes it easy to compare:
http://www.anandtech.com/show/2977/...tx-470-6-months-late-was-it-worth-the-wait-/7#

If you look at the middle palm tree and the farthest-right foliage near the gun, the ATI 5870 seems to achieve better anti-aliasing on the leaves with 4x AA + SSAA than the 480 or 285 accomplish at 4x AA + TrSS. Have you found the same in practice?
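The underlying mechanics support that observation: ordinary MSAA runs the pixel shader, and thus the alpha test that cuts out leaf shapes, once per pixel, so a cutout edge inside a triangle stays all-or-nothing, while SSAA shades every sample and so gets fractional coverage on foliage. A minimal Python sketch with a made-up 1-D alpha ramp (not real renderer code):

```python
# Why SSAA smooths alpha-tested cutouts (foliage) while plain MSAA
# cannot: MSAA runs the alpha test once per pixel, SSAA runs it at
# every sample and averages the results.

ALPHA_CUTOFF = 0.5

def alpha(x):
    """Made-up 1-D alpha channel of a leaf texture: fully opaque for
    x <= 2.0, fading linearly to transparent over [2.0, 3.0]."""
    return max(0.0, min(1.0, 3.0 - x))

def msaa_coverage(px):
    # One alpha test at the pixel center -> coverage is all-or-nothing.
    return 1.0 if alpha(px + 0.5) >= ALPHA_CUTOFF else 0.0

def ssaa4_coverage(px):
    # Four alpha tests spread across the pixel, averaged -> fractional
    # coverage on the cutout edge.
    offsets = (0.125, 0.375, 0.625, 0.875)
    return sum(1.0 for o in offsets if alpha(px + o) >= ALPHA_CUTOFF) / 4.0

for px in range(4):
    print(px, msaa_coverage(px), ssaa4_coverage(px))
# MSAA jumps straight from full to zero coverage across the cutout,
# while 4x SSAA produces an intermediate value on the edge pixel.
```

Transparency AA (NVIDIA's TrSS) adds per-sample alpha testing back for flagged surfaces, which is why the comparison above is between two supersampling-style modes rather than plain MSAA.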
 

AnandThenMan

Diamond Member
Nov 11, 2004
> This is odd. You aren't connecting via analog VGA, are you?
>
> I've never heard of or seen 2D image differences between the two companies.

ATI/AMD cards most certainly deliver a cleaner, crisper image. I'm not 100% sure why. I've seen the results on an NEC professional monitor; the Nvidia card did not give the image that little extra clarity that the ATI card did.

Keep in mind the difference is not mind-blowing; it's subtle but noticeable, and once you see the difference you will not want to go back.
 

Keysplayr

Elite Member
Jan 16, 2003
> ATI/AMD cards most certainly deliver a cleaner, crisper image. I'm not 100% sure why. I've seen the results on an NEC professional monitor; the Nvidia card did not give the image that little extra clarity that the ATI card did.
>
> Keep in mind the difference is not mind-blowing; it's subtle but noticeable, and once you see the difference you will not want to go back.

That is odd. Digital 2D output is digital 2D output, as long as you're using DVI or HDMI and not analog or any DVI-to-RGB adapters.

AnandThenMan has a good point, though. "Even though the difference isn't mind-blowing, it's subtle but noticeable, and once you see it you will not want to go back" applies perfectly to this thread topic. If AMD cuts subtle corners lowering image quality, and you look closely and see this, you may not want to go back to them once you know they are doing this. It makes sense. Good comparison, ATM.
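One plausible mechanism for 2D differences even over a fully digital link: the driver programs the card's output lookup table (the gamma/contrast ramp), so two cards can send different digital values down the cable for identical framebuffer contents. A toy Python sketch (the 0.9 gamma is a made-up example, not either vendor's default):

```python
# Identical framebuffer pixels can leave two cards as different
# digital values if their drivers program different output lookup
# tables (gamma/contrast ramps) ahead of the DVI/HDMI transmitter.

def gamma_lut(gamma):
    """Build a 256-entry 8-bit output LUT for the given gamma."""
    return [round(255 * (i / 255) ** gamma) for i in range(256)]

lut_a = gamma_lut(1.0)  # straight pass-through
lut_b = gamma_lut(0.9)  # slightly lifted midtones (made-up value)

pixel = 128  # the same mid-gray in both cards' framebuffers
print(lut_a[pixel], lut_b[pixel])  # the values on the wire differ
```

Black and white map to themselves either way, so such a difference shows up as slightly different midtone contrast, consistent with the subtle effect described above, rather than an obviously "wrong" picture.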
 

Rifter

Lifer
Oct 9, 1999
I noticed no difference in 2D or 3D going from a 4890 to a GTX 460. Same monitor, same cables.
 
Feb 18, 2010
I noticed a difference in 2D image quality going from an nVidia 7600 GS to an ATI 4870. Nice, crisp image on the ATI card.
 

KCfromNC

Senior member
Mar 17, 2007
> If AMD cuts subtle corners lowering image quality, and you look closely and see this, you may not want to go back to them once you know they are doing this.

But nVidia is also cutting corners on 3D quality in their drivers by messing around with default settings, just like AMD. So if their 2D quality is also worse, it seems like it's two strikes to one.

Of course, we don't have any real evidence that the 2D quality is worse, but apparently this is the "ask leading questions so we can jump to conclusions" thread, so it fits right in.
 

Keysplayr

Elite Member
Jan 16, 2003
> But nVidia is also cutting corners on 3D quality in their drivers by messing around with default settings, just like AMD. So if their 2D quality is also worse, it seems like it's two strikes to one.
>
> Of course, we don't have any real evidence that the 2D quality is worse, but apparently this is the "ask leading questions so we can jump to conclusions" thread, so it fits right in.

LMAOOOO.... That was great!!

Of course now we'll need examples. Whenever you're ready.
 

ArchAngel777

Diamond Member
Dec 24, 2000
I did not notice any difference in 2D text from my 5770 to my GTX 280. For that matter, I did not notice any 3D difference either.
 

MrK6

Diamond Member
Aug 9, 2004
Examples of what? Didn't you read this thread? Check out post 84, for example.
Unsurprisingly overlooked. Both companies will optimize drivers to give the best performance advantage to their hardware. Claiming one is better than the other is ludicrous because it's subjective.
 

SirPauly

Diamond Member
Apr 28, 2009
That's what third-party websites are for: to offer their investigations. Tolerances are really subjective to me.
 

KCfromNC

Senior member
Mar 17, 2007
> Unsurprisingly overlooked. Both companies will optimize drivers to give the best performance advantage to their hardware. Claiming one is better than the other is ludicrous because it's subjective.

Yeah. I think it's being blown way out of proportion. Both companies do it. But how many are attempts to cheat versus legitimate attempts to optimize speed without hurting image quality (even if those optimizations sometimes aren't as image-quality neutral as hoped in all cases)? I'm willing to give both sides the benefit of the doubt. Certain posters (who coincidentally are also given free hardware by nVidia) seem quicker to accuse ATI of cheating.

From what I've seen so far a few older games slipped through the cracks during a large overhaul of how these sorts of quality<->performance tradeoffs are presented to the user on ATI's side. If you disagree with the default choices ATI has made, bump the quality up or down a notch in the driver control panel and you'll be happy.

On nVidia's side, they made a silent change in the driver which degraded the image quality of TrAA. Can you undo this "optimization" like you can with ATI cards? Maybe, if you download a third-party app and know where to look to hack it back in.

An honest thread on AMD vs. NV image quality would discuss both problems. Instead, it looks like NV is trying to get reviewers to disable IQ-neutral optimizations on ATI's cards to make the competition more "fair".

I own a GTX 275 and like it, even if it's getting a bit old. I just wish NV would focus on making their product better rather than whining about their competition doing exactly what NV themselves do.
 

SirPauly

Diamond Member
Apr 28, 2009
> Claiming one is better than the other is ludicrous because it's subjective.

Not really! There was a time when nVidia was aggressive with default settings while ATI had less aggressive optimizations, and some sites would bench with ATI at default and nVidia at high quality. The amazing part is that the pro-nVidia extremists were basically saying then what you're saying now.
 