AMD vs Nvidia image quality


ZGR

Platinum Member
Oct 26, 2012
2,052
656
136
This is a waste of time. Does anyone actually game on both AMD and Nvidia GPUs on a daily basis? I do. I don't notice a single visual difference if the settings are the same.

Want to make your game look more colorful, less washed out? Install GemFX/SweetFX.

What we should be discussing is how Nvidia and AMD sponsor game titles, which in turn get graphical settings tuned to favor their GPUs in benchmarks. That is basically the new image quality war.
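For the curious, here is a rough sketch of the kind of pass an injector like GemFX/SweetFX applies. The function and numbers below are only an illustration of the idea, not the actual shader code those tools ship:

```python
# Illustrative only: a simple "vibrance" pass of the sort applied by
# post-process injectors (SweetFX/GemFX). It boosts saturation more on
# muted pixels than on already-saturated ones, which is why games end
# up looking "less washed out" without skin tones turning neon.
def vibrance(rgb, strength=0.15):
    r, g, b = rgb                      # channel values in 0.0-1.0
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    saturation = max(rgb) - min(rgb)   # crude saturation estimate
    # Muted pixels (low saturation) get the biggest push toward color.
    amount = strength * (1.0 - saturation)
    return tuple(luma + (c - luma) * (1.0 + amount) for c in rgb)

print(vibrance((0.55, 0.50, 0.45)))    # a dull tone gets a mild boost
```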
 
  • Like
Reactions: Grubbernaught

UglyDuckling

Senior member
May 6, 2015
390
35
61
I was not casting aspersions at you personally. 'Random forum poster' and 'moronic kid' were referring to your sources. Somehow you thought I was addressing you, hence your 'dickhead' response.. OK, whatever. So rather than providing 'proper sourcing' for outstanding issues re Nvidia image quality, you found these halfwit kids who, as you can see by the quality of their posts, were rather clueless about their issues and their potential causes.

We KNOW that Nvidia had an issue with HDMI not putting out full RGB a few years ago, one that was long since resolved with driver updates. There were also workarounds for it: switching to DVI-DL or DP. Are you able to provide proper sources that do not refer to THAT (since it is a non-issue today)? Countless folks (here and elsewhere) have switched between both GPU vendors over the years who do not agree with your assessment that Nvidia has in any way lesser IQ than AMD, or vice versa. All you have going are myths arising from NV's 2003 cheat incident and a long-since-resolved HDMI output issue (a non-issue to most who knew the workarounds). Thus far you have failed to provide proper sources describing any issue of the sort that is relevant today.

p.s. I think we know the No. 1 person who would jump on any IQ issues if they were found on an Nvidia card, and who would relentlessly pursue them if there were any merit to it. Interestingly, he does comment on this briefly (Nvidia having 'richer colors' than Vega, though he puts that down to Shadowplay). So we know his eye is determinedly on the lookout for these things, and he has cards from both vendors that he tests, compares, and works with almost daily.
This is a waste of time. Does anyone actually game on both AMD and Nvidia GPUs on a daily basis? I do. I don't notice a single visual difference if the settings are the same.

Want to make your game look more colorful, less washed out? Install GemFX/SweetFX.

What we should be discussing is how Nvidia and AMD sponsor game titles, which in turn get graphical settings tuned to favor their GPUs in benchmarks. That is basically the new image quality war.

Which is LOD bias, AA, heck, even smaller things; that's what was brought up.

Do you suck <redacted> for a living?

It's funny actually, I was a member of OCN.

Back when I had a GTX 480 and an overclocked 2500K, I made a thread about the CPU bottlenecking the GPU, which it clearly was, yet everyone dismissed the problem.

BFBC2 had TERRIBLE LOD pop-in on Radeon 5000 series GPUs vs GTX 400 series. Let me guess, no one ever knew about it? Oh right, yeah..

http://www.overclock.net/t/1387067/i5-2500k-is-not-enough-for-battlefield-3s-multiplayer-proof/80


But guess what, we must source "legit" places... hahaha, suck a bag of <redacted>.

Profanity is not allowed in the technical forums.
-- Markfw
AnandTech Moderator

This is an additional warning to all participants in this forum: reprehensible behavior such as demonstrated above by UglyDuckling is not going to be tolerated here. If you disagree with someone's posting, either do so respectfully or report it if it violates rules.
-- stahlhart
 

Excessi0n

Member
Jul 25, 2014
140
36
101
Alright, after actually taking the time to check the driver it looks like I was thinking of the texture filtering setting. Nvidia has it set to "quality" by default, not "high quality." Dunno how big of an impact it makes and I'm feeling a bit too lazy to take comparison screenshots...
 

dogen1

Senior member
Oct 14, 2014
739
40
91
Alright, after actually taking the time to check the driver it looks like I was thinking of the texture filtering setting. Nvidia has it set to "quality" by default, not "high quality." Dunno how big of an impact it makes and I'm feeling a bit too lazy to take comparison screenshots...

IIRC, the difference is minimal if it's even visible at all. As far as I know, it just doesn't use trilinear filtering where it's not needed.
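Roughly speaking, that "Quality" optimization only pays for the mip-level blend near mip transitions, where banding would otherwise show. The sketch below is only an illustration of the idea; the band width is a made-up number, not the driver's actual heuristic:

```python
# Illustrative sketch of "trilinear optimization" as described above:
# full trilinear always blends the two nearest mip levels, while the
# optimized ("Quality") path falls back to cheaper bilinear filtering
# except near mip transitions. The 0.15 band width is made up.
def pick_filter(lod, optimized=True, band=0.15):
    frac = lod - int(lod)              # distance into the current mip
    near_transition = frac < band or frac > 1.0 - band
    if not optimized or near_transition:
        return "trilinear"             # blend mip N and mip N+1
    return "bilinear"                  # a single mip level is enough

for lod in (2.05, 2.5, 2.93):
    print(lod, pick_filter(lod))       # only 2.05 and 2.93 blend mips
```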
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Proof or it didn't happen @crisium

Otherwise keep your FUD in AMD reddit

Funny guy. But odd that this is news to you.

Anyway, this was known among enthusiasts, who could change it. Nvidia drivers defaulted to limited-range RGB over HDMI on many monitors. Non-enthusiast users who don't go into the settings risked living with limited HDMI range for years. So for those users, yes, Nvidia did have worse image quality in some regard.

It's already been discussed in this thread. But some more sources for you.

It could happen over DisplayPort sometimes too:
https://pcmonitors.info/reviews/dell-u2414h/

"When using either DisplayPort or HDMI the GPU sent out the wrong colour signal (‘Limited Range RGB 16-235’ instead of ‘Full Range RGB 0-255’). This reduced gamma, skewed white point, hugely impacted contrast and simply gave everything a washed out look. We are quite used to seeing this with Nvidia GPUs connected via HDMI as that is their default behaviour – treat the connected device as an HDTV. But we aren’t used to seeing this over DisplayPort"

And note their solution: they had to create a custom resolution, because this was before Nvidia added a simple drop-down in driver 347.09 in December 2014. Even after that, defaulting to limited range remained common, so casual users could still be stuck with it.

More info here, including issues unique to both Nvidia and AMD and the Nvidia 347.09 patch:
https://pcmonitors.info/articles/correcting-hdmi-colour-on-nvidia-and-amd-gpus/

More about how it occurred for years, and how there did not use to be a simple drop-down (this article was published just days before Nvidia actually released a drop-down option):
www.pcgamer.com/nvidia-cards-dont-full-rgb-color-through-hdmiheres-a-fix/
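For anyone wondering why limited range looks washed out, the arithmetic is simple. The sketch below uses the standard 16-235 mapping and is only an illustration of the effect, not anything vendor-specific:

```python
# Why limited-range output on a full-range monitor looks washed out:
# the driver remaps 0-255 into 16-235, so black is sent as 16 and
# white as 235. A monitor expecting full range then shows grey blacks
# and dim whites, crushing contrast exactly as the review describes.
def full_to_limited(value):
    return round(16 + value * 219 / 255)   # standard 0-255 -> 16-235 mapping

print(full_to_limited(0))     # 16  -> displayed as dark grey, not black
print(full_to_limited(255))   # 235 -> displayed as light grey, not white
print(full_to_limited(128))   # 126 -> midtones shift too
```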
 

godihatework

Member
Apr 4, 2005
96
17
71
Funny guy. But odd that this is news to you.

Anyway, this was known among enthusiasts, who could change it. Nvidia drivers defaulted to limited-range RGB over HDMI on many monitors. Non-enthusiast users who don't go into the settings risked living with limited HDMI range for years. So for those users, yes, Nvidia did have worse image quality in some regard.

It's already been discussed in this thread. But some more sources for you.

It could happen over DisplayPort sometimes too:
https://pcmonitors.info/reviews/dell-u2414h/

"When using either DisplayPort or HDMI the GPU sent out the wrong colour signal (‘Limited Range RGB 16-235’ instead of ‘Full Range RGB 0-255’). This reduced gamma, skewed white point, hugely impacted contrast and simply gave everything a washed out look. We are quite used to seeing this with Nvidia GPUs connected via HDMI as that is their default behaviour – treat the connected device as an HDTV. But we aren’t used to seeing this over DisplayPort"

And note their solution: they had to create a custom resolution, because this was before Nvidia added a simple drop-down in driver 347.09 in December 2014. Even after that, defaulting to limited range remained common, so casual users could still be stuck with it.

More info here, including issues unique to both Nvidia and AMD and the Nvidia 347.09 patch:
https://pcmonitors.info/articles/correcting-hdmi-colour-on-nvidia-and-amd-gpus/

More about how it occurred for years, and how there did not use to be a simple drop-down (this article was published just days before Nvidia actually released a drop-down option):
www.pcgamer.com/nvidia-cards-dont-full-rgb-color-through-hdmiheres-a-fix/
Fair enough. I was not aware of the reduced color output because the graphics card assumed that 1080p meant a television rather than a monitor.

Interesting that no comments have been made about AMD falling down when a 1920x1080 resolution is auto-detected as well (edit - more than this, it looks like it was HDMI in general?). One could argue that pixel skewing and graininess are far more severe failings than truncated color output, but I digress.

It seems that this is a specific use case that negatively impacted both IHVs in different manners.
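To illustrate the AMD side of that (the default scaling/underscan over HDMI), a quick back-of-the-envelope; the 10% figure below is only an example, the actual driver slider position varied:

```python
# Illustrative arithmetic for why a default underscan looks grainy:
# the 1920x1080 desktop is shrunk to fit inside the panel with a black
# border, so source pixels no longer map 1:1 to display pixels and
# everything (especially text) gets resampled. The 10% underscan used
# here is an assumption for illustration, not AMD's documented default.
def underscanned_size(width=1920, height=1080, underscan=0.10):
    return round(width * (1 - underscan)), round(height * (1 - underscan))

w, h = underscanned_size()
print(w, h)        # 1728 x 972: no longer pixel-perfect on a 1080p panel
print(1920 / w)    # ~1.11 source pixels squeezed into each output pixel
```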
 
Last edited:

amenx

Diamond Member
Dec 17, 2004
3,902
2,119
136
The HDMI thing was never an issue for those who had the DVI option, and fortunately most monitors around that time had one. It only affected those who did not know what was going on or how to fix it, and/or did not know how to properly search for a solution. Either way, there was always a solution. But for anyone to use that to claim a general 'Nvidia has inferior IQ to AMD' is just ludicrous.
 
Last edited:
  • Like
Reactions: ZGR

LordVaderMTFBWU

Junior Member
Oct 25, 2017
5
3
51
I have used AMD and Nvidia at different points in my life. Right now, I own a GTX 1060 6GB as well as an R9 270X. Apart from PhysX, I don't believe there is any difference between the two companies' cards. Let me explain the PhysX part:

My friend and I play Borderlands 2 sitting side by side in co-op. There is absolutely no difference in IQ (in this game or any other) EXCEPT for PhysX-enabled cloth and particles (flags move on the 1060 vs. staying static on the 270X; at some points in the game, the same areas have no cloth at all on the AMD card while a big piece of cloth is moving on the Nvidia card). I can see a lot of snow particles on the ground after shooting enemies on my Nvidia rig and none whatsoever on the AMD rig. If I were to take screenshots at those exact locations, at the same exact angles, on both systems, of course there would be a difference. But that difference comes down to some things being there or not. On the highest settings, the floor looks exactly the same, the enemies look exactly the same, and so on.

You will see similar behavior in other PhysX-enabled games like the Batman Arkham series. There is a lot of flying paper in Batman AA, while there is a hell of a lot of smoke coming out of the Batmobile in Batman AK. All of this is obviously absent on the AMD card. Otherwise, the games look the same at the same settings on both systems.
 
  • Like
Reactions: amenx and Muhammed

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Prior to me swapping over to NV, the wife (then girlfriend) was always an NV girl. So our household normally had two setups: hers AMD_CPU+NV_GPU, and mine always Intel_CPU+AMD_GPU. It wasn't until Bulldozer that she swapped over to Intel, but up until 2 years ago I always had a Radeon in my rig.

We always compared performance (because we're sort of competitive, to a degree). The only time I would say AMD let me down was a weird shadow bug in WoW (so weird I'm pretty sure not many noticed it, but I did, and it wasn't present on her NV setup). Beyond that, in the multiple games we played co-op there were no glaring "this isn't being rendered" or "why does your game look awful" complaints. The only complaints from me were for TWIMTBP titles, where her card would be faster, but that was expected. EDIT: Oh, NV also borking hybrid PhysX really annoyed me while playing Borderlands 2.

TL;DR:
For years I've used NV/AMD cards, and again, unless I'm blind, I don't recall any massive differences in image quality.