Question Switched to Nvidia, blurry games

rh71

No Lifer
Aug 28, 2001
52,853
1,048
126
My kid just got an RTX 3060 for his birthday, upgrading from a Radeon RX 580. It's a decent improvement in terms of FPS, but he says there's a noticeable blurriness (lack of sharpness) compared to before, even when not moving, on game menu screens, etc. in games like Warzone and Fortnite. He's already tried tweaking the 3D settings. He's asking to go back to AMD, but I'm not sure that's the issue.
Is there a subtle difference that maybe he's picking up on since he's in front of these games all day? CPU is an i5-9400F with an LG monitor (https://www.lg.com/us/monitors/lg-27gk65s-b-gaming-monitor), and he's already tried turning Nvidia G-Sync on and off.
 

biostud

Lifer
Feb 27, 2003
18,193
4,674
136
He doesn't seem to be the only one.

 
  • Like
Reactions: Kaluan and Leeea

amenx

Diamond Member
Dec 17, 2004
3,848
2,013
136
My guess is that some displays default to RGB limited rather than full. This was a known issue with HDMI on Nvidia drivers, supposedly fixed some time back. If so, it's easily fixed by going to the driver's resolution tab and selecting RGB Full instead of Limited.
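For anyone wondering what limited vs. full range actually costs you, here's a toy sketch (my own illustration, not what the driver literally does): limited range squeezes the 0-255 signal into 16-235, so if the display then expects full range, black renders as dark gray, white as light gray, and everything in between loses precision.

```python
# Toy illustration of "RGB limited" vs "RGB full" (not the driver's code):
# full range uses 0-255 per channel, limited squeezes that into 16-235.
# If the display expects full range but receives limited, black shows as
# dark gray and white as light gray, so the picture looks washed out.

def full_to_limited(v: int) -> int:
    """Map a full-range value (0-255) into limited range (16-235)."""
    return round(16 + v * (235 - 16) / 255)

print(full_to_limited(0), full_to_limited(255))  # 16 235

# 256 input levels also collapse into only 220 distinct output levels:
levels = {full_to_limited(v) for v in range(256)}
print(len(levels))  # 220
```

That lost precision is why the fix is simply forcing Full in the driver rather than anything per-game.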
 

psolord

Golden Member
Sep 16, 2009
1,875
1,184
136
My guess is that some displays default to RGB limited rather than full. This was a known issue with HDMI on Nvidia drivers, supposedly fixed some time back. If so, it's easily fixed by going to the driver's resolution tab and selecting RGB Full instead of Limited.

Yes this.

My stupid no-name HDTV causes this all the time. Even my Samsung one sometimes goes back to limited with new driver installations.

Meanwhile, I never had this problem with my now 10-year-old 7950.

As for the game image quality itself, no, I have not seen anything different. The newest Radeon I have tested is a friend's 5700, though.
 
  • Like
Reactions: Kaluan and Leeea

DiogoDX

Senior member
Oct 11, 2012
746
277
136
It is an old problem that Nvidia's default image over HDMI looks very bad compared with AMD's. You have to configure it to get the same image.

Set color depth to Full RGB
Increase the sharpness and vibrance of the colors to your preference

Every time I switch back to Nvidia and then again to Radeon I notice those differences (7970 > 980Ti > 1080Ti > 6800XT). And now the default difference is even bigger because of the new sharpening filter that is on by default in the AMD driver (I dial it back to 50%).
 

ZGR

Platinum Member
Oct 26, 2012
2,052
656
136
The Nvidia game filter can really boost sharpness in certain games as well, so try that out if you haven't. You must have Nvidia's game overlay enabled, though. It is comparable to using ReShade, but with a slightly cleaner UI.

I remapped the game filter overlay from Alt+F3 to Alt+3 so I don't have to worry about accidentally closing the application.
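For what it's worth, most of these sharpening filters boil down to some flavor of unsharp masking: blur a copy of the image, then add back the difference between the original and the blur. A toy 1-D sketch, just to show the idea (this is not Nvidia's or AMD's actual shader):

```python
# Unsharp masking in 1-D: sharpen by exaggerating local differences.
# Illustration of the general technique only, not any vendor's filter.

def box_blur(row, radius=1):
    """Simple 1-D box blur with edge clamping."""
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        out.append(sum(row[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(row, amount=1.0):
    """Add back (original - blurred), scaled by `amount`."""
    blurred = box_blur(row)
    return [p + amount * (p - b) for p, b in zip(row, blurred)]

# A hard edge (0 -> 255) gets overshoot on both sides of the transition,
# which is exactly what the eye reads as "sharper":
print(unsharp_mask([0, 0, 0, 255, 255, 255]))
```

Cranking `amount` too high is also why over-sharpened images get those halo artifacts around edges.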
 
  • Like
Reactions: Leeea

Leeea

Diamond Member
Apr 3, 2020
3,599
5,340
106
@rh71
It's well known that AMD cards have better image quality than Nvidia.
Is this actually true?

As an AMD fan I would love it to be true, but it seems unlikely.

Is this something that can be solved by just tweaking the settings?


Because if this is actually true, you would think the reviewers would mention it. That sort of thing is a massive bonus for AMD.
 

Meghan54

Lifer
Oct 18, 2009
11,504
5,027
136
Is this actually true?

As an AMD fan I would love it to be true, but it seems unlikely.

Is this something that can be solved by just tweaking the settings?


Because if this is actually true, you would think the reviewers would mention it. That sort of thing is a massive bonus for AMD.


That's been bandied about since back when ATi was the brand. Always "sharper text" and "clearer/sharper images" vs. Nvidia... but this has always been countered by Nv fanbois with the "but the drivers" refrain. Been going on a looooooong time.
 

amenx

Diamond Member
Dec 17, 2004
3,848
2,013
136
That's been bandied about since back when ATi was the brand. Always "sharper text" and "clearer/sharper images" vs. Nvidia... but this has always been countered by Nv fanbois with the "but the drivers" refrain. Been going on a looooooong time.
It actually goes back to an incident in the ATI days some 15+ years ago, when Nvidia was caught cheating in 3DMark by lowering its default IQ settings to get a higher score. Plus the HDMI bug where some displays would default to RGB limited instead of full (correctable in driver settings), supposedly fixed with newer drivers. Whatever anomalies may exist today are glitches of some sort, not some general rule. Otherwise it would be a juicy article for the tech media to sink their teeth into.
 

Tup3x

Senior member
Dec 31, 2016
944
925
136
In terms of 3D and 2D image quality there's no difference. It's either wrong display settings or other user error. There's one thing NVIDIA has not properly fixed: banding when you apply a calibration profile to the display (the driver doesn't dither well). The same applies if you mess with gamma and the like in the driver. There are workarounds and even registry tweaks, but out of the box it's not optimal.
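To see why missing dithering turns into banding: once a calibration curve forces values through a coarser quantization step, a smooth ramp collapses into discrete steps, and adding a little noise before quantizing hides those steps as fine grain instead. A toy sketch of the idea (my own illustration, not NVIDIA's pipeline):

```python
# Banding vs. dithering, in miniature: quantizing a smooth gradient
# produces hard steps; adding small noise before quantizing (dithering)
# breaks the steps up into grain. Illustration only.
import random

def quantize(v, bits=6):
    """Quantize an 8-bit-scale value down to `bits` bits of precision."""
    step = 256 // (1 << bits)
    return (int(v) // step) * step

gradient = [i / 4 for i in range(1024)]  # smooth 0..255 ramp

banded = [quantize(v) for v in gradient]
random.seed(0)
dithered = [quantize(v + random.uniform(-2, 2)) for v in gradient]

# The undithered ramp collapses into 64 flat plateaus (visible bands);
# the dithered one flickers between adjacent levels near each boundary.
print(len(set(banded)))  # 64
```

The grain averages out to the right shade at normal viewing distance, which is why proper driver-level dithering makes calibrated gradients look smooth again.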
 
  • Like
Reactions: Leeea

GodisanAtheist

Diamond Member
Nov 16, 2006
6,715
7,004
136
It actually goes back to an incident in the ATI days some 15+ years ago, when Nvidia was caught cheating in 3DMark by lowering its default IQ settings to get a higher score. Plus the HDMI bug where some displays would default to RGB limited instead of full (correctable in driver settings), supposedly fixed with newer drivers. Whatever anomalies may exist today are glitches of some sort, not some general rule. Otherwise it would be a juicy article for the tech media to sink their teeth into.

- Not just 3DMark but everything. The infamous FX 5000 series of cards. NV reduced color output from 32-bit to 16-bit to make up for their shader deficiencies. In motion things looked fine, but as soon as you stopped moving the banding was real.

Doom 3 was especially bad thanks to the high contrast produced by its shadows; it's incredible how bad the game looked running on FX hardware.

ATI wasn't the most honest broker either. I believe they always defaulted to 24-bit color output regardless of whether the game called for 16 or 32, but that higher bit depth was basically indistinguishable from 32-bit on the standard monitors of the time.

DAAMIT also managed to put out the first "perfect" anisotropic filtering solution, on their HD 5000 cards, which didn't suffer texture banding and shimmering as textures moved further into the distance.
 

CP5670

Diamond Member
Jun 24, 2004
5,508
586
126
DAAMIT also managed to put out the first "perfect" anisotropic filtering solution, on their HD 5000 cards, which didn't suffer texture banding and shimmering as textures moved further into the distance.

This was the worst thing about this era. You could see the shimmering everywhere with AF turned on. Both companies were guilty of it, although ATI was generally better for a while. Even today, I turn the Nvidia driver image quality setting to "high quality" and turn off trilinear/anisotropic optimization first thing when I install a new driver.

I think ATI did actually have better 2D quality at one point, back in the VGA era with analog CRTs, but it should make no difference on DVI/HDMI/DP.
 
Last edited:
  • Like
Reactions: Leeea

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,839
3,174
126
Given what 3060s sell for versus the 6600 XT, I would have returned it and gotten him the AMD card. It was his birthday, and I doubt he uses ray tracing in those competitive games.

Actually, some of the 3060s are so overpriced that the money would probably even cover a cheaper AIB 6700 XT like a Gigabyte Eagle or an XFX Speedster.

This for example:

And a 6700 XT vs a 3060 (non-Ti) is not even a competition... it's SPARTA, with AMD being the Spartans.
 

rh71

No Lifer
Aug 28, 2001
52,853
1,048
126
Actually, some of the 3060s are so overpriced that the money would probably even cover a cheaper AIB 6700 XT like a Gigabyte Eagle or an XFX Speedster.

This for example:

And a 6700 XT vs a 3060 (non-Ti) is not even a competition... it's SPARTA, with AMD being the Spartans.
Very tempting. We got the 3060 for $420 on Amazon, so this may be a worthwhile exchange if this take-it-with-a-grain-of-salt comparison is accurate: https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3060-vs-AMD-RX-6700-XT/4105vs4109.

What's with the market-share discrepancy? Are people afraid of AMD for support reasons, the brand name, or what?

Whilst the drought in the GPU market continues, street prices for AMD cards are around 50% lower than comparable (based on headline average fps figures) Nvidia cards. Many experienced users simply have no interest in buying AMD cards, regardless of price. AMD’s Neanderthal marketing tactics seem to have come back to haunt them. Their brazen domination of social media platforms including youtube and reddit resulted in millions of users purchasing sub standard products. Be wary of sponsored reviews with cherry picked games that showcase the wins and ignore the losses. Experienced gamers know all too well that headline average fps are worthless when they are accompanied with stutters, random crashes, excessive noise and a limited feature set. [Jan '22 GPUPro]
 
Last edited:
  • Like
Reactions: Leeea

GodisanAtheist

Diamond Member
Nov 16, 2006
6,715
7,004
136
Very tempting. We got the 3060 for $420 on Amazon, so this may be a worthwhile exchange if this take-it-with-a-grain-of-salt comparison is accurate: https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3060-vs-AMD-RX-6700-XT/4105vs4109.

What's with the market-share discrepancy? Are people afraid of AMD for support reasons, the brand name, or what?

- Oh my god, UserBenchmark's vendetta against AMD strikes again.

AMD isn't the brand name in GPUs. Their cards are often excellent when all you want to do is play video games, but they lack the cool secondary features that Nvidia tends to promote.

AMD drivers by all accounts have been very solid in recent memory and on this latest generation of cards.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,271
19,907
146

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,839
3,174
126
Very tempting. We got the 3060 for $420 on Amazon, so this may be a worthwhile exchange if this take-it-with-a-grain-of-salt comparison is accurate: https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3060-vs-AMD-RX-6700-XT/4105vs4109.

What's with the market-share discrepancy? Are people afraid of AMD for support reasons, the brand name, or what?

You would seriously need to gimp the 6700 XT for it to be a fair competition.
The only reason not to pick a 6700 XT over a regular 3060 is if you're running a 500W PSU and don't have the 8+6-pin PCIe connectors native to your PSU.

Here is an actual gaming comparison:

The only game the 6700 XT loses is GTA V, because Rockstar heavily optimized that one for Nvidia hardware.
Every other game shows a greater than 20% advantage.

Again... it's not madness, it's SPARTA.

Nvidia would need to put up a 3060 Ti against it, and it would still lose in the bulk of the games. Also, at that stupid price you're better off adding a bit more and getting a 3070, but then that brings another monster from AMD, the RX 6800 XT, into the arena.

RX cards are excellent.
I have a few, and I like them a lot.
Unless ray tracing is important, AMD cards are excellent buys, especially with how floored the pricing got on them recently.
FreeSync monitors are a lot cheaper than real G-Sync monitors.
G-Sync Compatible (adaptive sync) is meh; it works, but it's not as good as G-Sync Ultimate, which again is $$$.
Most games have Vulkan support, and Vulkan tends to run better on its native platform, AMD.

Now if I were dual-purposing the card for mining duty as well, I would probably stick with Nvidia, but for pure gaming, and at your price range, I would definitely grab a 6700 XT and not look back.
Especially if it's from Amazon, with its generous return policy.
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
4,722
4,625
136
@rh71

Guru3D benchmarks start here https://www.guru3d.com/articles_pages/amd_radeon_rx_6700_xt_(reference)_review,10.html


Userbenchmark might be good for a few laughs if it were a parody site. But since they are running a disinformation campaign against AMD, they are good for even more laughs, just for different reasons. As a consequence, they are worse than useless.
Very good diplomatic writing.
 
  • Like
Reactions: Glo. and Leeea