Text quality @ 4K60, differences in quality between NV and AMD?

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
I really don't get this. I have a pair of 40" 4K UHD TVs I picked up on BF, Avera brand. They have four HDMI ports on them, and I believe that three, or possibly all four, are all HDMI2.0.

Anyways, all of my Polaris-based AMD graphics cards have no problems driving them at full resolution and (seemingly) full color depth, at 4K60.

Likewise, my NV cards will drive them fine at 4K30. But when I switch the monitor to 60Hz refresh with my NV cards (currently using a GT 1030), all of the vertical lines in text get NTSC-like red and blue tinges.

This may be an issue with ClearType, and NV not bothering to put this TV into their database of how these displays are supposed to be set up.

Or maybe NV cards really don't have HDMI 2.0 ports like AMD Polaris cards do, and they just use overdriven HDMI 1.4 hardware?

The reason that I say that is, I can use a GT 730 card with HDMI 1.4 ports to drive my 4K UHD TVs at 4K60 as well, due to some sort of driver hack that reduces the color depth of the 4K signal such that it can be driven over an HDMI 1.4 port.

I can understand the reduced color depth when using an HDMI 1.4 port, but when using an NV card with a supposed HDMI 2.0 port, I still experience that same loss of color depth at 4K60.
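The bandwidth arithmetic behind that "driver hack" can be sketched roughly. This is a back-of-the-envelope illustration of my own using the standard CTA-861 timing raster, not anything from NV's or AMD's documentation:

```python
# Back-of-the-envelope HDMI link-rate check. HDMI transmits the full timing
# raster (active pixels plus blanking) over 3 TMDS channels, sending 10 line
# bits per 8 data bits. 4400x2250 is the standard CTA-861 total raster for a
# 3840x2160 active picture.

def tmds_gbps(h_total, v_total, hz, bpc, chroma_factor):
    """Approximate TMDS bit rate in Gbit/s for a video mode.

    chroma_factor: 1.0 for RGB/4:4:4, 0.5 for 4:2:0 (halves the pixel clock).
    """
    pixel_clock = h_total * v_total * hz * chroma_factor
    return pixel_clock * 3 * bpc * 10 / 8 / 1e9

print(f"4K60 8bpc 4:4:4: {tmds_gbps(4400, 2250, 60, 8, 1.0):.2f} Gbit/s")
print(f"4K60 8bpc 4:2:0: {tmds_gbps(4400, 2250, 60, 8, 0.5):.2f} Gbit/s")
# 4:4:4 needs ~17.82 Gbit/s (HDMI 2.0 territory); 4:2:0 needs ~8.91 Gbit/s,
# which squeaks under HDMI 1.4's 10.2 Gbit/s ceiling.
```

That halved pixel clock is how a 4:2:0 signal fits through an HDMI 1.4 port, at the cost of chroma resolution.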

So, something strange is going on with NV cards. It may be a compatibility issue specific to my displays, in that AMD specifically tested and supports them, and NV didn't.

(I had something similar happen to me, with my Westinghouse 24" 1080P HDTVs that I was using before my 4K UHD TVs, there was an HDMI audio handshake issue with those displays, and it took a few months before both AMD and NV supported them properly. Before that, if the monitor went to sleep, if I woke it up, I wouldn't have audio.)
 

Tweak155

Lifer
Sep 23, 2003
11,449
264
126
Are you using the same HDMI cable & HDMI port on the TV in both scenarios? It would be odd if a GT 1030 needed to downgrade the quality to push 4K60.
 

amenx

Diamond Member
Dec 17, 2004
4,405
2,725
136
I'm running a 40" UHD Samsung TV and have no issues whatsoever with 4K 60Hz and full RGB (4:4:4) via HDMI 2.0 and a GTX 1070. And text is ultra-sharp, as it should be.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
Are you using the same HDMI cable & HDMI port on the TV in both scenarios? It would be odd if a GT 1030 needed to downgrade the quality to push 4K60.
Yes, exact same. Using newest drivers, too.

Edit: If I go to NV Control Panel, Change Resolution, scroll down, it is by default set to "Use Default Color Settings".

If I change that to "Use NVidia Color Settings", I get four menus.

"Desktop Color Depth", only one option, "Highest (32-bit)".

Output Color Depth, only one option, "8 bpc".

Output Color Format, only one option, "YCbCr420".

Output dynamic range, only one option, "limited".

Does this indicate that something is amiss?

I know that the 4K UHD TV only does 4:2:2 or 4:2:0 when driven at 4K60, and 4:4:4 when driven at 4K30, but my AMD video cards seem to be able to finesse the color depth in ways that my NV cards cannot.

The result is that I see NTSC-style red/blue fringing on all of my black-on-white text, such as on this forum. That does not happen on my AMD cards, even when running at 4K60.
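Why subsampling mangles text specifically can be shown with a toy example. This is my own illustration, not the TV's actual pipeline: ClearType paints one-pixel-wide red/blue subpixel tints on glyph edges, and 4:2:0 keeps only one chroma sample per 2x2 block, so those tints get averaged down and smeared:

```python
import numpy as np

# Toy model of 4:2:0 chroma subsampling: average each 2x2 block of the
# chroma (Cb/Cr) planes to a single sample, then replicate it back up.

def subsample_chroma_420(cbcr):
    """Average each 2x2 block of the chroma planes, then upsample by copy."""
    h, w, c = cbcr.shape
    blocks = cbcr.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))
    return blocks.repeat(2, axis=0).repeat(2, axis=1)

# A 4x4 chroma field: neutral everywhere except one strong Cr ("red") spike,
# standing in for a single-pixel ClearType edge tint.
cbcr = np.zeros((4, 4, 2))
cbcr[1, 1, 1] = 112.0

out = subsample_chroma_420(cbcr)
print(out[:, :, 1])
# The lone 112 comes back as 28 smeared over a whole 2x2 block: the tint is
# weaker, but now bleeds into pixels that were neutral - visible fringing.
```

The same averaging that is harmless on photographic content is exactly what wrecks subpixel-rendered text.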

Edit: If I select "NVidia Color Settings", and click apply, it's basically the same.
 
Last edited:

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
From an Amazon review:

Here is a summary of settings changes that you will want to make if using this as a 4K computer monitor:
1) Menu, Setup > Other Settings > HDMI 4K - changing it from Standard to Enhancement enables 4:4:4 chroma for 4K at 60hz.
2) Sharpness (service menu) - needs to be turned down to 30 or lower. Setting it to 0 will completely remove ringing.
3) Color (service menu) - needs to be turned down to 45 or lower.

Are both of the TVs set up the same? If your text isn't sharp, your chroma most likely isn't 4:4:4.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
1) Menu, Setup > Other Settings > HDMI 4K - changing it from Standard to Enhancement enables 4:4:4 chroma for 4K at 60hz.
Interesting. I didn't know that this TV was capable of 4:4:4 at 4K60. Let me try that.

Edit: Well, that was pretty bogus. Got a black screen. Couldn't even shut off the TV.

So I shut down the PC, which caused the TV to shut down, then I started the TV, reset it to factory defaults, went in and set "HDMI Mode" to "Enhancement", and then booted the PC. I saw POST, but when Windows came up, I saw the sign-in screen with green lines through it, then blam, black screen again.

Seems as though something's not right here. Either the cable I'm using can't handle HDMI 2.0 bandwidth (possible, it's a generic HDMI "High Speed" cable), or there's still something off about NV cards.

Remember, my AMD cards can do sharp text at 4K60, evidently within HDMI 1.4 bandwidth? Or maybe AMD is really using HDMI 2.0, and the NV card, which overdrives its HDMI 1.4-class hardware, was effectively running an overdriven 2.0 port once it switched? Just a theory.

Edit: I tried swapping machines, and put my Ryzen rig with RX 570 as primary GPU, connected to same HDMI cable / connection / UHD TV.

The AMD rig showed lots of static, flashing up on portions of the screen.

So now, I'm going to try a new HDMI 2.0-rated / 18Gb/sec cable. I see Newegg carries a SilverStone branded one for under $20.
 
Last edited:

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
Strange, when I get home I'll take a look at my setups. I'm using a GT 1030 on my Sony X930E and a GT 1030 on my TCL Roku TV, both do 4K60 444 perfectly.
 

OlyAR15

Senior member
Oct 23, 2014
982
242
116
Get a UHD-certified cable. Not all High Speed HDMI cables can handle the bandwidth. I learned this the hard way with my 4K TV. Not all cables are the same.
 

Muhammed

Senior member
Jul 8, 2009
453
199
116
I have a Samsung 4K60 TV, and it works fine on my 1080 with HDMI 2.0, I am using 32-bit mode with RGB 8bpc and Full dynamic range. Text is fine.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
You mentioned that ClearType might not be set up for this TV on Nvidia. Have you tried manually adjusting ClearType?
 
Last edited:

DiogoDX

Senior member
Oct 11, 2012
757
336
136
You have to select full RGB in the driver panel and select 4:4:4 on the TV. On my old JU6500 it's called UHD Color.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
Yeah I would suggest testing a different cable.

Yeah, looks like I need a "Premium HDMI Cable" / "4K60 4:4:4" / "HDMI 2.0" certified cable, instead of the generic that I'm using.

What panels do Avera use? Never even heard of the brand.

No idea. Picture's decent, though, and text isn't really that bad, even at 4:2:0. Well, actually, I'm on my AMD rig now, so text looks "normal" (no NTSC fringing). Maybe AMD managed 4:2:2, whereas NVidia defaults to 4:2:0? (Nope, according to "Radeon Settings", I'm outputting 4:2:0 on AMD too.)

Edit: Anyways, ordered some HDMI 2.0 / 4K UHD cables, brand-name off of ebay through the factory store, 10' for $9.99. Not too expensive, if they really will carry 18Gbit/sec.
 
Last edited:

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
https://www.newegg.com/Product/Product.aspx?Item=N82E16812189053

I had forgotten that I had ordered a bunch of these cables, from "Link Depot", that are generic, but are 28AWG, and claim support for HDMI2.0 and 18Gbit/sec transfer rates. ("4K UHD support").

Anyways, I tried one, plugged into HDMI4 on my Avera 40" 4K UHD TV, and ... it works better than my other generic cables, that's for sure. I enabled "HDMI 4K : Enhanced", and now Radeon Settings reports that the pixel format is 4:4:4, and I can choose pretty-much any pixel format below that, if I wanted.

However, I'm getting "flashes", every few minutes, like I was with the other generic cables, after one of them got damaged. This one was new out of the package. Also, I'm listening to internet radio over the HDMI port, and the audio dropped out for a split-second earlier.

So, the claim that these support 18Gbit/sec is doubtful, or at best only partially true.

Of note, these cables are NOT "HDMI Premium Certified", like the ones I have on order, that were more akin to $10/ea, rather than $2/ea. We'll see how those do, when they arrive.

But it's pretty neat to be able to see "4:4:4". On my AMD card rig, it doesn't seem like the text is any sharper, but I'll hook up the rig with the GT1030 soon enough, and try out that one, and see if I still get the red/blue fringing on vertical lines on text.

Edit: I set the Display option in Radeon Settings for GPU Scaling : ON, and now the occasional static flashes appear to be gone! Interesting that a software setting would have caused that issue. Now to see if Nvidia cards have a similar issue or not.
 
Last edited:

Hans Gruber

Platinum Member
Dec 23, 2006
2,501
1,342
136
https://www.newegg.com/Product/Product.aspx?Item=N82E16812189053

I had forgotten that I had ordered a bunch of these cables, from "Link Depot", that are generic, but are 28AWG, and claim support for HDMI2.0 and 18Gbit/sec transfer rates. ("4K UHD support").

Anyways, I tried one, plugged into HDMI4 on my Avera 40" 4K UHD TV, and ... it works better than my other generic cables, that's for sure. I enabled "HDMI 4K : Enhanced", and now Radeon Settings reports that the pixel format is 4:4:4, and I can choose pretty-much any pixel format below that, if I wanted.

However, I'm getting "flashes", every few minutes, like I was with the other generic cables, after one of them got damaged. This one was new out of the package. Also, I'm listening to internet radio over the HDMI port, and the audio dropped out for a split-second earlier.

So, the claim that these support 18Gbit/sec is doubtful, or at best only partially true.

Of note, these cables are NOT "HDMI Premium Certified", like the ones I have on order, that were more akin to $10/ea, rather than $2/ea. We'll see how those do, when they arrive.

But it's pretty neat to be able to see "4:4:4". On my AMD card rig, it doesn't seem like the text is any sharper, but I'll hook up the rig with the GT1030 soon enough, and try out that one, and see if I still get the red/blue fringing on vertical lines on text.

Edit: I set the Display option in Radeon Settings for GPU Scaling : ON, and now the occasional static flashes appear to be gone! Interesting that a software setting would have caused that issue. Now to see if Nvidia cards have a similar issue or not.

Larry, your 4K TV will let you know if your HDMI cables are not 2.0-capable at 60Hz. It will give you a message like "your cables do not support this resolution" if they are HDMI 1.4, or low-grade fake "4K 60Hz" cables that will only do the 30Hz that HDMI 1.4 can manage.
 

Micrornd

Golden Member
Mar 2, 2013
1,341
221
106
I didn't see where you mentioned it, but have you tried switching which TV is being driven, to make sure both TVs respond the same way to both cards?
 

hemla

Junior Member
Jul 29, 2017
18
0
36
I have a different issue, not sure if it's related to the cable, but my Zotac GT1030 over HDMI to an Acer H233H displays washed-out colors. I have tried the same cable with a Radeon HD5670 and Intel integrated HD630, and the colors were lively. This isn't an issue with color palettes, because I have tried many, and the tests on this website: http://www.lagom.nl/lcd-test/ clearly indicate that I lose contrast/sharpness when adjusting the GT1030, while I don't lose them with the HD5670/HD630. But those differences, while clearly noticeable, do not concern me.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
Well, I booted up into my FX-8320E and GT1030 2GB rig, with "HDMI: Enhanced" set on my TV input, and things seem to be a LOT better with the text, with no color-fringing anymore that I can detect.

I opened up NVidia Control Panel, and it defaulted to RGB, but I changed it to "YCbCr 4:4:4" and hit apply, and now it only gives me the option for "limited" range, and not "full". Problem with HDTV versus monitor detection in the drivers?

It does seem like the orange, is a slightly different shade, on the banner on top of these forums, once I switched from RGB to 4:4:4.
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,501
1,342
136
Well, I booted up into my FX-8320E and GT1030 2GB rig, with "HDMI: Enhanced" set on my TV input, and things seem to be a LOT better with the text, with no color-fringing anymore that I can detect.

I opened up NVidia Control Panel, and it defaulted to RGB, but I changed it to "YCbCr 4:4:4" and hit apply, and now it only gives me the option for "limited" range, and not "full". Problem with HDTV versus monitor detection in the drivers?

It does seem like the orange, is a slightly different shade, on the banner on top of these forums, once I switched from RGB to 4:4:4.

Larry, run it with limited range vs. full range on RGB. I forget what the error was running RGB with full range, but I now run 4:4:4 limited range.
 

Dranoche

Senior member
Jul 6, 2009
302
68
101
Well, I booted up into my FX-8320E and GT1030 2GB rig, with "HDMI: Enhanced" set on my TV input, and things seem to be a LOT better with the text, with no color-fringing anymore that I can detect.

I opened up NVidia Control Panel, and it defaulted to RGB, but I changed it to "YCbCr 4:4:4" and hit apply, and now it only gives me the option for "limited" range, and not "full". Problem with HDTV versus monitor detection in the drivers?

It does seem like the orange, is a slightly different shade, on the banner on top of these forums, once I switched from RGB to 4:4:4.

There's really only a single range for YCbCr, though it's often referred to as "limited" as it's less than the dynamic range for full (PC) RGB at any given bit depth. Generally, as long as the TV knows what it's receiving, there shouldn't be differences between RGB and YCbCr444 at the same bit depth aside from some very minor color changes or gradient banding. The color changes are probably the result of how the TV is handling the signal it's receiving, and the banding is a limit of 8-bit.
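The practical difference between the two ranges is just a linear remapping of the 8-bit codes. This is standard video-levels math; the function names are my own:

```python
# "Full" range uses 8-bit codes 0-255; "limited" (video) range puts black at
# code 16 and white at code 235. A GPU/TV mismatch here is what makes blacks
# look grey, or crushes shadow and highlight detail.

def full_to_limited(code):
    """Map a full-range 8-bit code (0-255) onto video levels (16-235)."""
    return round(16 + code * 219 / 255)

def limited_to_full(code):
    """Inverse mapping; codes outside 16-235 clamp to 0 or 255."""
    return round(max(0.0, min(255.0, (code - 16) * 255 / 219)))

print(full_to_limited(0), full_to_limited(255))    # black/white -> 16, 235
print(limited_to_full(16), limited_to_full(235))   # and back -> 0, 255
```

If the source sends full range but the display expects limited, codes below 16 and above 235 get clipped, which is the washed-out/crushed look people describe.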
 

DiogoDX

Senior member
Oct 11, 2012
757
336
136
Well, I booted up into my FX-8320E and GT1030 2GB rig, with "HDMI: Enhanced" set on my TV input, and things seem to be a LOT better with the text, with no color-fringing anymore that I can detect.

I opened up NVidia Control Panel, and it defaulted to RGB, but I changed it to "YCbCr 4:4:4" and hit apply, and now it only gives me the option for "limited" range, and not "full". Problem with HDTV versus monitor detection in the drivers?

It does seem like the orange, is a slightly different shade, on the banner on top of these forums, once I switched from RGB to 4:4:4.
I think you are doing something wrong. Look at how I set full RGB 4:4:4 on my JU6500 4K TV:

On the TV menu, select the 4:4:4 option for the HDMI port that you are using. On my TV it's only HDMI1, and it's called HDMI UHD Color.

On the driver select 4K, 60Hz, RGB, Full

On the TV configure the HDMI to PC Mode

All the text in this picture should be clean and crisp: http://cdn.avsforum.com/b/b4/b4a44044_vbattach208609.png
 

Dranoche

Senior member
Jul 6, 2009
302
68
101
I think you are doing something wrong. Look at how I set full RGB 4:4:4 on my JU6500 4K TV:

On the TV menu, select the 4:4:4 option for the HDMI port that you are using. On my TV it's only HDMI1, and it's called HDMI UHD Color.

On the driver select 4K, 60Hz, RGB, Full

On the TV configure the HDMI to PC Mode

All the text in this picture should be clean and crisp: http://cdn.avsforum.com/b/b4/b4a44044_vbattach208609.png

I really don't like how Samsung has labeled things.

The HDMI UHD Color setting on Samsung TVs enables the TV to accept RGB signals and YCbCr signals with chroma subsampling of 4:2:2 and 4:4:4, plus higher bit depths for 4:2:0; however, Samsung labels RGB as "RGB 4:4:4", which doesn't mean anything and doesn't tell you if it's full or limited dynamic range. I'm not sure why you would ever have this setting OFF, but Samsung claims having it ON can potentially cause issues with non-UHD signals. "RGB 4:4:4" is a bit of a misnomer, since you can't do chroma subsampling on RGB. I think some cameras and video editing software use RGB444 labels to clarify full range when comparing with various YCbCr subsampling levels, but it isn't a direct comparison, and all that seems to do is create confusion down the line when people only see labels of RGB instead of RGB444.

Samsung doesn't explain what happens when you set the device type to PC, but it presumably tells the TV to expect an RGB signal as that's standard for PCs. Again, no idea if full or limited, but since the text in the test image was sharp then the TV knows to expect an RGB signal. Whether or not it's expecting full or limited can only be determined by checking contrast in a picture or video - as long as brighter areas are not washed out and darker areas maintain detail then it's correct, otherwise the TV is expecting a limited range RGB signal. The RGB 4:4:4 label may be their messed up way of saying full range RGB.

RGB and YCbCr444 should, for practical purposes, display exactly the same as long as the display knows what signal to expect from the source.
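That equivalence is easy to check numerically: RGB to YCbCr 4:4:4 is just an invertible 3x3 matrix. This sketch of mine uses the BT.709 luma coefficients with components in 0..1 and ignores 8-bit rounding:

```python
import numpy as np

# BT.709 luma coefficients; Cb/Cr are scaled color-difference signals
# (B - Y) and (R - Y), so the whole conversion is one invertible matrix.
KR, KB = 0.2126, 0.0722
KG = 1.0 - KR - KB

to_ycbcr = np.array([
    [KR, KG, KB],                                          # Y
    [-KR / (2 * (1 - KB)), -KG / (2 * (1 - KB)), 0.5],     # Cb
    [0.5, -KG / (2 * (1 - KR)), -KB / (2 * (1 - KR))],     # Cr
])

rgb = np.array([0.8, 0.3, 0.1])          # an arbitrary test color
ycbcr = to_ycbcr @ rgb
back = np.linalg.solve(to_ycbcr, ycbcr)  # invert the conversion

print(np.allclose(rgb, back))            # True
```

So at 4:4:4 the two formats carry the same picture; any visible difference comes from rounding, range signaling, or how the display processes each input, not from the color space itself.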
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
You need a "premium certified" HDMI cable. That is the official branding. If it doesn't have the below orange logo then don't buy it. Monoprice has them at reasonable prices.

https://www.hdmi.org/manufacturer/premiumcable/faq.aspx

[image: HDMI "Premium Certified Cable" label]