Performance degradation when enabling HDMI Deep Color Support?

TestKing123

Senior member
Sep 9, 2007
I have two 980Ti (SLI) connected to a 65 inch 4k LG 3DTV. I have noticed an odd behavior when trying to enable HDMI deep color.

Using Witcher 3 as an example: I have a save point in a town near the shore in Skellige.

With HDMI ULTRA HD Deep Colour OFF (on the TV), my 4K performance is 45-50 fps. The HDMI colour format is 4:2:0 in the Nvidia Control Panel.

When I turn HDMI ULTRA HD Deep Colour ON (on the TV) and select 4:4:4 in the Nvidia Control Panel, my performance in the exact same spot is in the low-to-mid 30s. SLI is still working, because I can turn on the visual SLI indicator and I get full green bars.

Now, it seems that if I set the Nvidia Control Panel back to 4:2:0, performance is STILL low (still in the low-to-mid 30s).

However, if I turn HDMI ULTRA HD Deep Colour OFF on the TV, performance goes back up to the mid-40s to 50s, using that same spot for reference. The problem is that I'm back to chroma 4:2:0, and I CAN tell a difference in the color (it looks so much more vibrant in chroma 4:4:4).

Is this some kind of bug or limitation? Can't I have my cake and eat it too?

The 980 Ti outputs to the TV directly. I have a secondary DisplayPort output to a receiver just for audio (I disabled this during my testing; same performance issue).
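For context on why the TV gates 4:4:4 deep colour behind that toggle, here's a rough back-of-the-envelope sketch of raw 4K60 data rates. It ignores blanking intervals and TMDS encoding overhead, so real on-wire figures are meaningfully higher, but the relative sizes show why 10-bit 4:4:4 pushes past what 8-bit 4:2:0 needs:

```python
# Rough raw data rates for 4K60 at different chroma/bit-depth combos.
# Blanking intervals and TMDS 8b/10b overhead are ignored, so actual
# HDMI link rates are higher than these numbers.

def data_rate_gbps(width, height, fps, bits_per_component, samples_per_pixel):
    """Raw video data rate in Gbit/s.

    samples_per_pixel: 3.0 for 4:4:4, 2.0 for 4:2:2, 1.5 for 4:2:0
    (chroma subsampling means fewer samples per pixel on average).
    """
    return width * height * fps * bits_per_component * samples_per_pixel / 1e9

for label, bits, spp in [("8-bit 4:2:0 ", 8, 1.5),
                         ("8-bit 4:4:4 ", 8, 3.0),
                         ("10-bit 4:4:4", 10, 3.0)]:
    print(f"{label}: {data_rate_gbps(3840, 2160, 60, bits, spp):.1f} Gbit/s")
    # ~6.0, ~11.9, and ~14.9 Gbit/s respectively
```

So 8-bit 4:2:0 needs roughly 6 Gbit/s of raw video data while 10-bit 4:4:4 needs about 14.9 Gbit/s, which is why the TV treats full-chroma deep colour as a special HDMI 2.0 mode.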
 

xorbe

Senior member
Sep 7, 2011
The obvious guess is that the game is also chugging at a higher bit depth for calculations. But you probably already considered that.
 

Deders

Platinum Member
Oct 14, 2012
I read somewhere recently that GPUs manage colour in 10-bit internally and output at 8-bit. I believe I was looking up 10-bit capable GPUs and monitors.
 

TestKing123

Senior member
Sep 9, 2007
xorbe said:
The obvious guess is that the game is also chugging at a higher bit depth for calculations. But you probably already considered that.

If that's the case, why isn't it more "publicized"?

And the performance difference is pretty big. Surely it would have been noticed by others.
 

TestKing123

Senior member
Sep 9, 2007
Deders said:
I read somewhere recently that GPUs manage colour in 10-bit internally and output at 8-bit. I believe I was looking up 10-bit capable GPUs and monitors.

Have no idea, but I will do some more testing tomorrow and post some videos. I'm really curious if others experienced this, but I can't seem to find anything on the net, which leads me to believe there might be a bug or limitation somewhere along the chain in my setup.
 

Actaeon

Diamond Member
Dec 28, 2000
I played quite a bit at 4:2:0 and recently enabled 4:4:4 Chroma/UHD Color on my TV. While I have only played Fallout 4 and Starcraft 2 so far, I haven't noticed a drop in performance, but I haven't been looking for one, nor do I have a good way of precisely measuring it.

This is with a single 980 Ti.
 

xorbe

Senior member
Sep 7, 2011
TestKing123 said:
If that's the case, why isn't it more "publicized"?

And the performance difference is pretty big. Surely it would have been noticed by others.

I googled "nvidia deep color performance hit" and one of the links on the first page seemed to imply a performance hit, but gave no details. :\
 

TestKing123

Senior member
Sep 9, 2007
The first link was to this very thread, lol. I did find other links where a performance hit from higher precision was described as "theoretical", but no one said they'd experienced the same thing. I wonder if it affects SLI scaling.

But I ran more across-the-board tests. I checked Witcher 3 in areas where I was previously locked at 60 fps. The same spots with HDMI Deep Color enabled (4:4:4) drop to about 40-45 fps.

Seems like it's an "issue" easily solved by simply adding more GPU power. I only need 60 fps for gaming on my 4K TV, so maybe just being patient and waiting for dual Pascals is the fix.
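Out of curiosity, converting the quoted fps figures into frame times shows roughly what the toggle costs per frame. This is just arithmetic on the numbers reported in this thread (60 fps capped dropping to ~40-45 fps with deep colour on):

```python
# Turn the reported fps figures into per-frame times, to see the extra
# cost deep colour appears to add on top of the 60 fps baseline.

def frame_time_ms(fps):
    """Milliseconds spent per frame at a given frame rate."""
    return 1000.0 / fps

baseline = frame_time_ms(60)  # ~16.7 ms per frame at 60 fps
for fps in (45, 40):
    extra = frame_time_ms(fps) - baseline
    print(f"{fps} fps -> +{extra:.1f} ms per frame vs 60 fps")
    # +5.6 ms at 45 fps, +8.3 ms at 40 fps
```

If the slowdown were a fixed per-frame cost you'd expect a roughly constant millisecond delta like this across scenes, which is one way to sanity-check whether it's rendering overhead rather than a hard cap.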
 

psolord

Platinum Member
Sep 16, 2009
OP, can you fire up MSI Afterburner and check your GPU usage?

The numbers you mention sound a lot like a vsync issue.
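If Afterburner is a hassle, the `nvidia-smi` CLI that ships with the driver can log utilization too. A rough sketch of a poller (the query field names are as documented for recent drivers; if they differ on your version, `nvidia-smi --help-query-gpu` lists the valid ones). Near-99% utilization in both modes would point at raw GPU load rather than a vsync cap:

```python
# Sketch: poll GPU utilization and core clock once per second via the
# nvidia-smi CLI, to compare load with deep colour on vs off.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=utilization.gpu,clocks.gr",
         "--format=csv,noheader,nounits"]

def parse_sample(line):
    """Parse one CSV line like '98, 1366' into (util_pct, core_mhz)."""
    util, clock = (int(v.strip()) for v in line.split(","))
    return util, clock

def poll(seconds=10):
    """Print (utilization %, graphics clock MHz) once per second."""
    for _ in range(seconds):
        out = subprocess.check_output(QUERY, text=True).strip()
        print(parse_sample(out))
        time.sleep(1)
```

Run `poll()` while sitting at the same save point in both modes and compare the readings.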