Is there a difference between "HDR" in 4K televisions and HDR in video games?

Sho'Nuff

Diamond Member
Jul 12, 2007
6,211
121
106
At CES this year there was a ton of talk about HDR-capable televisions. I was a bit surprised by this interest (and by the apparent suggestion that HDR is something "new" in the display industry), because HDR has been implemented for a very long time in the video game industry. Indeed, I believe it was Valve that introduced the gaming world to HDR around 2005, when it released its Lost Coast technical demo for its blockbuster Half-Life 2. Since then, HDR has been implemented in innumerable games, provided the consumer's graphics card was capable of rendering HDR content. Notably, new monitors were not required to display HDR content. Existing monitors could do so, provided the renderer (i.e., the graphics card) could perform the necessary processing.

With the foregoing in mind, I have heard that in the 4K TV context, new panels (presumably implementing different display technology) will be needed to display HDR content. Juxtaposed against what happened in the PC context, this leads me to ask: what is different about HDR in the 4K TV context relative to the PC context? Why is it that age-old PC monitors can display HDR content rendered by a graphics card, but the latest and greatest TV panels are incapable of doing so, even though they can push an insane number of pixels?

My question particularly relates to the evolutionary TV line produced by Samsung, e.g., the 8550, 8700, and 9000 series 4K TVs. My impression (perhaps incorrect) is that when the panels of those TVs are connected to Samsung's One Connect box, all of the image processing is done by the guts of the One Connect box, not by the circuitry within the panel. In essence, the One Connect box is acting as a video card driving the panel. If that is the case, why couldn't HDR be implemented by Samsung in a new One Connect box? Why would a new panel be required?
 

SithSolo1

Diamond Member
Mar 19, 2001
7,740
11
81
I could be completely off base, but I think HDR in TVs is actual HDR versus the simulated HDR in video games.

What I mean by that is that in games, HDR is rendered via software and then displayed on your monitor to simulate the HDR effect. Pretty much any current HDTV can display this form of HDR, the same as a monitor can, since none of it changes the actual brightness/contrast of the display.
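To illustrate the idea, here is a minimal sketch of what that "simulated" HDR amounts to, assuming a simple Reinhard-style operator; real engines (including Source's implementation) use more elaborate curves plus eye adaptation, so treat this as a conceptual example rather than how any particular game does it.

```python
# Conceptual sketch: the game renders the scene into a high-range
# floating-point buffer, then tone-maps it down into the 0-255 range
# an ordinary SDR monitor can actually show. The Reinhard operator
# below is a classic, simple choice used here only for illustration.
def reinhard_tonemap(hdr_value, exposure=1.0):
    """Map a linear HDR luminance value (possibly >> 1.0) into [0, 1)."""
    v = hdr_value * exposure
    return v / (1.0 + v)

def to_8bit(sdr_value, gamma=2.2):
    """Gamma-encode and quantize to the 8-bit range an SDR panel expects."""
    return round((sdr_value ** (1.0 / gamma)) * 255)

# A bright sun pixel, a mid-tone, and a deep shadow, all in arbitrary
# linear units produced by the renderer:
for lum in (10.0, 1.0, 0.02):
    print(lum, "->", to_8bit(reinhard_tonemap(lum)))
```

The point is that the final signal sent to the screen is still ordinary 8-bit SDR; the "HDR" happens entirely inside the renderer.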

On TVs, I think the set is physically capable of extreme contrast and brightness. *Made-up example:* Think of the difference between the screen showing the sun in space at 200 nits against a semi-dark background and the screen showing the sun at 2,000 nits against a pitch-black background.

CNET article
http://www.cnet.com/news/high-dynamic-range-arrives/
 

quikah

Diamond Member
Apr 7, 2003
4,209
752
126
I am not sure how to relate HDR in video games to HDR in TVs. Take this with a grain of salt; I might have some incorrect information.

There are two parts to HDR: the display and the processing. The processing can be taken care of with the upgradable components Samsung already has. The capabilities of the display will be the limiting factor.

HDR needs a very high contrast ratio and a wide color gamut. The contrast ratio can only be achieved with OLED or FALD (full-array local dimming) backlighting; I don't think plasma can go bright enough. Edge-lit sets cannot hope to achieve the same contrast ratio. Wide color gamut depends on how many colors the panel is able to display; I think you need a 12-bit panel.
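A rough back-of-the-envelope sketch of why bit depth and panel hardware matter (the peak brightness and black level below are assumed example figures, not from any spec): spreading a much larger brightness range across the same number of code values causes visible banding unless the panel can address more steps per channel.

```python
# Back-of-the-envelope sketch: distinct code values per channel for a
# given panel bit depth, and the static contrast ratio implied by an
# example peak brightness and black level (both assumed figures).
for bits in (8, 10, 12):
    print(f"{bits}-bit panel: {2 ** bits} steps per channel")

peak_nits = 1000.0    # example HDR peak brightness (assumption)
black_nits = 0.05     # example black level (assumption)
print(f"static contrast ratio ~ {peak_nits / black_nits:,.0f}:1")
```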
 

dpodblood

Diamond Member
May 20, 2010
4,020
1
81
These guys are right. HDR in video games is nothing more than a graphical effect - kind of like bloom on steroids. HDR in TV technology means the display can physically show much brighter and much darker shades. In other words, much better dynamic range.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
Sorry for the necro-thread, but there have been some revelations in the past 10 months.

HDR doesn't yet have a specific minimum or maximum requirement for nits or contrast ratio (there are several competing standards at the moment), but generally we know that more nits (over 500) and a high contrast ratio (5,000:1 static, not dynamic) are a solid starting point. We also know that content will be delivered primarily within the UHD Blu-ray spec of 10-bit 4:2:0 Rec. 2020. Generally speaking, any native 10-bit monitor/projector should be able to display HDR content with the right software/hardware. Obviously, any HDTV advertised as HDR would as well.
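For reference, HDR10 content on UHD Blu-ray is encoded with the SMPTE ST 2084 "PQ" transfer function, which maps 10-bit code values to absolute luminance up to 10,000 nits. Here is a small sketch of that curve; the constants come from the published standard, while the full-range normalization is a simplification (real video signals use limited range).

```python
# SMPTE ST 2084 (PQ) EOTF: maps a normalized code value in [0, 1]
# to absolute luminance in cd/m^2 (nits), up to 10,000 nits.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(code, bits=10):
    """Decode a PQ code value (0-1023 for 10-bit, full-range for simplicity) to nits."""
    e = code / (2 ** bits - 1)          # normalize to [0, 1]
    ep = e ** (1 / M2)
    y = max(ep - C1, 0.0) / (C2 - C3 * ep)
    return 10000.0 * y ** (1 / M1)

# 10-bit code 1023 -> 10,000 nits; code ~520 lands around 100 nits (SDR-ish white)
for c in (0, 520, 769, 1023):
    print(c, "->", round(pq_to_nits(c), 1), "nits")
```

This is why the display side matters: the signal describes absolute brightness levels that an ordinary SDR panel simply cannot reproduce.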

However, the creator of MadVR has included HDR Rec. 2020 mapping in his latest build, which allows just about any display (even an 8-bit one) to show HDR content, though with varying degrees of impact and accuracy depending on the capabilities of the display. MadVR requires some specific setup steps to ensure accurate mapping to your particular display.
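As a rough idea of what any such renderer has to do (a generic sketch of the concept, not MadVR's actual algorithm): decode PQ to linear light, convert the Rec. 2020 primaries to the display's Rec. 709 primaries with a 3x3 matrix, tone-map the result into the display's range, and re-encode with the display's gamma. The matrix below is the commonly cited BT.2020-to-BT.709 conversion; the hard clip is a deliberate simplification.

```python
# Generic sketch of the gamut-mapping step in an HDR -> SDR display
# pipeline (NOT MadVR's actual code): linear Rec. 2020 RGB is converted
# to linear Rec. 709 RGB with the standard 3x3 matrix, then values that
# fall outside the smaller gamut are clipped.
BT2020_TO_BT709 = [
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
]

def rec2020_to_rec709(rgb_linear):
    """Convert a linear-light Rec. 2020 RGB triple to Rec. 709, clipping out-of-gamut values."""
    out = []
    for row in BT2020_TO_BT709:
        v = sum(m * c for m, c in zip(row, rgb_linear))
        out.append(min(max(v, 0.0), 1.0))   # hard clip; real renderers use smarter gamut mapping
    return out

# A fully saturated Rec. 2020 green lands outside Rec. 709 and gets clipped:
print(rec2020_to_rec709([0.0, 1.0, 0.0]))
```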

With CES just a month away, things are about to get real exciting.
 

sdifox

No Lifer
Sep 30, 2005
100,603
17,992
126
HDR was apparently watered down for TVs so that more of them can be called HDR...
 