HDR - Does anyone actually know what it is?

TheThirdMan

Member
Jul 5, 2011
113
11
81
Hey all,

I've seen a lot about HDR on various tech sites, yet I can't find any solid information regarding what it actually means for the consumer. I understand HDR requires:

- Greater colour depth - 10-bit instead of 8-bit.
- Greater colour range - expanding from Rec.709 to BT.2020.
- "4K" resolution at least.
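To put rough numbers on the bit-depth point (just illustrative arithmetic, not tied to any particular panel or standard):

```python
# Code values per channel at each bit depth, and the normalised
# step between adjacent levels - smaller steps mean smoother gradients.
for bits in (8, 10):
    levels = 2 ** bits                # 8-bit: 256, 10-bit: 1024
    step = 1.0 / (levels - 1)         # gap between adjacent code values
    print(f"{bits}-bit: {levels} levels, step ~= {step:.5f}")
```

So 10-bit gives four times as many gradations per channel - that's the actual difference, not extra saturation.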

What I don't get is: does this mean we're going to get pro-level gear (10-bit monitors and TVs) at consumer prices? From general reading of comments ("I don't want this monitor, it doesn't have HDR") and dishonest comparisons between regular TVs and HDR TVs ("reds are redder!"), it seems to just be a buzzword, with little understanding of what it will actually change. The most honest picture I could find is this one:

[Attached image: orbYGoc.jpg]


Greater bit depth does not mean more saturation, more contrast, or super blacks. Yet that is what people seem to think they'll be getting. A screen will not suddenly display blacker blacks because you change the video you're playing from a Blu-ray to an HDR Blu-ray.

An OLED TV playing a Blu-ray will have extraordinary picture quality. Will an HDR video/game played on a cheap monitor with "HDR support" provide better picture quality? A true 10-bit panel monitor right now costs thousands of dollars (not 10-bit internal processing feeding an 8-bit display!) - are these monitors suddenly going to cost only a few hundred dollars? Or is HDR meant to remain high-end? It seems extraordinary to me that we're going to get 34" curved super-widescreen, true 10-bit monitors displaying expanded colour gamuts for a reasonable price.
 

mnewsham

Lifer
Oct 2, 2010
14,539
428
136
It is far more about the brightness and contrast levels.

A "normal" monitor only displays ~8-12 stops of brightness.
An HDR monitor can display 12-18+ stops of brightness on the same frame.

The 10-bit colour matters here because the increased brightness range makes colour banding far more noticeable; the extra bit depth is there in part to combat that banding.
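Since each stop is a doubling of light, the gap between those ranges is bigger than the numbers suggest - a quick sketch (illustrative only):

```python
# A "stop" is a doubling of brightness, so n stops of usable range
# corresponds to roughly a 2**n peak-to-black ratio in the same frame.
for stops in (8, 12, 18):
    print(f"{stops} stops ~= {2 ** stops:,}:1 contrast")
# 8 stops -> 256:1, 12 stops -> 4,096:1, 18 stops -> 262,144:1
```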

Currently, consumer-level HDR monitors are only slightly better than normal consumer monitors. True HDR really wants a good OLED panel; most consumer HDR monitors are using VA panels with large LED backlight arrays.



TL;DR: it's a larger colour space, higher bit depth to combat colour banding, and a much wider potential contrast in the same frame - you can have VERY bright spots and VERY dark spots on the same frame of a movie, without colour banding or the black areas turning grey because of the bright areas nearby.
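A toy way to see the banding point: quantise the same smooth ramp at 8 and 10 bits and count how many distinct code values land in the darkest slice of the range, where banding shows up first (purely illustrative, assuming a simple linear ramp rather than a real transfer curve):

```python
# Count distinct quantised code values in the darkest 5% of a
# linear 0..1 ramp - a crude stand-in for shadow detail.
def dark_levels(bits, cutoff=0.05, samples=100_000):
    maxv = 2 ** bits - 1
    return len({round(i / (samples - 1) * maxv)
                for i in range(samples)
                if i / (samples - 1) <= cutoff})

print(dark_levels(8))   # ~14 distinct shades in the darkest 5%
print(dark_levels(10))  # ~52 - roughly 4x the gradations
```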



Right now most consumer HDR capable monitors and TVs are just not worth it as they barely meet the HDR requirements.
 

TheThirdMan

Member
Jul 5, 2011
113
11
81
TL;DR: it's a larger colour space, higher bit depth to combat colour banding, and a much wider potential contrast in the same frame - you can have VERY bright spots and VERY dark spots on the same frame of a movie, without colour banding or the black areas turning grey because of the bright areas nearby.

Right now most consumer HDR capable monitors and TVs are just not worth it as they barely meet the HDR requirements.

Right, I get that, as part of a standard. But the mere existence of a standard doesn't mean current hardware at current prices is suddenly going to be professional level. As you say, OLED is pretty much the only display tech that will take advantage of the new standard. I don't see how a VA display with LED backlight for £1000 is going to do BT.2020, 10-bit colour and the contrast required by the standard. Are we just waiting for a load of OLED gaming monitors? In comments on new monitor announcements, the usual comment is "When can I have G-Sync, 120Hz, OLED, HDR, 34" 21:9 widescreen!?". That monitor now would cost $5,000+ and there's zero indication that it will go any lower in the near future.

My initial comment is pretty confusing, to be honest. I think HDR is fantastic and much better than the 3D fad.

My issue (or confusion) is that I don't see how this standard, which is as high in spec as, if not higher than, many professional standards, is expected to be implemented for consumers. I get that OLED HDR TVs are doing it right, but they cost a huge amount. It seems to me it's only going to be home-theatre tech, not something for gamers or casual users.

Additionally, I don't think there's much correct information out there for consumers, or perhaps most coverage gives the wrong impression of what HDR is going to do. There are terrible comparison shots showing a faded image vs a saturated image to demonstrate what HDR will do, which is simply not true - it doesn't make red more red. And unless you get an OLED panel, it's really going to have little effect on a VA or IPS panel. Those are still old tech, they still mostly display 6 or 8-bit (10 if you're lucky), and they still have poor contrast compared to OLED.
 

topmounter

Member
Aug 3, 2010
194
18
81
The CE manufacturers have really confused the issue by marketing 4K panels as the next step up, when the real next step up (at least the HDR spec minimums) is a panel that supports extended color palette, high dynamic range AND 4K resolution. Layer on Dolby Vision and things get even more convoluted.

Demo HDR content on real panels really does pop, but that's to be expected. I'm more curious as to how it pops in my living room with proper setup in the middle of the afternoon with real content... not how it pops when everything is turned up to 11 in a dark room with demo content at CES or wherever.

In my opinion it all still comes down to: buy the best panel you can afford, when you can afford it and when you need it.
 

biostud

Lifer
Feb 27, 2003
18,251
4,765
136
Do I need to know more than it is new, bright and shiny, and I want it?
 

biostud

Lifer
Feb 27, 2003
18,251
4,765
136
Well quite right! At least HDR is a genuinely excellent advancement in display tech. I just can't see it happening for consumers.

Obviously it will come, but the time frame could be much longer than we would like.
 

Tweak155

Lifer
Sep 23, 2003
11,448
262
126
My understanding is that, in simple terms, the HDR signal simply carries more image data that current monitors can't do anything with, because the hardware doesn't support the brightness and other additional information being sent. There are currently competing formats (akin to Blu-ray vs HD-DVD) for how to compress and send this data, and we're really just in the beginning stages of this whole ordeal. So you could have a monitor (or TV) that supports "HDR", but it may not necessarily support all HDR streams.

That said, the HDR TVs I have seen definitely look better with HDR content, but until the majority of available content is converted to HDR and a single standard wins out, I don't see the point in jumping on the bandwagon just yet.

And, as with all things, I do expect that one day it will all be consumer level. There will always be a next best thing in our lifetimes that is professional level.