Will HDR on PC be a disaster?


sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
I'm excited for HDR but more for VR than on standard monitors. HDR adds immersion, which should help advance VR. Sure, we'll get some crappy LED TVs to choose from that claim 1000000000000000:1 contrast and support some half-assed HDR standard. I'll let all that dust settle and pick up a good 8K TV in five years that will hopefully be able to deliver a proper experience.

I agree. How do we expect some backlit thing to do this right? I don't care if it has a million LED backlights; they bleed like crazy. HDR in gaming so far has been about emulating that feeling of driving towards the sun around sunset when your visor can't block it.

Proper implementation is more important than throwing in features and more acronyms.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
OLED is actually the best technology for HDR because of its infinite contrast. I know of some "high contrast" screens that achieved the higher contrast by changing the local backlight brightness, but those are not proper HDR. HDR itself does not play with the backlight brightness; it's just a fancy way of saying that your screen has much higher contrast (more natural) and higher peak brightness (also more natural).

Proper HDR content will need to take care not to blind the viewer randomly, though; glaring brightness should only be used when the viewer expects a glaring light.

Watch them exaggerate the effect because they can. I can live without a TV that I need sunglasses to watch. Also, I agree OLED is the only way this technology will work right, except possibly front projection. This feature will mostly be good for delivering that blinding effect which happens in real life and which is why sun visors were invented. Of late, display technologies have been chasing all sorts of things to get people to buy new TVs. Let's get basic static contrast right first. You know, learning to walk before learning to run.
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
I agree. How do we expect some backlit thing to do this right? I don't care if it has a million LED backlights; they bleed like crazy. HDR in gaming so far has been about emulating that feeling of driving towards the sun around sunset when your visor can't block it.

Proper implementation is more important than throwing in features and more acronyms.
Actually, in gaming HDR has been used to render images that look good, not just for overbright effects.
The ability to handle calculations with more than 256 levels of brightness is paramount for getting realistic lighting on surfaces. (Dynamic range is huge in real-world scenes, as the sun is millions of times brighter than a dark room.)

HDR in a TV/monitor is good for both light and dark scenarios.
On a classic TV, white is pretty much the same as white paper in an office; HDR highlights can be ~4x that.
Then there is better sample distribution of the colors, improving quality overall. (Better definition of the color space.)

I'm sure that photographers will like the change in how their photos captured in RAW will look.
Previously they couldn't see the 12-14 stops of dynamic range their cameras could capture, just a tone-mapped approximation.
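
To make the rendering side concrete, here's a minimal sketch (hypothetical radiance values; Reinhard is just one common tone-mapping operator, not necessarily what any given game uses): lighting is accumulated in floating point, far beyond 256 levels, then compressed for an 8-bit SDR output.

```python
import numpy as np

# Hypothetical scene radiances in linear, unbounded units: a dark corner
# and direct sun can coexist because float32 isn't limited to 256 levels.
hdr = np.array([0.001, 0.5, 1.0, 10.0, 100.0, 10000.0], dtype=np.float32)

def reinhard(radiance):
    """Reinhard tone mapping: compress unbounded radiance into [0, 1)."""
    return radiance / (1.0 + radiance)

# Quantize to the 256 levels an 8-bit SDR signal actually has.
sdr = np.round(reinhard(hdr) * 255.0).astype(np.uint8)
print(sdr)  # a ten-million-fold range collapses into 0..255
```

An HDR display pipeline would skip the aggressive compression step and hand more of that range straight to the panel.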
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
The thing to remember with HDR displays is that they're all as different as the HDR content being shown. Some movies are mastered with a peak brightness of 4,000 nits, some at 1,000, while theoretically anything mastered in Dolby Vision can be up to 10,000 nits. Your display, whether OLED (0-750 nits) or LCD (0.014-1500 nits), is going to do one of two things: fit what it can and clip the rest, OR scale to the display's capabilities. If you use an OLED anywhere but a dark room, you will destroy its infinite contrast advantage - it will still be there, but you won't be able to see it. For most users, the higher brightness of LCD will provide a better HDR experience.
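
As a rough illustration of those two strategies (a toy sketch using the peak figures above; real sets use smarter roll-off curves than a straight clip or a linear scale):

```python
def clip_to_display(nits: float, display_max: float) -> float:
    """Fit what the panel can show; everything brighter clips to its peak."""
    return min(nits, display_max)

def scale_to_display(nits: float, master_max: float, display_max: float) -> float:
    """Naive linear rescale of the mastered range into the panel's range."""
    return nits * (display_max / master_max)

master = [0.05, 100.0, 1000.0, 4000.0]  # sample values from a 4,000-nit master
oled_peak = 750.0                       # OLED peak figure mentioned above

print([clip_to_display(n, oled_peak) for n in master])    # highlights flatten at 750
print([round(scale_to_display(n, 4000.0, oled_peak), 3)   # everything dims
       for n in master])
```

Clipping preserves the midtones but crushes highlight detail; scaling keeps the highlight gradation but dims the whole image.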

It's rather funny - in an ironic way - that up until now, most professional reviewers and enthusiasts have balked at the "torch mode" of displays and been overly critical of display brightness. In an SDR world, that makes sense. However, in the HDR era, maximum brightness is proving to be more popular among critics and laymen alike. I think due to the dark room requirements of OLED, LCDs are proving more versatile for customers based upon maximum brightness alone. The 2:1 price difference doesn't help matters.
 

Sonikku

Lifer
Jun 23, 2005
15,901
4,927
136
No, I just think the adoption rate will be painfully slow. 4K+HDR will not be adopted quickly, as 1080p is still "good enough" for people.

Personally, I'm mind-boggled that people are still OK with 1080p 60Hz. 4K 120Hz HDR is what I want, but sadly I'll have to settle for something less than that.

What irritates me is the confusion the TV makers are intentionally throwing out there. Vendors are telling people that a lot of models "support" HDR - that is, "support" it in the sense that they can "display" HDR content, but otherwise without any of the benefits of running HDR. It's the same fakeout with 120Hz. TVs would plaster "120 SPS" (scenes per second?) or some other bullshit on the box in bold letters that is just a made-up standard that works out to 60Hz. An obvious attempt to dupe all but the most savvy of techies.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
While interesting for TVs, I am unlikely to care about HDR for gaming, and neither am I going to connect a PC to a TV again now that Media Center is dead.

I want an affordable ($2000-2300) 80"+ OLED 4K TV, though. Replace my current 75" 1080p 120Hz.
Which model is it that you have at 75 inches doing 120Hz?
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
No, I just think the adoption rate will be painfully slow. 4K+HDR will not be adopted quickly, as 1080p is still "good enough" for people.

Personally, I'm mind-boggled that people are still OK with 1080p 60Hz. 4K 120Hz HDR is what I want, but sadly I'll have to settle for something less than that.

I'm not "ok" with it but I just dont see an option even for 120Hz tv. Do you know of a decent reputable tv brand that offers a 48-56" tv with 1080p @ 120Hz input from a PC?
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Actually, in gaming HDR has been used to render images that look good, not just for overbright effects.
The ability to handle calculations with more than 256 levels of brightness is paramount for getting realistic lighting on surfaces. (Dynamic range is huge in real-world scenes, as the sun is millions of times brighter than a dark room.)

HDR in a TV/monitor is good for both light and dark scenarios.
On a classic TV, white is pretty much the same as white paper in an office; HDR highlights can be ~4x that.
Then there is better sample distribution of the colors, improving quality overall. (Better definition of the color space.)

I'm sure that photographers will like the change in how their photos captured in RAW will look.
Previously they couldn't see the 12-14 stops of dynamic range their cameras could capture, just a tone-mapped approximation.

I see. It would be great for photographers first and foremost. For that reason alone it must come to the PC without these hang-ups.

I agree about real-world dynamic range, but I'd say your eyes are the limit on both ends. The eyes also dynamically control light using an iris, so I'm just curious what the static range of the eyes is.

My first experience with HDR was in Half-Life 2. It added to the scene because it was well done. Then you have games like AC Unity, which seriously go crazy with that dynamic iris effect. It looks realistic, or rather simulates it well, but I just found it distracting.

I don't know if I would bend over backwards in cost and trouble for my display to simulate that driving-into-the-sunset look. I think even current displays can very well burn my eyes.

Again, if I were an avid photographer I would go out of my way for it in still images, and I think that's a great use case. But usually the difficulty is in the darker areas of the screen, and I think LCD backlighting is the huge issue there. OLEDs and JVC projectors do a great job there.

We could have 65,000 levels of brightness in the signal, but some of these technologies will never display it correctly, and even if they could, I think your own iris would shut off some of the darker areas anyway.

I would like to learn what the static dynamic range of the eye is and take this discussion from there. I think that's the key factor.

Already we see display manufacturers push things like 4K on TVs that will be used from 10-15' away. At anything less than 70" it's pointless; just look at visual acuity charts or talk to your optometrist about it. That doesn't stop them from making it, marketing it, and arguing that 8 million pixels is better than 2 million pixels. It's an argument that any layperson would see as "moar is better, duh." The marketers love selling stuff with such arguments. And people will buy stuff based on bigger numbers.

256 levels is probably low, but maybe 4096 or something like that is sufficient. Again, without an optometrist's input I can't say.



EDIT: Found some great info: http://wolfcrow.com/blog/notes-by-dr-optoglass-dynamic-range-of-the-human-eye/

Looks like about 1000:1, but the eye can accommodate 1,000,000:1 with the iris.


I'm sure current displays with 12 bits handle 1000:1, right?
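
The arithmetic behind those ratios, as a quick sketch (stops are just doublings; note that bit depth counts gradations, not contrast):

```python
import math

def stops(contrast_ratio: float) -> float:
    """Dynamic range in photographic stops = log2 of the contrast ratio."""
    return math.log2(contrast_ratio)

print(round(stops(1_000), 1))      # ~10 stops: the eye's static range
print(round(stops(1_000_000), 1))  # ~20 stops: covered as the iris adapts
# 12 bits give 4096 code levels, but those are gradations placed within
# whatever min/max the panel can produce; the contrast comes from the panel.
print(2 ** 12)
```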

My argument about moving pictures causing your iris to accommodate is that if one scene is bright and the next scene is dark, then it's an inconvenient effect akin to walking into a dark theater straight from a bright sunny day, or vice versa. Cinema and television are entertainment at the end of the day. I don't need them to make me feel blind or blinded. And I sure as heck ain't paying extra so that they can. Come on, people, can we look beyond what the marketers will say to get us to buy? This is a tech forum, not the Disney forum. That's where I expect to see the "moar pixels, moar brightness" type arguments.

I'm not saying this is aimed at you or anyone, but simply that we as a group need to be more discerning about marketers' arguments especially. I'm sick and tired of them. You should check a place like Head-Fi, where common sense is lacking to the point that people will drop $4k on a pair of plastic diaphragms attached to a frame and coil because some marketer made some stupid argument about material or design or tuning or something just incomprehensible in the realm of common sense. It goes up to 100kHz. So what? I'm not buying it for my dog.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
The thing to remember with HDR displays is that they're all as different as the HDR content being shown. Some movies are mastered with a peak brightness of 4,000 nits, some at 1,000, while theoretically anything mastered in Dolby Vision can be up to 10,000 nits. Your display, whether OLED (0-750 nits) or LCD (0.014-1500 nits), is going to do one of two things: fit what it can and clip the rest, OR scale to the display's capabilities. If you use an OLED anywhere but a dark room, you will destroy its infinite contrast advantage - it will still be there, but you won't be able to see it. For most users, the higher brightness of LCD will provide a better HDR experience.

It's rather funny - in an ironic way - that up until now, most professional reviewers and enthusiasts have balked at the "torch mode" of displays and been overly critical of display brightness. In an SDR world, that makes sense. However, in the HDR era, maximum brightness is proving to be more popular among critics and laymen alike. I think due to the dark room requirements of OLED, LCDs are proving more versatile for customers based upon maximum brightness alone. The 2:1 price difference doesn't help matters.

That's a good point. I love to trash LCDs. I can't stand them. But you are absolutely right that for an average consumer in an average room it's the right technology.

Few people are videophiles enough to build a dark room or invest in drapes or whatever it takes.

But I have to say that either OLED or JVC projection in a dark room is amazing to behold. I'd say the OLED is easier to deal with, in that doing a projector room right requires black/gray non-reflective paint on walls and ceilings. Too much light leaks out the sides of the lens, and even back-reflection from the screen reduces contrast. But OLED, my gosh, it looks unreal in a regular room with curtains.

Having said that, I now see the point of improving LCD tech, as it is the only tech that will deliver in a lit room unless OLED gets a lot brighter.

As a side note, this new Apple Watch is capable of 1000 nits. I guess that's great for viewing under direct sunlight.
 

Raduque

Lifer
Aug 22, 2004
13,140
138
106
Which model is it that you have at 75 inches doing 120Hz?

I'd like to know as well. I only know of two 75" TVs with native 120Hz 1080p input - the 850D and the P75-C1. All of Sony's models and almost all of Vizio's M and P lines will do 120Hz native input and display.

http://www.rtings.com/tv/reviews/by-usage/pc-monitor/best

Samsung UN75H6350
http://www.crutchfield.com/S-ub7V4X3dp5t/p_30575H6350/Samsung-UN75H6350.html
All the specs say it's 120Hz. I feed it with an Xbox One through my Yamaha AVR.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
That's a good point. I love to trash LCDs. I can't stand them. But you are absolutely right that for an average consumer in an average room it's the right technology.

Few people are videophiles enough to build a dark room or invest in drapes or whatever it takes.

But I have to say that either OLED or JVC projection in a dark room is amazing to behold. I'd say the OLED is easier to deal with, in that doing a projector room right requires black/gray non-reflective paint on walls and ceilings. Too much light leaks out the sides of the lens, and even back-reflection from the screen reduces contrast. But OLED, my gosh, it looks unreal in a regular room with curtains.

Having said that, I now see the point of improving LCD tech, as it is the only tech that will deliver in a lit room unless OLED gets a lot brighter.

As a side note, this new Apple Watch is capable of 1000 nits. I guess that's great for viewing under direct sunlight.
I feel bright room viewing is overrated. Most people work during the times it is brightest.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
I agree. But it is a use case. I really think LCDs are best for outdoor advertising etc.
In your post you said for the average consumer in the average room it's the best tech.


I just said no, I disagree, it's not the best, because most people work during the times it is brightest, thus negating any of the benefits you listed.

We are not just listing use cases; I'm stating a direct counterpoint to your argument about LCD being the best tech for the average consumer in an average room.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
In your post you said for the average consumer in the average room it's the best tech.


I just said no, I disagree, it's not the best, because most people work during the times it is brightest, thus negating any of the benefits you listed.

We are not just listing use cases; I'm stating a direct counterpoint to your argument about LCD being the best tech for the average consumer in an average room.

Me personally, I have plasma, OLED, and D-ILA/LCoS projection. Trust me, nobody else I know chooses this kind of tech. They have the lights blasting at home all the time. They come over and mention how my displays are too dim for "real use".

I'll admit if you have people over for dinner most of them want to see their food. If the room is too dark they'll trip over stuff.

Most people are not technophiles or videophiles. They just want bright. The manufacturers, for better or worse, target these people. For average people with average tastes and habits it probably is the best for them. Most people can't be bothered to light-control their living room. There are street lights that bleed in even at night. I don't know anyone with installed remote blinds or blackout drapes. The average person doesn't care about contrast. They want 4K on a 60" display they sit 15' away from. Oh, and it better be bright, because the best they usually go for is Philips Hue lighting, and they use it for light shows instead of remote dimming.
 

imported_bman

Senior member
Jul 29, 2007
262
54
101
We could have 65,000 levels of brightness in the signal, but some of these technologies will never display it correctly, and even if they could, I think your own iris would shut off some of the darker areas anyway.

I would like to learn what the static dynamic range of the eye is and take this discussion from there. I think that's the key factor.

Already we see display manufacturers push things like 4K on TVs that will be used from 10-15' away. At anything less than 70" it's pointless; just look at visual acuity charts or talk to your optometrist about it. That doesn't stop them from making it, marketing it, and arguing that 8 million pixels is better than 2 million pixels. It's an argument that any layperson would see as "moar is better, duh." The marketers love selling stuff with such arguments. And people will buy stuff based on bigger numbers.

256 levels is probably low, but maybe 4096 or something like that is sufficient. Again, without an optometrist's input I can't say.

EDIT: Found some great info: http://wolfcrow.com/blog/notes-by-dr-optoglass-dynamic-range-of-the-human-eye/

Looks like about 1000:1, but the eye can accommodate 1,000,000:1 with the iris.

I'm sure current displays with 12 bits handle 1000:1, right?

My argument about moving pictures causing your iris to accommodate is that if one scene is bright and the next scene is dark, then it's an inconvenient effect akin to walking into a dark theater straight from a bright sunny day, or vice versa. Cinema and television are entertainment at the end of the day. I don't need them to make me feel blind or blinded. And I sure as heck ain't paying extra so that they can. Come on, people, can we look beyond what the marketers will say to get us to buy? This is a tech forum, not the Disney forum. That's where I expect to see the "moar pixels, moar brightness" type arguments.

HDR is noticeable primarily because that static 1000:1 range will be better utilized by the brighter scenes HDR enables. It is likely that the static ratio of 1000:1 is just an average, and that the static range depends on the total brightness.

SDR is 0.1 nits to 100 nits.
HDR10 mode 1 is 0.05 nits to 1,000 nits (the standard's full range is 0.0005 nits to 10,000 nits).
HDR10 mode 2 is 0.0005 nits to 540 nits (for OLEDs, since their brightness is limited; also, the 0.0005 nits will not make a difference over 0.05 nits unless you are in an all-black room wearing black clothing).
Dolby Vision is 0.004 nits to 4,000 nits (the standard's full range is 0.0001 nits to 10,000 nits).
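
Expressed as contrast ratios and stops (log2 of the ratio), taking the listed figures at face value - just a back-of-the-envelope sketch:

```python
import math

# Ranges as listed above: (minimum nits, maximum nits)
ranges = {
    "SDR":                    (0.1,    100),
    "HDR10 mode 1":           (0.05,   1000),
    "HDR10 mode 2 (OLED)":    (0.0005, 540),
    "HDR10 full spec":        (0.0005, 10000),
    "Dolby Vision master":    (0.004,  4000),
    "Dolby Vision full spec": (0.0001, 10000),
}

for name, (lo, hi) in ranges.items():
    print(f"{name}: {hi / lo:,.0f}:1 contrast, {math.log2(hi / lo):.1f} stops")
```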

The number of bits changes the number of gradations. HDR10, which looks to be the industry standard, is 10-bit with static metadata. Dolby Vision is 12-bit, plus it has dynamic metadata, which I think enables a shifting window for min and max brightness in a frame, thus enabling finer gradations. HDR10 might get dynamic metadata; I think it's supposed to be a part of the HDMI 2.1 spec. I believe 12-bit colour is needed to eliminate all perceivable banding at all brightnesses, 10-bit colour is good past 100 nits, and 8-bit colour produces banding across the entire range (all this assuming a static window and gradations; also, 8-bit banding is only really perceivable in grays or across slow gradients).

[Image: Code_levels.jpg - luminance vs. code level curve]

This luminance curve gives a good idea of what HDR is, though it shows the full range of HDR10, when in practice clipping will occur at the extremes (<0.05 nits and >1000 nits).
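
For reference, HDR10 and Dolby Vision both encode luminance with the SMPTE ST 2084 "PQ" transfer function, which is easy to sketch directly (full-range code values assumed here; real video signals are usually limited-range):

```python
def pq_eotf(code: int, bits: int = 10) -> float:
    """SMPTE ST 2084 (PQ) EOTF: integer code value -> luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = code / (2 ** bits - 1)        # normalized signal, 0..1
    p = e ** (1 / m2)
    y = max(p - c1, 0.0) / (c2 - c3 * p)
    return 10000.0 * y ** (1 / m1)    # PQ tops out at 10,000 nits

# The step between adjacent codes grows with brightness, which is why
# 10 bits under PQ band far less than 10 bits under a simple gamma curve.
for code in (100, 300, 520, 769, 1023):
    print(code, round(pq_eotf(code), 3), "nits")  # e.g. code 769 is ~1,000 nits
```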
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Read the optical link I posted. In higher-brightness situations the human eye has less capability to handle dynamic range. All the tech in the world can't solve that issue short of optic implants. I'm not sure I'm ready to have this tech wired into my optic nerve just yet, and certainly not over some movie or video game. The range is 1000:1 in higher-brightness situations. I'm sure if your TV feeds you something bright enough to constrict your iris, you're not going to see those lower-luminance parts of the signal.

The eye has higher static dynamic range in low-light situations. There are some pretty obvious evolutionary reasons for that. You tend to survive if you can see the poisonous snake in the grass under moonlight while a stream flashes that light into your eyes. But look even near the sun and good luck seeing anything well for a while. Avoiding bright lights is easy: just look away. That way you stand a chance at spotting potential dangers.

If some silly Dolby TV flashed 4000 nits in my face, I doubt I'd give a care in the world about 0.004-nit signals. Or even 3500 nits or 3000 nits, for that matter. The gradations in that range of insane brightness are not important. I'd care about them about as much as I'd care about some silly $4000 super tweeter sending 100kHz audio signals my way. I'm sure it's pretty accurate at 80kHz and 60kHz and 40kHz. The marketers got some fools to pay for those. Just because they make it doesn't mean it makes sense to buy it. The limits are your eyes and ears. But that doesn't stop people from arguing on some audiophile forum that they need a frequency response from 1-100,000Hz or it won't have the full sound spectrum. I'm not saying you won't see it - unlike the audio example, which you will not hear at all unless you are a werewolf. You will see it, but it will just be uncomfortable. You can get an iMac screen to be pretty darn uncomfortable at 500 nits. Then you will say that only a very small portion of the screen is at 4000 nits. OK, so it's going to look like that machine they use at the eye doctor where a tiny thing shines a bright light at you. Do you care about the second bright light that is a third less bright in your FOV? It's just two insanely uncomfortable lights in your face.

You can look at it from a hearing perspective, in that the ears also accommodate by tightening up the muscles around the malleus, incus, and stapes. If you stand next to a jet engine at 140dB, will you hear a pin drop? Or will you pick out the finer nuances of Beethoven's 9th symphony playing at 120dB next to you? This stuff is some marketer telling you that you need this, putting out some fancy graph to make you believe it. It makes you think maybe you are one of that elite group who have super vision, or super hearing, or maybe super taste buds that need that $200 bottle of vodka. You can really tell.

I really don't want to hear that Dolby "Labs" is a reputable company, etc. We all know about that $300 Oppo player that got stuffed into a $3500 Lexicon case and got THX certification.

http://www.audioholics.com/blu-ray-...-ray-oppo-clone/oppo-inside-lexicon-outside-1

Yeah, that stuff is bought and paid for. These companies will sell you any story to stay in business. They exist to come up with the next marketing gimmick for theaters and electronics manufacturers. The real immersive entertainment research is in VR; that is real. This is just continuously polishing the same turd.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
Samsung UN75H6350
http://www.crutchfield.com/S-ub7V4X3dp5t/p_30575H6350/Samsung-UN75H6350.html
All the specs say it's 120hz. I feed it with an Xbox One through my Yamaha AVR.
The actual display may be refreshing at 120Hz, but it could just be duplicating the same frame twice, which is very common. If you haven't already, you may want to test it using the frameskipping test. Use your camera/phone/whatever to take a picture to verify that it's getting a 120Hz signal.
Native 120Hz will look like this.
60Hz doubled to 120Hz will look like this.
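
For a DIY version of the same check, something along these lines works (a minimal sketch assuming pygame is installed; the linked test is more sophisticated): draw an incrementing counter every frame, then photograph the screen with a short exposure. At a true 120Hz input, an exposure spanning two refreshes catches two different numbers; at 60Hz doubled, it catches the same number twice.

```python
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 360))
font = pygame.font.Font(None, 160)   # default font, large size
clock = pygame.time.Clock()
frame = 0

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    screen.fill((0, 0, 0))
    # A new number every frame; duplicated frames show up as repeats.
    screen.blit(font.render(str(frame), True, (255, 255, 255)), (40, 100))
    pygame.display.flip()            # present the frame
    frame += 1
    clock.tick(120)                  # request 120 updates per second
pygame.quit()
```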

Which leads me to my next question - which Yamaha AVR do you have that it can pass a 120Hz signal? I have several models from mid to high end and not a single one can pass any signal over 60Hz. If you've found one that can, then I must have it! :D
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
But I have to say that either OLED or JVC projection in a dark room is amazing to behold.
I agree, there is nothing better than this. Now they just need to make OLEDs that are more gaming oriented:

1. Get input lag down closer to CRT levels
2. Refresh rates up to 120Hz natively (why not 240?)
3. Adaptive sync would be nice to have (on any TV, really)
4. Some sort of configurable black frame insertion modes to reduce blurring (or just reduce the sample/hold timing)

Sadly, I don't think that will ever happen with TVs; we'll have to wait for a $3,000 Asus Swift OLED or something...
 

mnewsham

Lifer
Oct 2, 2010
14,539
428
136
HDR10 might get dynamic metadata; I think it's supposed to be a part of the HDMI 2.1 spec
Yes, HDMI 2.0 supports HDR with static metadata; HDMI 2.1 will support HDR with dynamic metadata.

DisplayPort 1.4 also supports HDR static metadata, but I can't find anything to confirm whether it supports dynamic metadata or not.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
www.youtube.com
DisplayPort 1.4 also supports HDR static metadata, but I can't find anything to confirm whether it supports dynamic metadata or not.
DP1.4 uses flexible metadata packet transport to support future dynamic HDR.

I'm not sure that HDMI 2.1 hardware will be necessary for Dynamic HDR10 (if/when it is released). Samsung has gone as far as to demonstrate firmware updates to support it in their existing HDR sets, so it doesn't sound like a hardware change is necessary. It will probably be on a set-by-set basis. It would make sense that any Dolby Vision-capable set would be able to also take advantage of Dynamic HDR10 (if/when it is released).

From a marketing perspective, "HDMI 2.1" would certainly clarify things for consumers to ensure that a given TV/AVR/player supports all the necessary technology, but it would royally piss off customers that don't have compatible gear.
 

imported_bman

Senior member
Jul 29, 2007
262
54
101
Uninformed rant on how I am smarter than the sheep who buy marketing snake oil

Before you go on your next rambling rant, maybe think through what you are talking about - heck, read your own link!

Any value that falls on the ‘1’ end of a 1000:1 scene will appear black to the eye, even if it isn’t a black body. Any value that falls on the ‘1000’ end will appear white, or at least very near white. We have already seen that cones are responsible for vision during bright outdoor light. Rods, on the other hand, can take in about 1,000,000:1 or about 20 stops of light.

This means that the human eye has a greater dynamic range in the shadows than in bright light – but only when no bright light is directly hitting our retina – the dynamic range of the eye in the shadows is about 20 stops. Which means the dynamic range of the eye in bright outdoor sunlight is only about 10 stops.

Ambient daylight is in the range of 10,000-30,000 nits (it can go much higher at noon on a clear day during summer); SDR is supposed to map to 100 nits at max! So 100 nits is going to be in the part of the range where the eye has well over a 1000:1 contrast ratio. Did you even read the link you posted, or did you just Ctrl+F to the contrast ratio of the eye?
 

Raduque

Lifer
Aug 22, 2004
13,140
138
106
The actual display may be refreshing at 120Hz, but it could just be duplicating the same frame twice, which is very common. If you haven't already, you may want to test it using the frameskipping test. Use your camera/phone/whatever to take a picture to verify that it's getting a 120Hz signal.
Native 120Hz will look like this.
60Hz doubled to 120Hz will look like this.

Which leads me to my next question - which Yamaha AVR do you have that it can pass a 120Hz signal? I have several models from mid to high end and not a single one can pass any signal over 60Hz. If you've found one that can, then I must have it! :D

I have the smooth motion stuff turned off. I think I have it set to 60Hz in the menus; it's been a very, very long time since I even used the menu. I'm not worried about it. The TV looks great and content plays back as it should. The AVR is a V375.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Before you go on your next rambling rant, maybe think through what you are talking about - heck, read your own link!



Ambient daylight is in the range of 10,000-30,000 nits (it can go much higher at noon on a clear day during summer); SDR is supposed to map to 100 nits at max! So 100 nits is going to be in the part of the range where the eye has well over a 1000:1 contrast ratio. Did you even read the link you posted, or did you just Ctrl+F to the contrast ratio of the eye?

SDR may be 100 nits, but no actual TV is that dim. Most TVs put out far more than that.
The reason is they are used in lit rooms. They may go from 0-1500 nits currently. But then what's the use of the dim parts of that screen? This might be helpful for that occasional scene of a car's headlight coming at you, but what else would you see in that scene? Even in real life? You see it and turn your head. It's cool that a TV can reproduce that scene, but really it's only cool for a demo.

An afternoon scene is indeed that bright, but then the whole scene is; a high-brightness source against a dark room isn't natural to look at. Your eyes won't capture the detail in the low parts of the scene. Your iris will stop it down. In a lit room, yes. Which is why I conceded that bright LCDs have a purpose.

I can see it being useful in a situation where the screen is big enough to encompass your entire field of view - maybe a good 120-degree field of view - and I suppose that would apply to widescreen computer monitors.

A list of current TVs shows some get close to 1500 nits.

http://www.rtings.com/tv/tests/picture-quality/peak-brightness

Even back in CRT days they recommended you put a 6500K lamp behind the TV to avoid overwhelming your eyes with the off-screen contrast.

In outdoor advertising they say you can't get bright enough, because those displays need to compete with daylight, which as you mentioned gets into the tens of thousands of nits. But still, viewing even the world's brightest display in sunlight isn't going to give you much dynamic range. Anti-glare can never be that good, and that is probably the better thing to address than this.

Right now I'm looking at the sun reflecting off a car window and I can see the brake calipers inside the wheels. But man, that reflection is not comfortable to look at. I suppose it's good for shock-and-awe type scenes.