HDR gaming on PC. Do you feel like you're missing out?

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
Is HDR a big deal to you? It has infected my mind, and every time I game I can't help but think of what I might be missing. There's no HDR monitor that suits me at the moment, but I'm not sure if I'd like the extra brightness anyway, as it may fatigue the eyes more. I know it's just the details that are supposed to be bright, but I have a feeling it might get fatiguing.
Anyway, do you feel like HDR is a big deal? Do you feel like you're missing out? You excited about the HDR monitors on the market right now? I'm looking forward to the ultrawide HDR monitors coming out, but I still don't think they will have enough backlight zones for properly detailed light control and contrast granularity.
Anyway, what do you guys think about it? You already have an HDR monitor? You want one? Will HDR just fade out like 3D did or remain niche like VR? Or will it take over and make all other gaming panels obsolete and crappy in comparison?
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
I played Assassin's Creed Origins (supports HDR) on my Sony x930e, and I didn't really notice much of a difference when switching to my desktop. That said, even in movies, I find the easiest thing to notice is the huge brightness change when something bright like the sun is on screen. I think I might actually hurt my eyes if the movie Sunshine ever comes out in UHD. :p
 
  • Like
Reactions: moonbogg

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
Just like SDR monitors, you're supposed to calibrate them for brightness level relative to the room you'll be using them in, so eye fatigue shouldn't be an issue. FWIW, if you've had a plasma TV, you've already experienced the sort of contrast range an HDR monitor is specified to (at least in a dark room).
 
  • Like
Reactions: moonbogg

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
Anyway, do you feel like HDR is a big deal? Do you feel like you're missing out? You excited about the HDR monitors on the market right now?

Not in the slightest. Half of what gets passed off as "HDR" in games is just fake color-reshading filters added on top of whatever genuine but often minor HDR effects are there. So you never actually get to see any real, pure SDR vs HDR comparisons; it's all "we use two different sources - one watered down for SDR and one colorized for HDR - then we add HDR and pretend the difference is only due to HDR". A couple of my favorite comically fake "comparisons":-

HDR = Ugly artificial green tint!
HDR = Turning down the brightness / gamma!

The first pic just adds an ugly green tint that you can apply yourself in any game on SDR with SweetFX / Reshade. The second one isn't increasing any "range"; it just unnaturally darkens everything (obvious from the lights on the bridge, which don't even look like lights). It looks like Bloom has been disabled as well to try and make it look "fake dark". In both cases, the SDR image is actually the more realistic-looking one, and the HDR image is the one that ends up "trying too hard to make itself look different". That doesn't mean there isn't a difference between HDR and SDR, only that it's far more subtle than most BS marketing slides show, and a lot of people expecting Blu-Ray vs VHS or color vs B&W levels of difference end up disappointed.

Reminds me of the bogus marketing BS behind HD-Audio, where some public demos pitched a 44.1kHz CD against a 96kHz DVD-Audio, then 'forgot' to mention that the latter's source had been made artificially 1-2dB louder and the stereo image widened ever so slightly. So, like 99% of HDR "comparisons", you ended up not even comparing the thing that was being marketed, merely the stuff "stuck on top" that looks / sounds equally different when applied to SDR vs SDR...

Will HDR just fade out like 3D did or remain niche like VR? Or will it take over and make all other gaming panels obsolete and crappy in comparison?
I don't know if it'll fail as hard as 3D. I do believe it is niche, and the enthusiasts pushing it want it to be more mainstream than it is (plus TV manufacturers are still in "damage control" mode after the epic flop that was 3D). For video, even non-HDR 4K Blu-Ray is well behind DVD in terms of take-up numbers, demand at premium media pricing and catalogue size. Most people streaming seem happy with 1080p or non-HDR 4K and care more about having quality stuff to watch than about how they watch it. Most gamers I know care far more about the "direction" gaming is heading in (pay2win MT's, lootboxes, "Games as a Service", etc) than about HDR, 8K or any other "must buy" thing that seems more "marketing push" than "consumer demand pull".
 
  • Like
Reactions: moonbogg

moonbogg

Lifer
Jan 8, 2011
10,637
3,095
136
Just like SDR monitors, you're supposed to calibrate them for brightness level relative to the room you'll be using them in, so eye fatigue shouldn't be an issue. FWIW, if you've had a plasma TV, you've already experienced the sort of contrast range an HDR monitor is specified to (at least in a dark room).

Actually, I happen to STILL be using a super old Samsung plasma 1080p TV. It still works and looks fantastic, lol. It seems to have deep blacks, great uniformity and just looks really good. So the contrast is special with plasma? To be honest I never really noticed.

Not in the slightest. Half of what gets passed off as "HDR" in games is just fake color-reshading filters added on top of whatever genuine but often minor HDR effects are there. So you never actually get to see any real, pure SDR vs HDR comparisons; it's all "we use two different sources - one watered down for SDR and one colorized for HDR - then we add HDR and pretend the difference is only due to HDR". A couple of my favorite comically fake "comparisons":-

HDR = Ugly artificial green tint!
HDR = Turning down the brightness / gamma!

The first pic just adds an ugly green tint that you can apply yourself in any game on SDR with SweetFX / Reshade. The second one isn't increasing any "range"; it just unnaturally darkens everything (obvious from the lights on the bridge, which don't even look like lights). It looks like Bloom has been disabled as well to try and make it look "fake dark". In both cases, the SDR image is actually the more realistic-looking one, and the HDR image is the one that ends up "trying too hard to make itself look different". That doesn't mean there isn't a difference between HDR and SDR, only that it's far more subtle than most BS marketing slides show, and a lot of people expecting Blu-Ray vs VHS or color vs B&W levels of difference end up disappointed.

Reminds me of the bogus marketing BS behind HD-Audio, where some public demos pitched a 44.1kHz CD against a 96kHz DVD-Audio, then 'forgot' to mention that the latter's source had been made artificially 1-2dB louder and the stereo image widened ever so slightly. So, like 99% of HDR "comparisons", you ended up not even comparing the thing that was being marketed, merely the stuff "stuck on top" that looks / sounds equally different when applied to SDR vs SDR...


I don't know if it'll fail as hard as 3D. I do believe it is niche, and the enthusiasts pushing it want it to be more mainstream than it is (plus TV manufacturers are still in "damage control" mode after the epic flop that was 3D). For video, even non-HDR 4K Blu-Ray is well behind DVD in terms of take-up numbers, demand at premium media pricing and catalogue size. Most people streaming seem happy with 1080p or non-HDR 4K and care more about having quality stuff to watch than about how they watch it. Most gamers I know care far more about the "direction" gaming is heading in (pay2win MT's, lootboxes, "Games as a Service", etc) than about HDR, 8K or any other "must buy" thing that seems more "marketing push" than "consumer demand pull".

Great post. I agree that for games, it's hard to tell what's going on because they make a special version of the game for HDR. They change things artistically as far as colors go while they also make changes to brightness levels. I just felt so confused to have all these people saying SDR monitors are super limited in their color space and how they can't show real colors like HDR panels can, etc. This whole time I thought IPS monitors were well suited for color work because of their ability to show nearly true colors, and I thought the experience was already good. Now people are making it seem like SDR is the new crappy "TN panel" of the TV and monitor world, lol. Know what I mean? Like all of a sudden all the fancy viewing tech in the world instantly became garbage if it doesn't have HDR. Seems like yesterday that IPS was the holy grail of monitor tech. Now it's junk. Maybe I'm just getting too old to keep up with this crap.
As far as making videos look more real, I don't think any miracles can happen there, at least not without some very fancy and expensive tech to make it happen. For instance, if I record a video and watch it on either my phone or computer, and then compare it simultaneously to the real world outside my window, it looks exactly the same to me. Videos look like real life already and I'm not sure how much value can be added by making the sun so bright that I want sunglasses when watching a movie at night. And if you have to "calibrate" the brightness so low to where it doesn't hurt, then what's the point in the first place of having super high brightness?
The only area where I feel I may be missing something is in games, where they can use the brightness as an artistic tool to make gunshots, lightning strikes, plasma weapons in DOOM-like games and fires all look brighter than normal. It would add a shiny and eye-catching effect to the game while keeping blacks nice and dark. I guess that's fine. I still think I'd be lowering the brightness though. My monitor is set to something like 120 nits, and when gaming at night, if something bright happens, it hurts enough to make me squint. That's at around 120 nits of fixed brightness. Throw 600 or 1000 nits at me even in small details and I'm likely to turn HDR off and never turn it on again, lol.
Time will tell though. We'll see.
 
  • Like
Reactions: BSim500

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
Great post. I agree that for games, it's hard to tell what's going on because they make a special version of the game for HDR. They change things artistically as far as colors go while they also make changes to brightness levels. I just felt so confused to have all these people saying SDR monitors are super limited in their color space and how they can't show real colors like HDR panels can, etc. This whole time I thought IPS monitors were well suited for color work because of their ability to show nearly true colors, and I thought the experience was already good. Now people are making it seem like SDR is the new crappy "TN panel" of the TV and monitor world, lol. Know what I mean? Like all of a sudden all the fancy viewing tech in the world instantly became garbage if it doesn't have HDR. Seems like yesterday that IPS was the holy grail of monitor tech. Now it's junk. Maybe I'm just getting too old to keep up with this crap.

I know what you mean. That's what I call "Enthusiast Anchor Drift". Basically, someone declares something that hasn't changed at all to be "obsolete" based more on their emotional need to talk up a new / desired personal "rat race" purchase than on any objective analysis. In reality, IPS is fine for color work. The vast majority of professionals who do photo / video editing don't own HDR monitors, just decently (and regularly) calibrated IPS monitors. In fact, buying an HDR set "because IPS is now junk", calibrating it once (if that) and then forgetting about regular re-calibration is precisely what marks people out as non-professionals. Just like audiophiles gushing over their $200 "Monster" interconnects to play back music from recording studios / musicians who used professional / commercial brands (eg, Kelsey, Cordial, etc) at around $2-3 per channel per meter to record and master it in the first place... :D

And if you have to "calibrate" the brightness so low to where it doesn't hurt, then what's the point in the first place of having super high brightness?

You're right about that. It's one thing to stand 8ft away from a TV for 10 mins in a brightly lit showroom, and quite another to sit only 2-3ft away from an uncomfortably bright monitor 8-9 hours per day. Same reason a lot of people end up returning over-hyped giant 40"+ monitors: monitor / TV ergonomics is more a bell-shaped curve (not too bright, not too dark, not too small - and yes, when you're only 2-3ft away it really can get too big and start causing discomfort when it forces a lot more neck movement or just feels "overloading" to take in) than a never-ending straight line of every variable sky-rocketing forever as the marketing suggests... The biggest problem is you just can't tell where your personal comfort limits are with a 10-minute test. That only starts to become apparent after a week of regular usage.

Personally, I say just wait and see, and try and test in person (an actual real "HDR only" test without the color-reshading nonsense, which isn't easy for games as they change a lot more than HDR video does). Everyone has different thresholds / preferences though, and I wouldn't feel bad about ignoring "one size fits all" advice from enthusiasts if the end result doesn't feel comfortable to you. Eg, "THX seating guidelines" of having to sit exactly X distance away from a screen Y large of Z resolution don't mean a thing for most people if the end result is a family of 5 / group of 6 friends all expected to sit an impractical 3ft away from the TV, or if it just clashes with the natural shape / layout of the room. Or "High Dynamic Range (audio) movie soundtrack" being another way of saying "in your average family living room with your average noise floor, you get loud parts that are too loud and quiet parts just right, or quiet parts that are inaudible and loud parts just right - pick one of the two, or play 'see-saw' with the volume buttons every 7 seconds to watch it 'properly'". You get the picture... ;)
 
  • Like
Reactions: moonbogg

Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
Actually, I happen to STILL be using a super old Samsung plasma 1080p TV. It still works and looks fantastic, lol. It seems to have deep blacks, great uniformity and just looks really good. So the contrast is special with plasma? To be honest I never really noticed.

Yah, 10,000:1 seems to be what HDR is promising. That's really good compared to LCD. Plasmas were quoting millions-to-1 ratios on good sets, mostly due to the black level. Plasmas just don't get that bright though, which makes it hard to see that contrast until you have a dark room. I have a Panasonic ST60 from the last year they made plasmas and it's fantastic. Incredible blacks, and the motion is super fluid. OLEDs are surpassing them these days, but there was a long dry spell where OLEDs weren't anywhere near a reasonable price.
 
  • Like
Reactions: moonbogg

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
That's at around 120 nits of fixed brightness. Throw 600 or 1000 nits at me even in small details and I'm likely to turn HDR off and never turn it on again, lol.

I've had that sort of thing happen in movies. The first HDR movie that I ever watched was Sicario, and I recall a scene near the beginning in a house where the sun was peeking in through a curtain or something. Whew... that sun was bright. That's not too surprising when Rtings lists the TV's sustained 10% brightness at 1510 nits! :eek:

I've mentioned this in TV discussions before, but I've seen awkward scenes where changes in the backlight were very noticeable. One scene in Marvel's The Punisher had one character with their back to a window and another inside the room, and whenever it switched between them, you could see the backlight go from bright to dark. I've never seen it do that anywhere else, but it really stood out.
 
  • Like
Reactions: moonbogg

DigDog

Lifer
Jun 3, 2011
13,724
2,253
126
I would disable it anyway. I like to see what's actually happening, instead of a constant explod-o-rama.
 
  • Like
Reactions: moonbogg

zinfamous

No Lifer
Jul 12, 2006
111,091
30,023
146
Just like SDR monitors, you're supposed to calibrate them for brightness level relative to the room you'll be using them in, so eye fatigue shouldn't be an issue. FWIW, if you've had a plasma TV, you've already experienced the sort of contrast range an HDR monitor is specified to (at least in a dark room).

My beloved Samsung 720p plasma from 2007 still has/had a much better image than my current 2015 4k LED. I miss that thing. But I don't miss the weight. .....or the glare on bright days, lol.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I would disable it anyway. I like to see what's actually happening, instead of a constant explod-o-rama.

That's not what HDR is... geeze guys, you're on a tech forum and have zero clue what HDR video is. It allows a wider range of colors and brighter highlights, yes, but also better contrast between the bright and dark parts of a scene, with more shadow detail present than an SDR display can show you. The picture is more detailed and lifelike. There is more detail available in a particular scene than you can view on an SDR display; an HDR display can show you all of it without washing out.

It's not throwing eyeball-burning brightness at you; that is NOT what HDR is about.


I'd ask that you all at least read up on HDR video and HDR Color Grading before you spout off nonsense.
 
Last edited:

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Actually, I happen to STILL be using a super old Samsung plasma 1080p TV. It still works and looks fantastic, lol. It seems to have deep blacks, great uniformity and just looks really good. So the contrast is special with plasma? To be honest I never really noticed.



Great post. I agree that for games, it's hard to tell what's going on because they make a special version of the game for HDR. They change things artistically as far as colors go while they also make changes to brightness levels. I just felt so confused to have all these people saying SDR monitors are super limited in their color space and how they can't show real colors like HDR panels can, etc. This whole time I thought IPS monitors were well suited for color work because of their ability to show nearly true colors, and I thought the experience was already good. Now people are making it seem like SDR is the new crappy "TN panel" of the TV and monitor world, lol. Know what I mean? Like all of a sudden all the fancy viewing tech in the world instantly became garbage if it doesn't have HDR. Seems like yesterday that IPS was the holy grail of monitor tech. Now it's junk. Maybe I'm just getting too old to keep up with this crap.
As far as making videos look more real, I don't think any miracles can happen there, at least not without some very fancy and expensive tech to make it happen. For instance, if I record a video and watch it on either my phone or computer, and then compare it simultaneously to the real world outside my window, it looks exactly the same to me. Videos look like real life already and I'm not sure how much value can be added by making the sun so bright that I want sunglasses when watching a movie at night. And if you have to "calibrate" the brightness so low to where it doesn't hurt, then what's the point in the first place of having super high brightness?
The only area where I feel I may be missing something is in games, where they can use the brightness as an artistic tool to make gunshots, lightning strikes, plasma weapons in DOOM-like games and fires all look brighter than normal. It would add a shiny and eye-catching effect to the game while keeping blacks nice and dark. I guess that's fine. I still think I'd be lowering the brightness though. My monitor is set to something like 120 nits, and when gaming at night, if something bright happens, it hurts enough to make me squint. That's at around 120 nits of fixed brightness. Throw 600 or 1000 nits at me even in small details and I'm likely to turn HDR off and never turn it on again, lol.
Time will tell though. We'll see.

No, it's not the same. If you look at a car window with a gleam of light shining off it, the car and all the detail around it are still visible to your eye. On an SDR display the only thing you see is the gleam of light, and all the surrounding detail is washed out because of the limitations of the display and the color grading. HDR corrects this by giving you the ability to see all the detail that is actually there. The image captured by a video camera has surpassed what traditional TVs can display for decades.
 
Last edited:

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
No, it's not the same. If you look at a car window with a gleam of light shining off it, the car and all the detail around it are still visible to your eye. On an SDR display the only thing you see is the gleam of light, and all the surrounding detail is washed out because of the limitations of the display and the color grading. HDR corrects this by giving you the ability to see all the detail that is actually there. The image captured by a video camera has surpassed what traditional TVs can display for decades.
The human eye is not a proper point of comparison for a display. A properly calibrated, high-quality display has an ideal output range in terms of brightness, contrast and color; you can't just expect your source video/image to display the way it should on an SDR monitor without proper tone mapping and adjustment of the level curves. Anybody who has edited RAW knows how to pull detail out of the shadows and brighten it into an approximation closer to how the eye perceives a scene. You don't need a fancy "HDR" monitor to view the dynamic range captured by your device.
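For anyone curious, the "tone mapping" I'm talking about isn't magic - it's just a curve applied to the source values. Here's a rough sketch in Python/numpy (a toy global Reinhard-style operator, not any particular editor's or display's actual pipeline):

Code:
# Toy global tone-mapping operator: compress scene-referred luminance that can
# sit far above 1.0 into the 0..1 range an SDR panel can show, instead of just
# clipping the highlights to white.
import numpy as np

def tonemap_reinhard(hdr_rgb: np.ndarray) -> np.ndarray:
    """hdr_rgb: float array of shape (..., 3), linear-light RGB, values >= 0."""
    # Luminance from linear RGB (Rec.709 weights).
    lum = 0.2126 * hdr_rgb[..., 0] + 0.7152 * hdr_rgb[..., 1] + 0.0722 * hdr_rgb[..., 2]
    lum = np.maximum(lum, 1e-6)
    compressed = lum / (1.0 + lum)                 # highlights roll off, shadows stay ~linear
    sdr = hdr_rgb * (compressed / lum)[..., None]  # rescale each pixel to its new luminance
    return np.clip(sdr, 0.0, 1.0) ** (1.0 / 2.2)   # gamma-encode for an SDR display

# A 1x1 "image" whose pixel is ~8x brighter than paper white keeps some color
# and detail instead of blowing out to pure white.
print(tonemap_reinhard(np.array([[[10.0, 8.0, 6.0]]])))

Real graders use far fancier local operators and hand-tuned curves, but the principle is the same: the range lives in the source, and you choose how to squeeze it onto whatever display you're targeting.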
 

DigDog

Lifer
Jun 3, 2011
13,724
2,253
126
brighter highlights
This is what quake looks like on my screen
311xfew.jpg
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
The human eye is not a proper point of comparison for a display. A properly calibrated, high-quality display has an ideal output range in terms of brightness, contrast and color; you can't just expect your source video/image to display the way it should on an SDR monitor without proper tone mapping and adjustment of the level curves. Anybody who has edited RAW knows how to pull detail out of the shadows and brighten it into an approximation closer to how the eye perceives a scene. You don't need a fancy "HDR" monitor to view the dynamic range captured by your device.

You do because in grading for HDR you get more stops of dynamic range to work with. It’s not as limited. Plus you get the rec.2020 color space.
 
  • Like
Reactions: Midwayman

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
You do because in grading for HDR you get more stops of dynamic range to work with.
Which can be done on any display. The dynamic range you work with is coming from the source file, which can be graded according to the display you're working with. This is nothing new. Pro photographers who work with multiple exposures to create an HDR image don't need fancy HDR displays all of a sudden.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Which can be done on any display. The dynamic range you work with is coming from the source file, which can be graded according to the display you're working with. This is nothing new. Pro photographers who work with multiple exposures to create an HDR image don't need fancy HDR displays all of a sudden.

No dude the display can show more dynamic range. That’s the whole point.

Go look it up and see how it’s not what you think it is. There’s a ton of videos about the subject.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
No dude the display can show more dynamic range. That’s the whole point.

Go look it up and see how it’s not what you think it is. There’s a ton of videos about the subject.
Viewing the right kind of image on a proper SDR screen will also show tons of dynamic range. How do you think images like this or this were created on a calibrated standard display before the advent of HDR monitors if, according to you, an SDR display can't show the full dynamic range?

By the way, dynamic range is measured in stops: log2(contrast ratio). Film and the best digital sensors have a dynamic range of ~15 stops. To achieve that you need a contrast ratio of roughly 30,000:1, which is a far cry from what these "HDR" displays are advertising.
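If anyone wants to sanity-check that arithmetic themselves (the contrast ratios below are just illustrative round numbers):

Code:
# stops = log2(contrast ratio), so ~15 stops needs a ratio of 2**15 = 32,768:1.
from math import log2

def stops(contrast_ratio: float) -> float:
    return log2(contrast_ratio)

print(stops(1_000))    # ~10.0 stops - ballpark native contrast of a typical LCD
print(stops(10_000))   # ~13.3 stops - the kind of figure HDR LCD marketing quotes
print(stops(2 ** 15))  # 15.0 stops  - roughly film / the best camera sensors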
 

Alpha One Seven

Golden Member
Sep 11, 2017
1,098
124
66
Viewing the right kind of image on a proper SDR screen will also show tons of dynamic range. How do you think images like this or this were created on a calibrated standard display before the advent of HDR monitors if, according to you, an SDR display can't show the full dynamic range?

By the way, dynamic range is measured in stops: log2(contrast ratio). Film and the best digital sensors have a dynamic range of ~15 stops. To achieve that you need a contrast ratio of roughly 30,000:1, which is a far cry from what these "HDR" displays are advertising.
Those examples are just oversaturated images and don't really look that good to me.
HDR will stay around, I believe, just as color stayed around when it came to the screen. HDR is also about the amount of light used to light the image, measured in nits, and where that light is placed (lighting only the pixels that need it with a selective light source is essential). This can be done with a full-array LED setup that adds thickness to the display, or the way Sony does it on some models with what they call "Slim Backlight Drive".
Peak brightness is also a key factor for really good-looking HDR, and the TV I use as a monitor hits 1184 cd/m², which is outstanding (much higher than you can get from an OLED).
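Since "lighting only the pixels that need it" is doing a lot of work in that sentence, here's a toy sketch of what full-array local dimming boils down to (zone counts and frame values are made-up examples; real sets use far more sophisticated algorithms to hide blooming):

Code:
# Toy full-array local dimming: split the frame into backlight zones, drive each
# zone's LED only as bright as its brightest pixel needs, and let the LCD layer
# supply the remaining per-pixel attenuation.
import numpy as np

def local_dimming(frame: np.ndarray, zones=(8, 12)):
    """frame: (H, W) luminance in 0..1, H and W divisible by the zone counts."""
    h, w = frame.shape
    zh, zw = h // zones[0], w // zones[1]
    backlight = np.zeros(zones)          # one LED level per zone
    lcd = np.zeros_like(frame)           # per-pixel transmittance in 0..1
    for i in range(zones[0]):
        for j in range(zones[1]):
            block = frame[i * zh:(i + 1) * zh, j * zw:(j + 1) * zw]
            level = block.max()
            backlight[i, j] = level
            lcd[i * zh:(i + 1) * zh, j * zw:(j + 1) * zw] = block / max(level, 1e-6)
    return backlight, lcd

# A mostly dark frame with one small bright highlight: only the zone containing
# the highlight gets a bright LED, so the rest of the screen stays truly dark.
frame = np.full((480, 720), 0.02)
frame[100:110, 200:210] = 1.0
backlight, lcd = local_dimming(frame)
print(backlight.round(2))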
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Those examples are just oversaturated images and don't really look that good to me.
HDR will stay around, I believe, just as color stayed around when it came to the screen. HDR is also about the amount of light used to light the image, measured in nits, and where that light is placed (lighting only the pixels that need it with a selective light source is essential). This can be done with a full-array LED setup that adds thickness to the display, or the way Sony does it on some models with what they call "Slim Backlight Drive".
Peak brightness is also a key factor for really good-looking HDR, and the TV I use as a monitor hits 1184 cd/m², which is outstanding (much higher than you can get from an OLED).

Except the native contrast ratio of an OLED can allow highlights to appear brighter in the picture than they are because of perfect blacks. Neither technology is perfect and both are improving. Though I think LCD tech is reaching its peak and OLED will continue to evolve with each new model for a while yet.

Trying to say SDR displays can produce the same image as an HDR display is laughable at best and really reaching.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Viewing the right kind of image on a proper SDR screen will also show tons of dynamic range. How do you think images like this or this were created on a calibrated standard display before the advent of HDR monitors if, according to you, an SDR display can't show the full dynamic range?

By the way, dynamic range is measured in stops: log2(contrast ratio). Film and the best digital sensors have a dynamic range of ~15 stops. To achieve that you need a contrast ratio of roughly 30,000:1, which is a far cry from what these "HDR" displays are advertising.

OLED TVs have perfect blacks so the contrast ratio is basically infinite and only limited by the content and brightness capabilities. You can argue in favor of SDR all you want but it is an inferior picture, full stop.
 

Alpha One Seven

Golden Member
Sep 11, 2017
1,098
124
66
Except the native contrast ratio of an OLED can allow highlights to appear brighter in the picture than they are because of perfect blacks. Neither technology is perfect and both are improving. Though I think LCD tech is reaching its peak and OLED will continue to evolve with each new model for a while yet.

Trying to say SDR displays can produce the same image as an HDR display is laughable at best and really reaching.
No one said SDR can produce HDR quality imaging...not sure where you got that from.
 

Alpha One Seven

Golden Member
Sep 11, 2017
1,098
124
66
Which can be done on any display. The dynamic range you work with is coming from the source file, which can be graded according to the display you're working with. This is nothing new. Pro photographers who work with multiple exposures to create an HDR image don't need fancy HDR displays all of a sudden.
HDR in photos has to do with lightening the shadows and not blowing out the highlights; it's completely different from HDR on a display.
HDR for TVs is essentially a display process. It refers to a TV's ability to recognize specialized HDR content and display it in a way "normal" TVs can't.
HDR for cameras is a capture process. This is where multiple photos with different exposures are combined to create an effect that can look more (or less) realistic than a single exposure could.
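As a rough illustration of that capture process, here's a minimal Python sketch that merges three bracketed exposures using OpenCV's Mertens exposure fusion (the file names are just placeholders):

Code:
# Merge bracketed exposures into a single image that keeps shadow and highlight
# detail, then write it back out as an ordinary 8-bit (SDR) file.
import cv2
import numpy as np

# Under-, normally- and over-exposed shots of the same scene (placeholder names).
exposures = [cv2.imread(name) for name in ("under.jpg", "normal.jpg", "over.jpg")]

# Mertens fusion weights each pixel by contrast, saturation and well-exposedness,
# so shadows come from the brighter frames and highlights from the darker ones.
fused = cv2.createMergeMertens().process(exposures)  # float32, roughly 0..1

cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype("uint8"))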
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
Those examples are just oversaturated images and don't really look that good to me.
HDR will stay around, I believe, just as color stayed around when it came to the screen. HDR is also about the amount of light used to light the image, measured in nits, and where that light is placed (lighting only the pixels that need it with a selective light source is essential). This can be done with a full-array LED setup that adds thickness to the display, or the way Sony does it on some models with what they call "Slim Backlight Drive".
Peak brightness is also a key factor for really good-looking HDR, and the TV I use as a monitor hits 1184 cd/m², which is outstanding (much higher than you can get from an OLED).
No, they aren't. You've probably seen what an oversaturated image looks like, and how displays can make things look oversaturated, especially on OLED mobile screens.
HDR in photos has to do with lightening the shadows and not blowing out the highlights; it's completely different from HDR on a display.
HDR for TVs is essentially a display process. It refers to a TV's ability to recognize specialized HDR content and display it in a way "normal" TVs can't.
HDR for cameras is a capture process. This is where multiple photos with different exposures are combined to create an effect that can look more (or less) realistic than a single exposure could.
Yes, HDR is an image-processing technique. But I'm talking about the dynamic range of the content you view on screen. HDR displays can't match the output dynamic range of digital images; in fact, they're barely better than SDR displays.