
High Dynamic Range (HDR) [TFTCentral]

nathanddrews

Graphics Cards, CPU Moderator
Super Moderator
Aug 9, 2016
965
534
136
www.youtube.com
Only problem with that article is that it leaves out HDMI 2.1 and its game mode and variable refresh rate. I realize it's "not out yet," but neither is FreeSync 2 or G-Sync HDR.
 

Rifter

Lifer
Oct 9, 1999
11,519
745
126
So basically no computer monitors are even trying to get HDR right to the specs that have been decided on by the UHD people. At least that's what I took away from that article. TVs are doing it right; monitors are doing it wrong.
 

Sonikku

Lifer
Jun 23, 2005
15,641
4,231
136
What pisses me off is all the HDR and "fake" HDR labels everyone's putting on their TV boxes to fool consumers. So it can "display" HDR content without any of the benefits of HDR. WHAT THE FUDGE DOES THAT EVEN MEAN? What *can't* display HDR without any of the benefits? Ugh. Talk about a concerted attempt to confuse consumers.
 

gorobei

Diamond Member
Jan 7, 2007
3,181
321
126
It's another marketing point that is premature. Everyone upgrading to 4K as part of the normal obsolescence cycle is facing a commodity choice (all versions more or less equal), so one more buzzword is being hyped to stir up FUD about buying something outdated.

Until OLED or higher-resolution local dimming arrays are more common on PC monitors, all they are doing is prepping the standard. The hardware for the actual contrast differential isn't cheap enough yet (and won't be for a few years), but the LUT standards for the signal processing can be incorporated now. So a TV can be HDR compliant but not really capable of doing anything with the extra data.

The anime fansub crowd was on this 5 years ago. They switched to 10-bit (per channel) color encoding back then in order to better process the color gradient data from the raws, even though there were no HDR monitors. It was simply about better fidelity of the image, even on monitors only capable of 8 bits per channel. The gradient transitions were better, with less banding, but it was still just a band-aid for the problem of 24-bit color.

Until real static contrast ratios improve, this is ignorable.
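The fansub point about 10-bit encoding is easy to show numerically: quantize the same smooth gradient at 8 and 10 bits per channel and count the distinct steps. A minimal Python sketch, not tied to any particular video pipeline:

```python
# 8-bit vs 10-bit quantization of the same smooth ramp: the 10-bit signal
# has 4x as many gradient steps, which is what reduces visible banding.
def quantize(v, bits):
    """Round a normalized 0..1 value to the nearest representable code."""
    levels = 2 ** bits
    return round(v * (levels - 1)) / (levels - 1)

# A dense linear ramp standing in for a smooth gradient (e.g. a sky).
N = 100_000
ramp = [i / (N - 1) for i in range(N)]

steps_8 = len({quantize(v, 8) for v in ramp})
steps_10 = len({quantize(v, 10) for v in ramp})
print(steps_8, steps_10)  # 256 1024
```

Four times as many steps across the same brightness range means each visible "band" in a gradient is a quarter as wide, even before the display itself can show more levels.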
 

EliteRetard

Diamond Member
Mar 6, 2006
6,490
1,017
136
Color was a great upgrade (from the mono green KayPro 10), but I remember the most epic being the upgrade from 16 to 256 colors.
After that was 16-bit, a very good upgrade... but not epic like 16 to 256 colors.
Then there was 24-bit; don't have much to say about that.
Then 32-bit; not sure I can tell 32 from 24-bit.
Dunno what HDR is.
 

EliteRetard

Diamond Member
Mar 6, 2006
6,490
1,017
136
Funny that, decided to look it up and found out that 24-bit and 32-bit do in fact have the same number of colors. The extra 8 bits are for transparency/shadows, stuff like that. Color bands won't show it, but looking at test photos there is a difference. It's been way too long, I don't remember seeing a difference when I compared them back in the day... but I've been using 32-bit since like 2003.
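That matches how 32-bit pixels are typically laid out: 8 bits each for red, green, and blue, plus 8 bits of alpha (transparency), so the color count is the same 2^24 as in 24-bit mode. A quick sketch; the packing order here is the common 0xAARRGGBB layout, but layouts vary by API:

```python
# A 32-bit pixel carries the same 8-bit R, G, B channels as a 24-bit pixel;
# the extra byte is the alpha (transparency) channel, not more colors.
def pack_rgba(r, g, b, a=255):
    """Pack four 8-bit channels into one 32-bit integer (0xAARRGGBB layout)."""
    return (a << 24) | (r << 16) | (g << 8) | b

def unpack_rgba(pixel):
    """Recover the (r, g, b, a) channels from a packed 32-bit pixel."""
    return ((pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF,
            pixel & 0xFF, (pixel >> 24) & 0xFF)

px = pack_rgba(200, 100, 50, a=128)
print(unpack_rgba(px))  # (200, 100, 50, 128)
print(2 ** 24)          # 16777216 distinct colors in both 24- and 32-bit modes
</imports>```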
 

Mondozei

Golden Member
Jul 7, 2013
1,043
40
86
gorobei said:
> Until OLED or higher resolution local dimming arrays are more common on PC monitors, all they are doing is prepping the standard. ... Until real static contrast ratios improve, this is ignorable.
So, basically HDR is overhyped for the moment, is what you're saying. Wait for OLED?
 

nathanddrews

Graphics Cards, CPU Moderator
Super Moderator
Aug 9, 2016
965
534
136
www.youtube.com
Mondozei said:
> So, basically HDR is overhyped for the moment, is what you're saying. Wait for OLED?
In the PC monitor space, HDR hasn't yet caught up to the TV space.

If you need to have HDR today, get yourself a local-dimming HDR/WCG TV. You'll spend the same as or less than on an HDR PC monitor and get a much better image, more features, and a much larger screen. Of course, there are only like 5-6 true HDR games currently available, but there is a lot more HDR video content from UHD Blu-ray, Netflix, Amazon, and YouTube.

If you have a dark room and lots of cash, get the LG OLED or wait for the Sony OLED or Panasonic OLED (if you live in Europe). It doesn't get as bright as VA, but its blacks and contrast are insane.
If you have a bright room and money is tight, get a VA Samsung KS8000 series or comparable Sony model. It gets really bright, but doesn't have the same blacks and contrast as OLED.
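For what it's worth, the "how bright" side of HDR on these TVs is defined by the SMPTE ST 2084 (PQ) transfer function: the signal nominally covers 0 to 10,000 nits, and roughly the bottom half of the code range is already spent below ~100 nits (ordinary SDR territory), so the OLED-vs-VA peak brightness differences live in the top half. A quick sketch of the PQ EOTF from its published constants:

```python
# SMPTE ST 2084 (PQ) EOTF constants, as published in the spec.
M1 = 2610 / 16384        # ≈ 0.1593
M2 = 2523 / 4096 * 128   # ≈ 78.84
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal):
    """Map a normalized PQ code value (0..1) to display luminance in nits."""
    e = signal ** (1 / M2)
    y = max(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1 / M1)

# Full signal hits the 10,000-nit ceiling; half the code range is still
# down around ~90 nits, i.e. ordinary SDR brightness.
print(pq_eotf(1.0))  # 10000.0
print(pq_eotf(0.5))  # roughly 90-ish nits
print(pq_eotf(0.0))  # 0.0
```

Which is why a ~600-nit panel and a ~1500-nit panel can both "accept" the same PQ signal yet show the highlights very differently: everything above the panel's peak gets tone-mapped away.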
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
EliteRetard said:
> Color was a great upgrade (from the mono green KayPro 10), but I remember the most epic being the upgrade from 16 to 256 colors.
Back when I swapped out my old 24" Samsung TN monitor for a Dell U2711 IPS, that gave me a feeling quite close to the move from green fluorescent displays to 256 colors (don't think I ever experienced anything 16-color, but I was like 6 at the time, so idk).
 

tential

Diamond Member
May 13, 2008
7,363
641
121
gorobei said:
> Everyone upgrading to 4K as part of the normal obsolescence cycle is facing a commodity choice, so one more buzzword is being hyped... Until real static contrast ratios improve, this is ignorable.
I want 4K pixels, though...
I mean, sure, 4K is premature, and the vast majority of people buying it don't need it. I don't understand how people are even tricked into it. But for gamers...
I mean, if you don't need a high refresh rate, a 4K display and SLI 1080 Tis ain't so bad...
 

giantpandaman2

Senior member
Oct 17, 2005
580
11
81
The great hope is that 4K OLED screens have begun appearing in laptops. I'd really love to see a 32" 4K OLED with HDR.

Ultimately, though, the premium PC display market is being dominated by 144Hz+ monitors. I mean, the holy grail would be a 4K HDR 144Hz OLED monitor... but that doesn't seem to be happening anytime soon. If it is... well, I don't even want to think about what it will cost.

It'd also be nice if MS got its head out of its butt and properly supported color spaces beyond sRGB.

It's weird. For the longest time, monitors always had the resolution, color, and framerate advantage over TVs. Now they're lagging far, far behind.
 

GoodRevrnd

Diamond Member
Dec 27, 2001
6,788
570
126
LG is likely to have 120Hz+ OLED monitors for 2018. I think that's also when their new OLED plant is supposed to go online and offer a lot of scale improvements to drive costs down. Who knows if any of this will benefit monitors, though, or if they'll be sticking to the 55"+ TV space. As much as I wanted to wait for the Sony for superior motion processing, I was able to snag a 65" E6P for $2,300, and it would be another year before the Sony hit $3,500, if it even gets that low.

Meanwhile, I've been sitting here for 6 months waiting for a competent 27"+ 100Hz+ 1440p+ VA monitor and am repeatedly met with disappointment.

"HDR looks great!" Well, I did look at some YouTube videos comparing games with HDR on/off and was not impressed. For example, while The Last of Us had some bolder, more natural colors in some areas relative to the luminance of the scene, overall it just looked like they turned off dust/haze filters to make colors look brighter, so the game had kind of an awkward, dated-graphics look (think old-school Unreal shiny-surface syndrome).
 

repoman0

Diamond Member
Jun 17, 2010
3,369
1,657
136
I couldn't hold out for PC monitors to get this right. 32" 4K IPS is really nice, even with those relatively pitiful blacks and only 60Hz. I can fit side-by-side code editors on the top left, two terminals on the bottom left, and a web browser on the right... SO MUCH ROOM. Plus, I still trust normal IPS and sRGB more than anything more modern to get colors accurate for photography.

I won't hesitate to upgrade if something significantly better comes along, as long as colors are not only deeper and higher bit depth but accurate to print as well.
 

giantpandaman2

Senior member
Oct 17, 2005
580
11
81
repoman0 said:
> Plus I still trust normal IPS and sRGB more than anything more modern to get colors accurate for photography. ... as long as colors are not only deeper and higher bit depth but accurate to print as well.
I hear you on the color accuracy for prints. As a non-photographer gamer, though, I just want better contrast/colors @ >90Hz. I would expect color accuracy to improve dramatically for OLED once it penetrates the PC market; the high end who will pay for it will demand it. I'm sure there are plenty of professionals in the visual creative markets who are willing to spend quite a bit on it.
 

tential

Diamond Member
May 13, 2008
7,363
641
121
repoman0 said:
> Won't hesitate to upgrade if something significantly better comes along, as long as colors are not only deeper and higher bit depth but accurate to print as well.
Same; after I got my first 4K display, I was completely satisfied. The fact that I wanted a second 4K display for multi-monitor, even though I really doubt I'd make use of it, lets me know that I won't just be tossing this monitor away when the high refresh rate monitors come out.

A large hurdle for these new technologies is that people, for the most part, are happy with what they have.
 
