Someone explain to me why HDR matters?

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
So nearly all LCDs on the market are 8-bit panels with a constant-brightness backlight, which means they can only display 256 shades of gray. AFAIK, once the rendering is done the colours are converted by the video card to 8-bit RGB for display (whether LCD or CRT).

So basically, monitors make no provision for HDR... So right now the choices are:

A) Enable HDR, live with a crapload of blooming and inaccurate brightness because your monitor doesn't support HDR.

B) Low dynamic range, but correct color ranging.

So what exactly is the point of using HDR again? Marginally improved quality if you ignore the obvious errors in brightness?
 

AsianriceX

Golden Member
Dec 30, 2001
1,318
1
0
AFAIK, HDR is used to help simulate the human eye's natural pupil adjustment. So, say you were in a dark room for a while and suddenly you walk outside where it's very bright. Your pupil wouldn't adjust immediately to the brightness. Instead, it closes slowly, so the game simulates too much light entering your eyes (blooming) that gradually settles back to normal.
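
For what it's worth, here's a tiny sketch of how an engine might fake that adaptation by easing an exposure value toward whatever centres the scene's average luminance each frame. All the names and constants here are invented for illustration, not taken from any real engine:

#include <stdio.h>

/* Illustrative sketch: per-frame exposure adaptation.
   The exposure drifts toward a target derived from the scene's average
   luminance, so a sudden bright scene looks blown out at first and then
   settles down over a second or so -- similar to a pupil adjusting.
   All names and constants are made up for the example. */

static float adapt_exposure(float exposure, float avg_luminance, float dt)
{
    const float key  = 0.18f;               /* desired mid-grey level       */
    float target     = key / avg_luminance; /* exposure that centers scene  */
    const float rate = 1.5f;                /* adaptation speed (per second)*/
    /* move a fraction of the way toward the target each frame */
    return exposure + (target - exposure) * rate * dt;
}

int main(void)
{
    float exposure     = 0.18f / 0.05f; /* adapted to a dark room (avg lum 0.05) */
    float bright_scene = 4.0f;          /* suddenly step outside                 */

    for (int frame = 0; frame < 60; frame++) {
        exposure = adapt_exposure(exposure, bright_scene, 1.0f / 30.0f);
        if (frame % 10 == 0)
            printf("frame %2d: exposure %.3f\n", frame, exposure);
    }
    return 0;
}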
 
darkswordsman17

Mar 11, 2004
23,444
5,850
146
Originally posted by: RaynorWolfcastle
So nearly all LCDs on the market are 8-bit panels with a constant-brightness backlight, which means they can only display 256 shades of gray. AFAIK, once the rendering is done the colours are converted by the video card to 8-bit RGB for display (whether LCD or CRT).

So basically, monitors make no provision for HDR... So right now the choices are:

A) Enable HDR, live with a crapload of blooming and inaccurate brightness because your monitor doesn't support HDR.

B) Low dynamic range, but correct color ranging.

So what exactly is the point of using HDR again? Marginally improved quality if you ignore the obvious errors in brightness?

Well, current HDR is just meant to simulate that. They do it in such a way to make it seem like it's real HDR, when in fact they can't truly do it.

Valve talked about this in some article discussing Lost Coast.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: darkswordsman17
Originally posted by: RaynorWolfcastle
So nearly all LCDs on the market are 8-bit panels with a constant-brightness backlight, which means they can only display 256 shades of gray. AFAIK, once the rendering is done the colours are converted by the video card to 8-bit RGB for display (whether LCD or CRT).

So basically, monitors make no provision for HDR... So right now the choices are:

A) Enable HDR, live with a crapload of blooming and inaccurate brightness because your monitor doesn't support HDR.

B) Low dynamic range, but correct color ranging.

So what exactly is the point of using HDR again? Marginally improved quality if you ignore the obvious errors in brightness?

Well, current HDR is just meant to simulate that. They do it in such a way to make it seem like it's real HDR, when in fact they can't truly do it.

You can make a fair argument that doing it the way they do is better than simply crushing the values beyond the normal 'maximum' brightness (which is also an 'error', just a different type of error). It's not perfect, but neither is anything else in computer graphics. Unless you have a display with a near-unlimited dynamic range, you will need to use something like tone mapping.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: RaynorWolfcastle
So nearly all LCDs on the market are 8-bit panels with a constant-brightness backlight, which means they can only display 256 shades of gray. AFAIK, once the rendering is done the colours are converted by the video card to 8-bit RGB for display (whether LCD or CRT).

So basically, monitors make no provision for HDR... So right now the choices are:

A) Enable HDR, live with a crapload of blooming and inaccurate brightness because your monitor doesn't support HDR.

B) Low dynamic range, but correct color ranging.

So what exactly is the point of using HDR again? Marginally improved quality if you ignore the obvious errors in brightness?

I'm not sure you can expect a monitor sitting on your desktop to be able to provide zero light at all, or light equal to the brightness of the sun.
 

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
Originally posted by: Jeff7181
Originally posted by: RaynorWolfcastle
So nearly all LCDs on the market are 8-bit panels with a constant-brightness backlight, which means they can only display 256 shades of gray. AFAIK, once the rendering is done the colours are converted by the video card to 8-bit RGB for display (whether LCD or CRT).

So basically, monitors make no provision for HDR... So right now the choices are:

A) Enable HDR, live with a crapload of blooming and inaccurate brightness because your monitor doesn't support HDR.

B) Low dynamic range, but correct color ranging.

So what exactly is the point of using HDR again? Marginally improved quality if you ignore the obvious errors in brightness?

I'm not sure you can expect a monitor sitting on your desktop to be able to provide zero light at all, or light equal to the brightness of the sun.

Zero light at all will be able to happen with OLED... not the brightness of the sun, but I imagine OLED will be pretty bright.

Current implementations of HDR are good, I think. I liked Lost Coast and look forward to CS:S levels that support it... too bad Far Cry's HDR doesn't work very well.
 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
Originally posted by: Jeff7181
I'm not sure you can expect a monitor sitting on your desktop to be able to provide zero light at all, or light equal to the brightness of the sun.

Well no, but if you look at HDR screens from games that use it, you have clipping errors all over the place because the rendered dynamic range is well beyond what can actually be supported by the hardware.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: RaynorWolfcastle
Originally posted by: Jeff7181
I'm not sure you can expect a monitor sitting on your desktop to be able to provide zero light at all, or light equal to the brightness of the sun.

Well no, but if you look at HDR screens from games that use it, you have clipping errors all over the place because the rendered dynamic range is well beyond what can actually be supported by the hardware.

Screenshot? Proper tone mapping prevents the exact sort of 'clipping' you are talking about. That is, it maps the full dynamic range of the scene down to the 24-bit color range of the output device, rather than crushing or clipping it. This isn't perfect, either, but is a heck of a lot better (in terms of rendering overbright objects) than what you can do without any HDR capabilities.
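
To make that concrete, here's a toy sketch of the difference between clipping and a simple global tone-mapping curve (the x/(1+x) Reinhard-style operator is just one basic example; I'm not claiming any particular game uses exactly this). The sample values are made up:

#include <stdio.h>

/* Two ways to squeeze HDR luminance into an 8-bit output value.
   Clipping throws away everything above 1.0; the simple x/(1+x) curve
   compresses it instead, so 2.0 and 8.0 still end up as different
   shades. Purely illustrative values. */

static int quantize(float v)              /* clamp to [0,1], scale to 0..255 */
{
    if (v < 0.0f) v = 0.0f;
    if (v > 1.0f) v = 1.0f;
    return (int)(v * 255.0f + 0.5f);
}

static int clipped(float hdr)     { return quantize(hdr); }
static int tone_mapped(float hdr) { return quantize(hdr / (1.0f + hdr)); }

int main(void)
{
    float samples[] = { 0.1f, 0.5f, 1.0f, 2.0f, 8.0f, 50.0f };
    for (int i = 0; i < 6; i++)
        printf("hdr %6.1f -> clipped %3d, tone mapped %3d\n",
               samples[i], clipped(samples[i]), tone_mapped(samples[i]));
    return 0;
}

Note how 2.0, 8.0 and 50.0 all clip to 255, but stay distinguishable (roughly 170, 227 and 250) after tone mapping.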
 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
Originally posted by: Matthias99
Originally posted by: RaynorWolfcastle
Originally posted by: Jeff7181
I'm not sure you can expect a monitor sitting on your desktop to be able to provide zero light at all, or light equal to the brightness of the sun.

Well no, but if you look at HDR screens from games that use it, you have clipping errors all over the place because the rendered dynamic range is well beyond what can actually be supported by the hardware.

Screenshot? Proper tone mapping prevents the exact sort of 'clipping' you are talking about. That is, it maps the full dynamic range of the scene down to the 24-bit color range of the output device, rather than crushing or clipping it. This isn't perfect, either, but is a heck of a lot better (in terms of rendering overbright objects) than what you can do without any HDR capabilities.

One
Two
Three

In all three of these, half the scene is clipped because of insufficient dynamic range.



 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
I agree with what you're saying here, RW, and that's why I'm not a big fan of HDR. So despite all the hype lately over it, rest assured you are not the only one who thinks it's highly overrated.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Wow...... :)

First we need to take a look at how color values are dealt with- RGBA- Red, Green, Blue, Alpha. For each color component we are limited to 256 possible choices- that represents all of the dynamic range of color we are allowed using 32bit. Because of this we are forced to make a choice- make a game bright or make it dark- not both at the same time. With only 256 possible values per component you cannot have an area with both clear and distinct bright areas and dark areas- it can't be done.

So we move to HDR and go from 256 possible choices per component up to 65,536 choices per component. Obviously when you increase the possible color selections by a couple orders of magnitude, then double that, then add a little more for good measure, you increase the amount of range you can put into a scene by an enormous amount. The differences in these values may at first seem like they won't do any good as you can only transmit 24bit color to your monitor, but that ignores luminance, which is really what HDR is all about. For more simplistic versions of HDR with downsampled buffers you are relying on tone mapping to approximate values- much better than nothing, but not what you can get if you use a full float buffer.

When you start combining the factors at play, particularly the per component color values and luminance together in terms of outputting to a screen, you need to know that the display can handle it properly. In a realistic sense what we are currently calling HDR is actually a low dynamic range image even if it were being reproduced exactly; we need to move up to 128bit color before we hit 'real' HDR- but what we have now is still a couple of orders of magnitude better than what we have been looking at for years, provided your monitor has enough contrast. When talking about HDR and luminance factors, contrast is king when it hits the display.

For our current levels of HDR you need a monitor with a contrast ratio in the area of 4,000:1 to get a decent approximation- anything lower and you are simply cutting the data off (4,000:1 actual, not rated). If your monitor has an actual contrast ratio of 256:1 then you are peaking with standard rendering- HDR is useless.

It comes down to this: if you have an extremely poor display- some Wal-Mart caliber POS CRT or any LCD- then HDR is not going to do much of anything for you. LCDs are really disgustingly poor for any sort of accurate rendering anyway, but everyone knew that when they bought one, so that shouldn't upset them. Thankfully the end of those embarrassments to displays is around the corner- OLEDs are looking to be in the best position to use for general HDR displays; until then it is a decent CRT or you are wasting your time.
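
To put rough numbers on the 256-versus-65,536 point, here's a small sketch using the standard IEEE half-float (FP16) properties; the half-float figures are the textbook ones, everything else is just illustration:

#include <stdio.h>
#include <math.h>

/* Rough comparison of 8-bit fixed point vs 16-bit floating point (FP16)
   as storage for scene luminance. 8-bit gives 256 evenly spaced levels
   between 0 and a fixed "white"; FP16 (1 sign, 5 exponent, 10 mantissa
   bits) covers roughly 6.1e-5 to 65504 with ~0.1% relative precision.
   The figures are the standard IEEE half-float properties; the point is
   the shape of the trade-off, not exact numbers. */

int main(void)
{
    double fp16_max        = 65504.0;
    double fp16_min_normal = pow(2.0, -14);   /* ~6.1e-5 */

    printf("8-bit : 256 levels, usable range ~255:1\n");
    printf("FP16  : usable range ~%.1e:1, relative step ~%.2f%%\n",
           fp16_max / fp16_min_normal, 100.0 / 1024.0);

    /* relative step size at a few stored luminance values (white = 1.0) */
    double lums[] = { 0.01, 0.1, 1.0 };
    for (int i = 0; i < 3; i++) {
        double step8  = (1.0 / 255.0) / lums[i]; /* 8-bit step relative to value */
        double step16 = 1.0 / 1024.0;            /* FP16 mantissa step (approx)  */
        printf("luminance %.2f: 8-bit step %5.1f%% of value, FP16 step %.2f%%\n",
               lums[i], step8 * 100.0, step16 * 100.0);
    }
    return 0;
}

The takeaway: 8-bit steps become relatively huge in the dark end of the scene, while FP16 keeps roughly 0.1% precision across a range on the order of a billion to one.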
 

Steelski

Senior member
Feb 16, 2005
700
0
0
Originally posted by: otispunkmeyer
http://www.bit-tech.net/hardware/2005/10/04/brightside_hdr_edr/1.html

you want one of these for HDR

it can display maximum darkness and maximum bright whiteness at the same time.......i cant wait

http://www.bit-tech.net/hardware/2005/10/04/brightside_hdr_edr/8.html

demo room

MMMMMMMMMMMMMMMMMMMMMMMNIIIIIIIIIIIIIIIIIIIIIIIIIIIICEEEEEE
I wonder if it is implementable with OLED.

Does anyone want to loan me some money? About 49 thousand.......
This is truly the future.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
Originally posted by: RaynorWolfcastle
Originally posted by: Matthias99
Originally posted by: RaynorWolfcastle
Originally posted by: Jeff7181
I'm not sure you can expect a monitor sitting on your desktop to be able to provide zero light at all, or light equal to the brightness of the sun.

Well no, but if you look at HDR screens from games that use it, you have clipping errors all over the place because the rendered dynamic range is well beyond what can actually be supported by the hardware.

Screenshot? Proper tone mapping prevents the exact sort of 'clipping' you are talking about. That is, it maps the full dynamic range of the scene down to the 24-bit color range of the output device, rather than crushing or clipping it. This isn't perfect, either, but is a heck of a lot better (in terms of rendering overbright objects) than what you can do without any HDR capabilities.

One
Two
Three

In all three of these, half the scene is clipped because of insufficient dynamic range.

Maybe I don't understand what clipping is but I don't see anything wrong with those. Can you be more specific about what you're talking about? If I had to guess I'd assume you're talking about the overbright areas that have lost all detail. I don't see a problem with that though. Look into a dark shed on a sunny day and see what kind of detail you can make out in your peripheral vision on a beige house behind the shed.
 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
Originally posted by: BenSkywalker
Wow...... :)

First we need to take a look at how color values are dealt with- RGBA- Red, Green, Blue, Alpha. For each color component we are limited to 256 possible choices- that represents all of the dynamic range of color we are allowed using 32bit. Because of this we are forced to make a choice- make a game bright or make it dark- not both at the same time. With only 256 possible values per component you cannot have an area with both clear and distinct bright areas and dark areas- it can't be done.

So we move to HDR and go from 256 possible choices per component up to 65,536 choices per component. Obviously when you increase the possible color selections by a couple orders of magnitude, then double that, then add a little more for good measure, you increase the amount of range you can put into a scene by an enormous amount. The differences in these values may at first seem like they won't do any good as you can only transmit 24bit color to your monitor, but that ignores luminance, which is really what HDR is all about. For more simplistic versions of HDR with downsampled buffers you are relying on tone mapping to approximate values- much better than nothing, but not what you can get if you use a full float buffer.

When you start combining the factors at play, particularly the per component color values and luminance together in terms of outputting to a screen, you need to know that the display can handle it properly. In a realistic sense what we are currently calling HDR is actually a low dynamic range image even if it were being reproduced exactly; we need to move up to 128bit color before we hit 'real' HDR- but what we have now is still a couple of orders of magnitude better than what we have been looking at for years, provided your monitor has enough contrast. When talking about HDR and luminance factors, contrast is king when it hits the display.

For our current levels of HDR you need a monitor with a contrast ratio in the area of 4,000:1 to get a decent approximation- anything lower and you are simply cutting the data off (4,000:1 actual, not rated). If your monitor has an actual contrast ratio of 256:1 then you are peaking with standard rendering- HDR is useless.

It comes down to this: if you have an extremely poor display- some Wal-Mart caliber POS CRT or any LCD- then HDR is not going to do much of anything for you. LCDs are really disgustingly poor for any sort of accurate rendering anyway, but everyone knew that when they bought one, so that shouldn't upset them. Thankfully the end of those embarrassments to displays is around the corner- OLEDs are looking to be in the best position to use for general HDR displays; until then it is a decent CRT or you are wasting your time.

Except it really doesn't matter what kind of display you're using, because while video cards can internally handle HDR, AFAIK none of them actually output HDR to the display's framebuffer. From what I understand, while video cards can internally render using floating point notation, the framebuffer that's right before the DAC is a 24-bit integer buffer and outputs that same 16.7M colors over VGA regardless of your monitor. The same obviously goes for any LCD over DVI, because those monitors expect integer 24-bit colour information over the DVI link.

Tone mapping does help in that it tries to map an infinite dynamic range to a very limited range, but now you only have very coarse quantization of this space. You're still constrained to the same 256 gray levels you've always had, you're just using them differently. This is what results in the lack of detail, regardless of your monitor's contrast ratio.

Now this isn't to say that HDR is useless; if you had reworked a monitor's color space to be floating point then HDR would work. Quite simply this is not the case today, and doing so would require a reworking of how mainstream video cards interface with monitors.

As a side note, it really doesn't matter what the monitor's contrast ratio is as far as getting correct lighting is concerned; HDR as it is defined here only changes HOW the monitor's different brightness levels are accessed. If you have a monitor that has 1,000,000:1 contrast ratio and you're using 24-bit colour (excluding alpha) you have access to a huge dynamic range in terms of brightness, but your quantization is very coarse. On the other hand, if your monitor has 100:1 contrast ratio, you have a very fine quantization but low dynamic range in terms of brightness.
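
To illustrate the coarse quantization I'm talking about, here is a toy example using the same simple x/(1+x) curve mentioned above; the specific numbers are invented:

#include <stdio.h>

/* Sketch of the quantization point: once a wide range of scene luminance
   is tone mapped (here with the simple x/(1+x) curve) and rounded to 8
   bits, physically different brightness levels can collapse onto the
   same output code. Values are illustrative. */

static int tone_map_to_8bit(float hdr)
{
    float mapped = hdr / (1.0f + hdr);          /* compress to 0..1 */
    return (int)(mapped * 255.0f + 0.5f);
}

int main(void)
{
    /* two bright sources differing by roughly 17% in radiance */
    float a = 60.0f, b = 70.0f;
    printf("hdr %.1f -> code %d\n", a, tone_map_to_8bit(a));
    printf("hdr %.1f -> code %d\n", b, tone_map_to_8bit(b));
    /* both land on the same 8-bit code: the highlight detail is gone,
       no matter how good the monitor's contrast ratio is */
    return 0;
}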
 

rbV5

Lifer
Dec 10, 2000
12,632
0
0
You can use it now to increase the dynamic range of your photographs. Photoshop now supports some HDR functions so that you can merge different exposures to produce images otherwise not possible.

Link

fixed link
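
As a very rough sketch of what that per-pixel exposure merge does (hugely simplified - real tools also estimate the camera's response curve first; this assumes a linear sensor and a made-up weighting):

#include <stdio.h>

/* Simplified sketch of merging bracketed exposures: each exposure gives
   an estimate of scene radiance (pixel value / exposure time), and the
   estimates are combined with weights that favour well-exposed pixels
   over clipped or near-black ones. Assumes a linear sensor response,
   which is a big simplification of what real tools do. */

static double weight(unsigned char v)
{
    /* triangle weight: trust mid-tones, distrust near-black and near-white */
    return v <= 127 ? v : 255 - v;
}

static double merge_pixel(const unsigned char *values, const double *exposure,
                          int n)
{
    double num = 0.0, den = 0.0;
    for (int i = 0; i < n; i++) {
        double w = weight(values[i]);
        num += w * (values[i] / exposure[i]);   /* radiance estimate */
        den += w;
    }
    return den > 0.0 ? num / den : 0.0;
}

int main(void)
{
    /* same scene point shot at 1/30s and 1/500s (relative exposures) */
    unsigned char values[]  = { 255, 96 };      /* clipped in the long exposure */
    double exposures[] = { 1.0 / 30.0, 1.0 / 500.0 };
    printf("merged relative radiance: %.1f\n",
           merge_pixel(values, exposures, 2));
    return 0;
}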
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
? to this discussion

As I understand it, HDR means graphics cards internally use longer numbers (more bits per float) to let them calculate more complex lighting effects without running into rounding errors. To us it means they can now simulate lighting effects such as bloom, which more accurately reflect the way our eyes respond to light.

This has nothing to do with the monitor; the output is still 24-bit colour and that's fine. You don't need a monitor that can be as bright as the sun - it's not the monitor that's doing HDR (e.g. individual pixels so bright they blind you a bit), it's the graphics card; the monitor just displays the result as it always has.

As to clipping, I presume you mean bits of the scene look too white - they're meant to; if you stare at something really bright your eyes do that. The pictures are correctly showing what the HDR is trying to do.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Except it really doesn't matter what kind of display you're using, because while video cards can internally handle HDR, AFAIK none of them actually output HDR to the display's framebuffer.

They do output HDR to the display framebuffer now if it is requested- NV4x, G70, and R420 and up I believe (I know the R520 based parts do) all support this functionality. There are also DVI displays that input FP16 natively and display it that way (see BrightSide; requires dual link obviously, as 'normal' DVI is far too primitive).

The same obviously goes for any LCD over DVI because those monitors expect integer 24-bit colour information over the DVI link.

LCDs, of course, are junk displays incapable of showing that level of detail, so why bother?

Now this isn't to say that HDR is useless; if you had reworked a monitor's s color space to be floating point then HDR would work. Quite simply this is not the case today and would require a reworking of how mainstream video cards interface with monitors.

Use the HDR data to modify the luminance during scan out.

As a side note, it really doesn't matter what the monitor's contrast ratio is as far as getting correct lighting is concerned

It absolutely does. If your monitor can't handle the variance in luminance the scene has, then you will not get proper lighting displayed in front of you- there is no way around this.
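
On the "modify the luminance during scan out" point, here's one rough sketch of the dual-modulation idea in the spirit of the BrightSide display mentioned above: a coarse backlight level multiplied by a fine LCD transmittance. The constants are invented, and real hardware works per backlight zone rather than per pixel:

#include <stdio.h>

/* Sketch of dual modulation: an LED backlight supplies a coarse
   luminance level and the LCD in front of it supplies fine modulation,
   so the product covers a far wider range than either layer alone.
   All constants here are made up for illustration. */

#define BACKLIGHT_MAX 4000.0   /* peak backlight luminance, cd/m^2 (made up) */
#define BACKLIGHT_MIN    1.0
#define PANEL_CONTRAST 400.0   /* LCD transmits between 1/400 and 1          */

int main(void)
{
    double targets[] = { 0.05, 1.0, 80.0, 400.0, 3000.0 };  /* desired cd/m^2 */

    for (int i = 0; i < 5; i++) {
        double L = targets[i];
        /* pick a backlight level that keeps the panel in its usable range */
        double backlight = L * 20.0;                 /* aim for mid-range    */
        if (backlight > BACKLIGHT_MAX) backlight = BACKLIGHT_MAX;
        if (backlight < BACKLIGHT_MIN) backlight = BACKLIGHT_MIN;
        double t = L / backlight;                    /* panel transmittance  */
        printf("target %7.2f cd/m^2 -> backlight %7.1f x panel %.4f\n",
               L, backlight, t);
    }
    return 0;
}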
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
I think people are misunderstanding what HDR is actually supposed to accomplish. It's not supposed to create realistic lighting... it's supposed to simulate realistic lighting. If a monitor was capable of producing every intensity level from absolutely zero light to the brightness of the sun, it wouldn't be an issue. We wouldn't need this type of HDR. But that's not possible (yet), practical, or safe. Imagine people being blinded because they looked at a scene in a video game too long and experienced the same effect as if they looked at the sun for too long.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: BenSkywalker
It comes down to this: if you have an extremely poor display- some Wal-Mart caliber POS CRT or any LCD- then HDR is not going to do much of anything for you. LCDs are really disgustingly poor for any sort of accurate rendering anyway, but everyone knew that when they bought one, so that shouldn't upset them. Thankfully the end of those embarrassments to displays is around the corner- OLEDs are looking to be in the best position to use for general HDR displays; until then it is a decent CRT or you are wasting your time.

I wouldn't say 'any' LCD. What about the BrightSide display? Some LCDs with LED backlighting best CRTs AFAIK, or at least the BrightSide does (that's what the article sounded like). I know it's $50,000 but it's still an LCD, and hopefully it will come down in price. What is the contrast ratio of a decent AG CRT (but not some $50,000 specialized beast, just a consumer AG CRT)? Or do they even have CRTs that display a much vaster range than any of the best consumer AG CRTs? The Samsung 770P S-PVA LCD was measured at 230 / 0.2 cd/m²=1150:1 (marketed "1500:1"). I'm planning on getting one of these babies for X-mas and I want to see how that compares to other CRTs. Is a consumer shadow mask CRT (only ones still being produced) even any better than that?

http://www.behardware.com/articles/594-...ng-syncmaster-770p-pva-6ms-1500-1.html
As usual we measured the monitor's real contrast ratio to compare it to Samsung's 1500:1 claim. Before and after calibration, white is around 230 cd/m². For the first time we weren't able to measure darkness as it was too deep for our first instrument and gives us a zero result. The second tool, which is supposed to be more precise, was unable to go below 0.2 cd/m². The final result is then "0.2" but we are probably below this. Based on this measurement we can only say that the contrast ratio is at least 230 / 0.2 cd/m², or 1150:1.

The NEC FE990 CRT only has a brightness of 100 cd/m² according to many sites including this one:
http://www.monitoroutlet.com/782013.html

I couldn't find specs for it on NEC's site because it was discontinued. Not sure if it's any good or whether it's a shadow mask or aperture grille.

Mitsubishi Diamondtron UWG RDF225WG:
http://www.necdisplay.com/corpus/L/U/RDF225WG_Brochure_0804.pdf

Even this $5000 Diamondtron only has a brightness of 100 cd/m². What's with that? Does the black level between 0.2 cd/m² and 0.0 cd/m² make that big of a difference? Wouldn't that mean the LCD could display far brighter at 230 cd/m², or am I completely misunderstanding something (or is it measured differently)?
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
I thought the argument that LCDs sucked was based on their color precision, not their contrast ratios. IIRC, LCDs beat CRTs in contrast ratios (mostly by being able to be far brighter, though usually they don't handle darks anywhere near as well), but their color precision sucks. That said, Lost Coast looked quite nice on my 6-bit LCD, so I don't think it's quite a waste to output HDR to an LCD.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: Fox5
I thought the argument that LCDs sucked was based on their color precision, not their contrast ratios. IIRC, LCDs beat CRTs in contrast ratios (mostly by being able to be far brighter, though usually they don't handle darks anywhere near as well), but their color precision sucks. That said, Lost Coast looked quite nice on my 6-bit LCD, so I don't think it's quite a waste to output HDR to an LCD.

Color precision is hardly anything "god awful". http://www.behardware.com/articles/594-...ng-syncmaster-770p-pva-6ms-1500-1.html

Besides that, this LCD hits a 0.2 cd/m² black level while a CRT hits very close to 0. But is it that significant? I'm looking at probably at least 150 cd/m² out of my backlight now and it's hardly bright at all. I really doubt you could tell the difference between 0 and 0.2. So maybe you could... is it going to make that much of a difference? You don't see pure black that often when gaming.
 

Nextman916

Golden Member
Aug 2, 2005
1,428
0
0
Would you guys ever want panels to be as bright as the sun??? I mean, you would be blind after an hour of gaming; even if it's not direct, it would be a huge strain on your eyes.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: Nextman916
Would you guys ever want panels to be as bright as the sun??? I mean, you would be blind after an hour of gaming; even if it's not direct, it would be a huge strain on your eyes.

No, I sure wouldn't want that. There are people that can't even stand current LCDs. They said the Brightside LCD had that effect. I'd be happy with a very high contrast ratio and a lower brightness like 500 cd/m².
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Except it really doesn't matter what kind of display you're using, because while video cards can internally handle HDR, AFAIK none of them actually output HDR to the display's framebuffer.
The display doesn't have a framebuffer, only the GPU does.

From what I understand, while video cards can internally render using floating point notation, the framebuffer that's right before the DAC is a 24-bit integer buffer and outputs that same 16.7M colors over VGA regardless of your monitor.
Sure, but the data was composed from FP/shader blending and hence it has a much higher dynamic range, due to fewer truncation/rounding issues on its way to the framebuffer. This still makes a difference regardless of the integer storage at the end.

if you had reworked a monitor's color space to be floating point then HDR would work.
No reworking is required to make HDR viable. When done right it looks great (Painkiller, Riddick, HL2, etc.).

In all three of these, half the scene is clipped because of insufficient dynamic range.
If half of the scene was clipped then it wouldn't be rendered at all, which is clearly not the case.
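
And on the FP/shader blending point above, a toy illustration of why blending in float and quantizing once at the end beats quantizing after every pass; the contribution size and pass count are made up:

#include <stdio.h>

/* Sketch of why blending in floating point matters even if the final
   framebuffer is 8-bit integer: many small light contributions that
   each round to zero in an 8-bit buffer survive in a float buffer and
   only get quantized once, at the end. Numbers are illustrative. */

int main(void)
{
    const int   passes       = 200;
    const float contribution = 0.001f;  /* each pass adds 0.1% of full white */

    /* 8-bit accumulation: quantize after every blend */
    unsigned char acc8 = 0;
    for (int i = 0; i < passes; i++) {
        int v = (int)(acc8 + contribution * 255.0f + 0.5f);
        acc8 = (unsigned char)(v > 255 ? 255 : v);
    }

    /* float accumulation: quantize once at scan-out */
    float accf = 0.0f;
    for (int i = 0; i < passes; i++)
        accf += contribution;
    int final8 = (int)(accf * 255.0f + 0.5f);

    printf("8-bit blending: %d\n", acc8);   /* each tiny pass rounds to nothing */
    printf("float blending: %d\n", final8); /* the full total survives          */
    return 0;
}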