Originally posted by: BenSkywalker
Wow......
First we need to take a look at how color values are handled- RGBA: Red, Green, Blue, Alpha. With 32bit color, each component is limited to 256 possible values- that is the entire dynamic range of color we get to work with. Because of this we are forced to make a choice- make a game bright or make it dark- not both at the same time. With only 256 possible values per component you cannot have an area with both clear and distinct bright areas and dark areas- it can't be done.
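As a rough sketch of what that 256-value ceiling means (my own illustration, not from the original post): once you quantize to an 8-bit channel, everything has to fit into 256 steps between 0.0 and 1.0, so dark detail gets crushed and anything brighter than "white" simply clips.

```python
# Illustration only: encoding linear scene values into an 8-bit channel.

def encode_8bit(linear_value):
    """Quantize a linear color value into one of 256 levels (0-255)."""
    clamped = min(max(linear_value, 0.0), 1.0)  # anything above 1.0 is simply cut off
    return round(clamped * 255)

# A dim interior detail and a sunlit window might differ by 1000x in real
# luminance, but after encoding they are squeezed into the same 0..255 scale:
print(encode_8bit(0.002))   # 1   -> dark detail crushed to almost nothing
print(encode_8bit(0.5))     # 128
print(encode_8bit(2.0))     # 255 -> brighter than 1.0, identical to...
print(encode_8bit(50.0))    # 255 -> ...something 25x brighter still
```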
So we move to HDR and go from 256 possible values per component up to 65,536 per component. When you increase the number of possible values per component by a factor of 256, you increase the amount of range you can put into a scene by an enormous amount. At first it may seem like that does no good, since you can only transmit 24bit color to your monitor, but that ignores luminance, which is really what HDR is all about. With the more simplistic versions of HDR that use downsampled buffers you are relying on tone mapping to approximate values- much better than nothing, but not what you can get if you use a full float buffer.
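To make the tone mapping step concrete, here is a minimal sketch of compressing unbounded float luminance down to the 0..1 range a monitor expects. The Reinhard curve used here is just one common operator I picked for illustration; the post doesn't name a specific one.

```python
# Sketch: mapping HDR (float, unbounded) luminance to an 8-bit display value.

def reinhard_tonemap(hdr_luminance):
    """Compress an unbounded luminance value into the 0..1 range."""
    return hdr_luminance / (1.0 + hdr_luminance)

def to_display_8bit(hdr_luminance):
    """Tone map, then quantize to the 8-bit value the monitor receives."""
    return round(reinhard_tonemap(hdr_luminance) * 255)

# Bright values keep some separation instead of all clipping to 255:
for L in (0.5, 2.0, 10.0, 50.0):
    print(L, "->", to_display_8bit(L))
# 0.5 -> 85, 2.0 -> 170, 10.0 -> 232, 50.0 -> 250
```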
When you start combining the factors at play- the per-component color values and the luminance- and outputting them to a screen, you need to know that the display can handle it properly. Realistically, what we are currently calling HDR is actually still a low dynamic range image even if it were being reproduced exactly; we would need to move up to 128bit color before we hit 'real' HDR. But what we have now is still a couple of orders of magnitude better than what we have been looking at for years, provided your monitor has enough contrast. When talking about HDR and luminance, contrast is king once it hits the display.
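To put a rough number on that "couple of orders of magnitude", here is some back-of-the-envelope arithmetic (mine, not the poster's) comparing how many doublings of brightness each format can span between its smallest and largest non-zero representable value. The FP16 figures assume the usual half-float normal range.

```python
import math

# Approximate usable range, expressed in stops (doublings of brightness).
int8_ratio = 255 / 1          # 8-bit integer: 255 steps above black
fp16_ratio = 65504 / 2**-14   # FP16 half float: max normal / min normal

print("8-bit:", round(math.log2(int8_ratio), 1), "stops")   # ~8 stops
print("FP16: ", round(math.log2(fp16_ratio), 1), "stops")   # ~30 stops
```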
For our current levels of HDR you need a monitor with a contrast ratio in the area of 4,000:1 to get a decent approximation- anything lower and you are simply cutting the data off (4,000:1 actual, not rated). If your monitor has an actual contrast ratio of 256:1 then you are already peaking with standard rendering- HDR is useless.
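A quick illustration of the cutoff argument (numbers and function are my own, for the sake of example): the display's actual contrast ratio caps the bright-to-dark ratio it can reproduce, and anything in the image beyond that range gets clamped to the panel's black or white level.

```python
import math

def displayable_stops(contrast_ratio):
    """Stops (doublings of brightness) a display can actually reproduce."""
    return math.log2(contrast_ratio)

print(round(displayable_stops(256), 1))    # ~8 stops  -> no better than standard 8-bit output
print(round(displayable_stops(4000), 1))   # ~12 stops -> room for a useful HDR approximation
```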
It comes down to this: if you have an extremely poor display- some Wal-Mart caliber POS CRT or any LCD- then HDR is not going to do much of anything for you. LCDs are really disgustingly poor for any sort of accurate rendering anyway, but everyone knew that when they bought one, so that shouldn't upset them. Thankfully the end of those embarrassments to displays is around the corner- OLEDs look to be in the best position for general HDR displays. Until then it is a decent CRT or you are wasting your time.