I meant the output buffer right before either the DAC or the DVI connection.

The display doesn't have a framebuffer, only the GPU does.

Except it really doesn't matter what kind of display you're using, because while video cards can internally handle HDR, AFAIK none of them actually output HDR to the display's framebuffer.
You still have the exact same dynamic range as before, you do realize that, right? Your CRT still gets voltage signals in the same range over VGA, and your LCD still gets the same numbers over DVI (excluding the Brightside LCD, which no one actually has).

Sure, but the data was composed from FP/shader blending and hence has a much higher dynamic range, due to fewer truncation/rounding issues and the like, on its way to the framebuffer. This still makes a difference regardless of the integer storage at the end.

From what I understand, while video cards can internally render using floating-point notation, the framebuffer that's right before the DAC is a 24-bit integer buffer and outputs those same 16.7M colors over VGA regardless of your monitor.
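To make the integer-storage point concrete, here's a minimal sketch (my own numbers, not anything from the thread) of what happens when FP blend results get written to an 8-bit-per-channel buffer: values that differ in floating point collapse onto the same one of 256 codes.

#include <stdio.h>
#include <math.h>

int main(void)
{
    float a = 0.501f;                        /* FP blend result #1          */
    float b = 0.503f;                        /* FP blend result #2          */
    unsigned fa = (unsigned)lroundf(a * 255.0f);
    unsigned fb = (unsigned)lroundf(b * 255.0f);
    printf("%f -> %u\n", a, fa);             /* 128                         */
    printf("%f -> %u\n", b, fb);             /* 128: distinct FP values,
                                                same 8-bit framebuffer code */
    return 0;
}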
From the screenshots I've seen, it looks nice sometimes, and sometimes it looks horrid. Look at the screenshots I posted, they look horrid.

No reworking is required to make HDR viable. When done right it looks great (Painkiller, Riddick, HL2, etc.).

If you had reworked a monitor's color space to be floating point, then HDR would work.
Clipping in the DSP sense, not in the 3D pipeline sense. Clipping in the sense that there's insufficient dynamic range, so you have a crapload of quantization error. You see how half the scene is completely white? That's because HDR is useless in a 24-bit integer color space.

If half of the scene was clipped then it wouldn't be rendered at all, which is clearly not the case.

In all three of these, half the scene is clipped because of insufficient dynamic range.
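And here's the clipping half of it, again with made-up luminances: anything the shader computes above 1.0 lands on the same full-white code, which is why the bright half of the scene turns into one solid blob in a 24-bit integer color space.

#include <stdio.h>
#include <math.h>

static unsigned store(float v)               /* clamp + round to 8 bits */
{
    if (v > 1.0f) v = 1.0f;                  /* the clip                */
    if (v < 0.0f) v = 0.0f;
    return (unsigned)lroundf(v * 255.0f);
}

int main(void)
{
    float wall = 0.8f, sky = 6.0f, sun = 50.0f;  /* hypothetical scene values */
    printf("wall %u  sky %u  sun %u\n", store(wall), store(sky), store(sun));
    /* wall 204, sky 255, sun 255: the sky and the sun become indistinguishable */
    return 0;
}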
Yes, you will get proper lighting with a very compressed dynamic range. You can split the range 0 to 1 with fp32 and get "correct" lighting in the same sense that you can split the range 0 to 1 billion with fp32 and get "correct" lighting. Obviously you have better dynamic range with the larger range, but both are "correct". There's no absolute measure of brightness once the monitors are calibrated (at the factory and by you).

It absolutely does. If your monitor can't handle the variance in luminance the scene has, then you will not get proper lighting displayed in front of you; there is no way around this.
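For what it's worth, here's a sketch of one standard way to do that compression (the Reinhard operator; my example, not something anyone named in this thread). It shows the sense in which both ranges can be displayed "correctly": relative brightness survives the mapping into [0,1), absolute brightness doesn't.

#include <stdio.h>

static float reinhard(float lum)
{
    return lum / (1.0f + lum);               /* maps [0, inf) into [0, 1) */
}

int main(void)
{
    float samples[] = { 0.01f, 1.0f, 100.0f, 1e9f };   /* huge dynamic range */
    for (int i = 0; i < 4; i++)
        printf("%g -> %f\n", samples[i], reinhard(samples[i]));
    return 0;
}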
BTW, mainstream CRTs and LCDs didn't just suddenly acquire new dynamic range, nor was it suddenly "unlocked" by HDR. For VGA output, each sub-pixel receives a voltage that corresponds to the range [0, Vmax], which is exactly the range you had before video cards started displaying HDR. Same with LCDs: they still have the exact same dynamic range as they previously had; monitors don't suddenly acquire new ways to display images.
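Rough illustration of that fixed range (the 0.7 V figure is the nominal full-scale VGA video level, my assumption rather than anything stated above): the RAMDAC can only drive one of 256 voltages per channel, all inside [0, Vmax], no matter what the card computed internally.

#include <stdio.h>

int main(void)
{
    const double Vmax = 0.7;                      /* nominal VGA full-scale voltage */
    for (int code = 0; code < 256; code += 85)    /* a few of the 256 possible codes */
        printf("code %3d -> %.4f V\n", code, (code / 255.0) * Vmax);
    return 0;
}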
I'm not saying that my monitor should be so bright that I have to wear sunglasses to watch it; I'm just saying that the HDR screenshots I've seen are for the most part ridiculously overexposed to show off a "blooming" effect. If Brightside uses fp16 to modulate brightness levels, then that's great; that's how HDR *should* be done.