Something to note: I expect the renderer in games will have to be recoded to take advantage of HDR. As I understand it, renderers right now have to turn what is essentially full-range output from the code into something that looks good on limited-dynamic-range monitors. If the renderer knew it was outputting to a full HDR display, it should in theory be able to pass that full-range output straight through, but it won't do that unless someone has specifically coded it to.
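Rough sketch of what that final output stage could look like, just to show the shape of it. The function names, the Reinhard curve for the SDR path, and the ST 2084 (PQ) encode for the HDR path are my own picks for illustration, not how any particular engine does it:

```cpp
#include <cmath>

// Hypothetical output stage of a renderer that knows whether the display is
// SDR or HDR. Names and constants are illustrative only.

// SDR path: squash scene-referred luminance into the 0..1 range that an SDR
// monitor (roughly 0-100 nits) expects, here with a simple Reinhard operator.
float toneMapSdr(float sceneNits) {
    float v = sceneNits / 100.0f;   // treat 100 nits as "white"
    return v / (1.0f + v);          // asymptotically approaches 1, never clips
}

// HDR path: no squashing; encode absolute nits with the SMPTE ST 2084 (PQ)
// transfer function that HDR10 displays understand.
float encodePq(float sceneNits) {
    const float m1 = 2610.0f / 16384.0f;
    const float m2 = 2523.0f / 4096.0f * 128.0f;
    const float c1 = 3424.0f / 4096.0f;
    const float c2 = 2413.0f / 4096.0f * 32.0f;
    const float c3 = 2392.0f / 4096.0f * 32.0f;
    float y  = std::fmin(sceneNits, 10000.0f) / 10000.0f;  // PQ tops out at 10,000 nits
    float yp = std::pow(y, m1);
    return std::pow((c1 + c2 * yp) / (1.0f + c3 * yp), m2);
}

float outputStage(float sceneNits, bool displayIsHdr) {
    return displayIsHdr ? encodePq(sceneNits)    // pass the wide range through
                        : toneMapSdr(sceneNits); // compress it for an SDR panel
}
```

The point is just that the HDR branch has to exist in the code; nothing about an HDR monitor makes the old SDR path suddenly output the full range.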
HDR is just high dynamic range, right? Maybe I didn't read any of the articles on it, but I know Oblivion (2006) had an option to enable HDR, and that was 10 years ago. So how is this any better/different?
HDRR is what we currently have - high dynamic range rendering. The game takes the darkest (e.g. 0 nits) and brightest (e.g. 23,000 nits) parts of any given scene in real time and effectively normalizes it all to fit within the range of 0 to 100 nits. Every game/engine probably uses a different rendering method to achieve this, but those HDR values are never sent to the display. The display only gets a standard 8-bit or 10-bit video stream (no HDR metadata) and interprets it according to that display's minimum and maximum brightness (usually calibrated for 0-100 nits). This is an oversimplification - see here for more details about methods.
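To make the normalization idea concrete, here's a toy version of it (my own simplification, not any engine's actual method): find the frame's peak luminance and squash everything into 0-100 nits with a simple curve.

```cpp
#include <algorithm>
#include <vector>

// Rough illustration of "normalize the scene into 0-100 nits".
// Real engines use smarter auto-exposure and tone curves; this only
// shows the shape of the operation, done per frame ("in real time").
std::vector<float> normalizeToSdr(const std::vector<float>& sceneNits) {
    // Find the brightest value in this frame.
    float peak = 1.0f;
    for (float v : sceneNits) peak = std::max(peak, v);

    std::vector<float> out;
    out.reserve(sceneNits.size());
    for (float v : sceneNits) {
        float x = v / peak;                          // 0..1 relative to this frame's peak
        float curved = x / (x + 0.15f) * 1.15f;      // simple knee so midtones aren't crushed
        out.push_back(std::clamp(curved, 0.0f, 1.0f) * 100.0f);  // back to nits, capped at 100
    }
    return out;
}
```

Everything the display receives has already been flattened into that 0-100 nit window, which is why it never sees the original 23,000-nit highlight.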
I would love to see every HDRR game get an update to pass along that HDR data to the display, but I'm not sure if that would require a change to the engine (likely), the API (likely), or the game itself (likely). Most likely we'll see an avalanche of "Remastered in HDR!!!1!" game releases in the next couple years.
It's analogous to DSR/VSR, where the GPU renders a scene at 4K but then downsamples it to 1920x1080 because that's the highest resolution your monitor can display. There is a visible improvement, but it's not the same as the real thing.
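For that DSR/VSR analogy, the downsampling step is basically this (a naive 2x2 box filter over a single-channel image; actual drivers use smarter filters, but the idea is the same):

```cpp
#include <vector>

// Average each 2x2 block of a 3840x2160 buffer down to one 1920x1080 pixel.
std::vector<float> downsample4kTo1080p(const std::vector<float>& src) {
    const int srcW = 3840;
    const int dstW = 1920, dstH = 1080;
    std::vector<float> dst(dstW * dstH);
    for (int y = 0; y < dstH; ++y) {
        for (int x = 0; x < dstW; ++x) {
            // Four source pixels collapse into one output pixel.
            float sum = src[(2 * y)     * srcW + 2 * x]
                      + src[(2 * y)     * srcW + 2 * x + 1]
                      + src[(2 * y + 1) * srcW + 2 * x]
                      + src[(2 * y + 1) * srcW + 2 * x + 1];
            dst[y * dstW + x] = sum * 0.25f;
        }
    }
    return dst;
}
```

The extra detail influences the final image, but the monitor still only ever receives 1080p, just like an SDR display only ever receives the tone-mapped 0-100 nit version of an HDRR scene.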