The Radeon X800 can do HDR; in fact, any good DX9 card can do HDR using an integer 4.12 fixed-point format (except the GeForce FX). The main problem is that this path was never widely adopted. Even though its precision is lower than FP16-based HDR, the difference is quite hard to spot on screen.

Have you ever compared Far Cry's HDR with Half-Life 2's? Half-Life 2's HDR looks better to me because it has a more natural look. What I don't like is that when it overbrights, the color and detail of the area tend to get lost; that's why we needed FP16, which doesn't have that problem. FP16 is nice too, though it sometimes makes scenes a little too colorful for my taste, but it adds a certain realism in dark places.

Oblivion really looks beautiful with HDR on. I played it on the X800 XT PE with a mod that enabled HDR through pixel shaders, and now I use the X1950XT with normal FP16 HDR. The only differences I noticed are that on the X800 XT PE, the lower precision of the 96-bit pixel shaders created some color banding in the sky (it seems 96-bit precision is good for everything except HDR and offline rendering), and in some places the overbright isn't as strong. On the X1950XT, the HDR looks more convincing, with a wider gamut of colors in the HDR maps.
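To make the precision trade-off concrete, here's a small Python sketch (my own illustration, not anything from the games themselves) comparing quantization in a 4.12 fixed-point format against an IEEE 754 half-precision (FP16) round-trip. The `to_fixed_4_12` encoding is an assumed, simplified model of the integer HDR idea:

```python
import struct

def to_fixed_4_12(x):
    """Quantize to a hypothetical 4.12 fixed-point format:
    4 integer bits, 12 fractional bits -> a uniform step of
    1/4096, clamped to a max of 65535/4096 (just under 16.0)."""
    q = round(max(0.0, x) * 4096)
    return min(q, 0xFFFF) / 4096

def to_fp16(x):
    """Round-trip through IEEE 754 half precision (struct format 'e').
    Range extends to 65504, but the step size grows with magnitude:
    near 8.0 the FP16 step is already 1/128, coarser than 1/4096."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# In the [0, 16) range, the fixed-point steps are actually finer
# than FP16's, which is why the two are hard to tell apart on
# screen. FP16's advantage is headroom: fixed point clips every
# value above ~16.0, while FP16 keeps going, so overbright areas
# can retain color and detail instead of saturating.
for v in (0.5, 8.001, 100.0):
    print(f"{v:>8}  fixed: {to_fixed_4_12(v):<14}  fp16: {to_fp16(v)}")
```

Note that this only models the storage format; the banding the mod showed in the sky would come from the shader arithmetic precision (the X800's 96-bit, i.e. FP24-per-component, pixel pipeline), which this sketch doesn't simulate.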