Originally posted by: jiffylube1024
			Originally posted by: AnnoyedGrunt
So, what is HDR doing that is so complex?  Looking @ the screenshots makes it look like they simply increased the brightness in some areas, and then gradually blended back down to the "standard" brightness in the dark areas.  Couldn't a similar look be given to the game by modifying the lighting parameters?
I agree that it looks pretty cool, but I don't really see why it couldn't look the same in the "non-HDR" version, if you wanted it to look that way.  Perhaps it is the way the brightness blends from light to dark that is special?
Can anyone clarify what makes HDR different than just "really bright highlights on certain areas"?
Thanks,
D'oh!
No, HDR is not just "really brightening the colour"; it is MUCH more than that.  HDR is a new way of rendering colour, and its purpose is to eliminate the limitations of a fixed range of integer colour values by replacing them with floating-point values (AFAIK).  It doubles the bits per channel: 16 bits for each colour channel (Red, Green, Blue and Alpha), i.e. 64 bits per pixel versus the standard 32-bit format's 8 bits per channel, and this uses an insane amount of bandwidth and GPU power.
With HDR, you can get an essentially 'unlimited' range of colour values to represent the huge variations in brightness and contrast of light in the real world.
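To make the difference concrete, here's a rough sketch in Python (just a CPU-side illustration, not actual GPU code; I'm using numpy's float16 as a stand-in for a 16-bit half-float channel):
# Why float colour keeps detail that 8-bit integer colour throws away.
# Values are linear light: 1.0 is an ordinary bright surface, anything
# above it is a highlight the display can't show directly.
import numpy as np

sky = 1.0     # ordinary bright surface
sun = 40.0    # direct sunlight, far brighter than the display can show

def to_8bit(x):
    # clamp to [0, 1], then quantise to 0..255 like a normal 32-bit framebuffer
    return int(round(min(max(x, 0.0), 1.0) * 255))

print(to_8bit(sky), to_8bit(sun))        # 255 255  -> identical, highlight lost

# A 16-bit float channel keeps the ratio, so a later exposure/tone-mapping
# pass can still decide how to display it.
print(np.float16(sky), np.float16(sun))  # 1.0 40.0 -> still distinct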
HDR in Far Cry uses the OpenEXR standard, developed by Industrial Light & Magic and currently only supported on the GeForce 6800 series of cards.
Here are a couple of bits of information on Nvidia's HDR in the 6800 series cards: neoseeker and Xbitlabs.
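For the curious, the 'half' format OpenEXR is built around is a 16-bit float: 1 sign bit, 5 exponent bits, 10 mantissa bits. numpy's float16 has the same layout, so you can poke at it yourself (again, just me illustrating the format, nothing from the actual Far Cry/Nvidia code):
import numpy as np

print(np.finfo(np.float16).max)              # 65504.0, far beyond 8-bit's 0..255
print(np.finfo(np.float16).tiny)             # ~6.1e-05, smallest normal value
print(np.float16(1000.0) + np.float16(0.1))  # 1000.0, the 0.1 is rounded away

# Huge range, modest precision (about 3 decimal digits): a good match for light
# intensity, where the ratio between values matters far more than the 4th digit.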
Essentially, this is exactly what John Carmack was talking about when he was saying that 32-bit colour was insufficient for photorealistic colour in games and that the next step would have to be taken.
HDR cannot be done just by 'adjusting the brightness.'  That said, ATI has supported a form of HDR since the 9700 series; ATI's HDR is of lower quality than OpenEXR/64-bit ... 48 bits perhaps (12 per channel?).
It still looks great in the early screenshots of HL2 and in the Pixel Shader/Shader Mark/RTHDRIBL (Real Time HDR IBL)/etc. demos, though, and I'm not exactly sure why Crytek decided on the higher-quality (and worse-performing), bandwidth-hungry 64-bit HDR that only Nvidia supports instead of HDR that would work on both ATI and Nvidia cards.
One bit of speculation I heard was that HDR at any precision lower than 64-bit didn't look good, or had errors/rendering imperfections (in Far Cry at least); although this is just speculation, and the other side of the coin is that Far Cry is a TWIMTBP game, so it would make sense for a proprietary feature to 'sell' Nvidia cards.
Regardless, this is all conjecture.  What we know so far is that HDR looks great in the Pixel Shader demos we've seen, and that it is supported in HL2 (unless the rumours that it was pulled from HL2 at the last minute are true... more speculation).  We also know that 64-bit HDR looks incredible in Far Cry.  Furthermore, we can see that HDR will require a bit more tweaking from developers: in some situations in Far Cry the HDR lighting is totally overdone and far brighter than any realistic scenario would allow (such as indoor lights that are blinding).
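The 'blinding lights' problem looks like a tuning issue rather than a hardware one.  Here's a minimal sketch of the kind of knob developers have to tune: an exposure value plus a simple Reinhard-style curve that squeezes HDR values back into the 0..1 range a monitor can display (a textbook operator, not whatever Crytek actually uses):
def tone_map(hdr_value, exposure=1.0):
    v = hdr_value * exposure    # set the exposure too high and everything
    return v / (1.0 + v)        # saturates toward pure white

for exposure in (0.25, 1.0, 4.0):
    print(exposure, [round(tone_map(x, exposure), 3) for x in (0.5, 5.0, 50.0)])
# exposure 4.0 pushes even a modest 0.5 up to ~0.67 and everything bright
# toward 1.0, which is exactly the washed-out 'blinding indoor lights' look.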
Finally, the bandwidth required to run HDR (especially 64-bit) is huge - it basically kicks us back a generation or more in terms of GPU rendering performance.  However, in my opinion, it will eventually become the standard, just like shaders, since it makes everything look so much more realistic.  We're probably a few GPU generations away from feasible high-performance HDR.
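Some back-of-envelope numbers (my own rough assumptions, not measured figures) on why the 64-bit framebuffer hurts:
width, height, fps = 1600, 1200, 60
for name, bytes_per_pixel in (("32-bit LDR", 4), ("64-bit FP16 HDR", 8)):
    gb_per_s = width * height * bytes_per_pixel * fps / 1e9
    print(f"{name}: {gb_per_s:.2f} GB/s just to write each pixel once per frame")
# 0.46 vs 0.92 GB/s looks small next to a 6800 Ultra's ~35 GB/s, but blending
# re-reads the render target, bloom/tone-mapping passes re-read the whole
# buffer, and overdraw multiplies all of it; doubling the per-pixel size
# doubles every one of those costs.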