
Technical question about HDR Rendering in Far Cry

CHfan4ever

Diamond Member
The new Far Cry patch 1.3 is out, and it has a nice feature in it called HDR Rendering that you can enable (for Nvidia 6800 GPU owners only).

What it does is incredible. It gives light and bloom effects to the environment, textures, etc. You have to see it to believe it. (It just sucks that we cannot use AA with this option, but I'd still rather play with HDR than AA.)

BUT!

It eats up my system's performance to use this option. I run at around 30-40 fps now, instead of a nice 70-80.

2 QUESTIONS!

1) The fact that it now runs about 2x slower with the option enabled, who do we need to blame? Nvidia or Crytek? Is it the card's performance that limits the framerate? Maybe the card is just made this way? (Like two years ago, when ATI was eating Nvidia up with the AA option; ATI simply had the better-performing card when using AA. Could it be the same thing with the Nvidia 6800 cards?) Or could it be Crytek's fault for shipping a rough patch (since the HDR is beta), and maybe an upcoming patch can fix the framerate?

2) Do any other games on the market use that option, or WILL any use that HDR rendering effect? I'm curious.

If you need an example of what the effect does, here is a link: http://forums.ubi.com/eve/ubb....06891&m=3621005432

Please post your thoughts on the subject too. I find it crazy that X800 XT owners cannot use that effect. It's nonsense: a $500 card that doesn't do the latest effects.

 
😕

One, it's frickin' HDR. You've seen the screenshots. You know what it looks like. Do you have any idea what kind of calculations need to be done to get that kind of effect? There's no one to blame but the laws of mathematics. 😛

Secondly, yes, HDR will be used in the future. Sucks to be an X800 owner, doesn't it? 😉

- M4H
 
Originally posted by: Acanthus
But omgz SM2.0+ is just as good!! 😉

Teh Of Coursex0rs ... anyone who says otherwise is an IGNORANT NVEEDIOT ... regardless of whether or not they've actually seen code examples and taken a good hard look at the actual process involved in coding SM3 vs SM2 😉

Why do you suppose only the SM3-capable 6800s are supported in this patch? 😉

- M4H
 
Originally posted by: MercenaryForHire
Originally posted by: Acanthus
But omgz SM2.0+ is just as good!! 😉

Teh Of Coursex0rs ... anyone who says otherwise is an IGNORANT NVEEDIOT ... regardless of whether or not they've actually seen code examples and taken a good hard look at the actual process involved in coding SM3 vs SM2 😉

Why do you suppose only the SM3-capable 6800s are supported in this patch? 😉

- M4H

Are HDR and FP blending part of SM3.0, or are they separate features that Nvidia introduced with the 6800 series?

I think that many of the HDR screenshots look amazing (especially when it's not overdone; in some pictures indoor lights are blindingly blue, for example), but the performance hit really limits the practicality of this effect on this generation of cards. It's very pretty to look at, but, especially on your first run through, it seems the low framerate would be quite annoying.

It does make me jealous as heck, though. I want to be able to at least try out HDR!


I actually read somewhere (not sure if it's speculation or not) that 'true' HDR isn't in Half-Life 2 due to errors with partial-precision HDR cards (i.e., ATI Radeons), so they just pulled HDR altogether and implemented a sort of knockoff of HDR in shaders to run on everything.

Take that with a massive grain of salt, but it seems clear that HDR requires a lot of floating-point precision (and blending support, apparently) and, worse, incurs a gigantic performance hit.
 
Originally posted by: jiffylube1024
Are HDR and FP blending part of SM3.0, or are they separate features that Nvidia introduced with the 6800 series?

If my perspective on it is correct, I look at it this way:

HDR is a shader technique, not a hardware function. If you were patient enough to wait a few seconds per frame, it could even be done in software. 😉

There's some demo with a long weird name that runs on any DX9-capable card (FX5200+, 9500+) that showcases HDR effects.

- M4H
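To make the "it can even be done in software" point concrete, here is a minimal sketch of an HDR pipeline in plain Python. It assumes nothing about Far Cry's actual code; the scene values and the Reinhard tone-mapping operator are illustrative choices only.

```python
# Minimal software HDR sketch: light the scene in floating point (values
# may exceed 1.0 "white"), then tone-map down to the 8-bit range a monitor
# can actually display. Illustrative only; not Crytek's implementation.
import numpy as np

def tone_map(hdr: np.ndarray, exposure: float = 1.0) -> np.ndarray:
    exposed = hdr * exposure               # simulate camera exposure
    ldr = exposed / (1.0 + exposed)        # Reinhard operator: [0, inf) -> [0, 1)
    ldr = ldr ** (1.0 / 2.2)               # gamma-encode for a typical display
    return (ldr * 255.0 + 0.5).astype(np.uint8)

# A toy "image": a dim wall (0.2), a lit wall (1.0), and the sun (50.0).
scene = np.array([0.2, 1.0, 50.0])
print(tone_map(scene))                                    # [113 186 253]: all distinct
print((np.clip(scene, 0.0, 1.0) * 255).astype(np.uint8))  # [ 51 255 255]: sun == lit wall
```

The point is that the lit wall and the sun stay distinguishable after tone mapping, which is exactly the detail a fixed 0-255 framebuffer throws away.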
 
Originally posted by: MercenaryForHire
Originally posted by: jiffylube1024
Are HDR and FP blending part of SM3.0, or are they separate features that Nvidia introduced with the 6800 series?

If my perspective on it is correct, I look at it this way:

HDR is a shader technique, not a hardware function. If you were patient enough to wait a few seconds per frame, it could even be done in software. 😉

There's some demo with a long weird name that runs on any DX9-capable card (FX5200+, 9500+) that showcases HDR effects.

- M4H

The analogy I read (in terms of performance) over on NVnews is that turning on HDR is like knocking your card back a generation; someone said it felt like going from their 16X1 6800GT back to an 8X1 card.

So, it looks stunning (most of the time; again, when it's not overdone), but you definitely pay for it.

I've never been that keen on AA myself though, and I would gladly sacrifice AA for something like HDR, which I think is infinitely more beneficial (as long as we're talking 1280X960 or above here 😛 ).


The other thing I read was that HDR generally knocked FPS down from the 75-80 range to the 25-30 range. Not sure if this was at 1280 or 1600 res, though, or whether AF was on (I think it was 1280 with AF).
 
Wow, I just played a bit with HDR turned on and it is so freaking sweet I might have to start playing that game again, hehe. It does give your card a kick in the butt, though.
 
Originally posted by: ifesfor

1) The fact that it now runs about 2x slower with the option enabled, who do we need to blame? Nvidia or Crytek? Is it the card's performance that limits the framerate? Maybe the card is just made this way? (Like two years ago, when ATI was eating Nvidia up with the AA option; ATI simply had the better-performing card when using AA. Could it be the same thing with the Nvidia 6800 cards?) Or could it be Crytek's fault for shipping a rough patch (since the HDR is beta), and maybe an upcoming patch can fix the framerate?

Nobody is to blame in this situation; certainly not Crytek. Blame nature if you want for incorporating such a complex system of lighting and light sources 😛 .

I wouldn't expect too much from Crytek in terms of major patches anymore. This was the major patch; they may fix some of the bevy of problems with HDR (see-through walls, Cryvision no longer working properly, etc), but there won't be a major HDR overhaul or anything. They got the main fixes they wanted out, and 64-bit support was officially cancelled AFAIK.
 
Originally posted by: MercenaryForHire
Originally posted by: jiffylube1024
Are HDR and FP blending part of SM3.0, or are they separate features that Nvidia introduced with the 6800 series?

If my perspective on it is correct, I look at it this way:

HDR is a shader technique, not a hardware function. If you were patient enough to wait a few seconds per frame, it could even be done in software. 😉

There's some demo with a long weird name that runs on any DX9-capable card (FX5200+, 9500+) that showcases HDR effects.

- M4H


16-bit floating-point blending is a hardware function, and one only available on the GeForce 6 series, and even then only without AA. All you people going on about SM3 are off base here; the HDR looks exactly the same using SM2.0.
 
So, what is HDR doing that is so complex? Looking at the screenshots makes it look like they simply increased the brightness in some areas, and then gradually blended back down to the "standard" brightness in the dark areas. Couldn't a similar look be given to the game by modifying the lighting parameters?

I agree that it looks pretty cool, but I don't really see why it couldn't look the same in the "non-HDR" version, if you wanted it to look that way. Perhaps it is the way the brightness blends from light to dark that is special?

Can anyone clarify what makes HDR different than just "really bright highlights on certain areas"?

Thanks,
D'oh!
 
Originally posted by: AnnoyedGrunt
So, what is HDR doing that is so complex? Looking at the screenshots makes it look like they simply increased the brightness in some areas, and then gradually blended back down to the "standard" brightness in the dark areas. Couldn't a similar look be given to the game by modifying the lighting parameters?

I agree that it looks pretty cool, but I don't really see why it couldn't look the same in the "non-HDR" version, if you wanted it to look that way. Perhaps it is the way the brightness blends from light to dark that is special?

Can anyone clarify what makes HDR different than just "really bright highlights on certain areas"?

Thanks,
D'oh!

No, HDR is not just "really brightening the colour"; it is MUCH more than that. HDR is a new way of rendering colour, and its purpose is to eliminate the limitations of a fixed number of colour values (i.e., integers) by replacing them with floating-point values (AFAIK). It doubles the bits per channel: 16 bits for each of the colour channels (red, green, blue, and alpha), versus 32-bit colour's 8 bits per channel, and this uses an insane amount of bandwidth and GPU power.


With HDR, you get an essentially "unlimited" number of colour values to represent the huge variations in brightness and contrast of light in the real world.
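A quick way to see what those extra bits buy (a toy example; the intensity value below is arbitrary, not taken from Far Cry):

```python
import numpy as np

sun = 1200.0                              # arbitrary scene brightness, far above "white"

# 8-bit integer channel: everything above 1.0 clamps immediately.
as_rgba8 = np.uint8(min(sun, 1.0) * 255)  # -> 255, same value as a desk lamp at 1.0

# FP16 channel (the OpenEXR "half" format): the real intensity survives,
# since half floats can represent values up to about 65504.
as_fp16 = np.float16(sun)                 # -> 1200.0

print(as_rgba8, as_fp16)
```

Only later, at tone-mapping time, does the renderer decide how that 1200.0 should look on screen, which is why bloom and glare come out so much more convincingly.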

HDR in Far Cry uses the OpenEXR standard, developed by Industrial Light & Magic and currently only supported on the GeForce 6800 series of cards.

Here are a couple of bits of information on Nvidia's HDR in the 6800 series cards: neoseeker and Xbitlabs.

Essentially, this is exactly what John Carmack was talking about when he was saying that 32-bit colour was insufficient for photorealistic colour in games and that the next step would have to be taken.

HDR cannot be done just by "adjusting the brightness." However, ATI has supported a form of HDR since the 9700 series; ATI's HDR is of lower precision than OpenEXR/64-bit ... 48 bits perhaps (12 per channel?).

It still looks great in the early screenshots of HL2 and in the pixel shader demos (ShaderMark, RTHDRIBL (Real-Time HDR IBL), etc.), though, and I'm not exactly sure why Crytek decided on the higher-quality (and worse-performing), bandwidth-hungry 64-bit HDR that only Nvidia supports, instead of HDR that would work on both ATI and Nvidia cards.

One bit of speculation I heard was that the lower-precision HDR didn't look good, or had errors/rendering imperfections compared to 64-bit (in Far Cry at least). That is just speculation, though; the other side of the coin is that Far Cry is a TWIMTBP game, so it would make sense for a proprietary feature to "sell" Nvidia cards.

Regardless, this is all conjecture. What we know so far is that HDR looks great in the pixel shader demos we've seen, and it is supported in HL2 (unless the rumours that it was pulled from HL2 at the last minute are true... more speculation). We also know that 64-bit HDR looks incredible in Far Cry. Furthermore, we can see that HDR will require a bit more tweaking from developers: in Far Cry, some situations have HDR lighting that is totally overdone and far brighter than any realistic scenario would allow (such as indoor lights that are blinding).

Finally, the bandwidth required to run HDR (especially 64-bit) is huge; it basically kicks us back a generation or more in terms of GPU rendering performance. However, in my opinion, it will eventually become the standard, just like shaders, since it makes everything look so much more realistic. We're a few GPU generations away from feasible high-performance HDR, it seems.
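The bandwidth cost is easy to estimate on the back of an envelope. All the numbers below (resolution, frame rate, overdraw) are assumed for illustration, not measured from Far Cry:

```python
width, height = 1280, 960   # assumed resolution
fps = 60                    # assumed target frame rate
overdraw = 3                # assumed average writes per pixel per frame

# RGBA8 = 4 bytes/pixel; RGBA16F (half-float, OpenEXR-style) = 8 bytes/pixel.
for name, bytes_per_pixel in (("RGBA8  ", 4), ("RGBA16F", 8)):
    # each blended write is roughly a read-modify-write of the framebuffer
    gb_per_s = width * height * bytes_per_pixel * 2 * overdraw * fps / 1e9
    print(f"{name}: ~{gb_per_s:.1f} GB/s of framebuffer traffic")
```

Doubling the framebuffer size doubles every read and write the blender performs, which is consistent with the "knocked back a generation" feel people are reporting.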
 
Originally posted by: MercenaryForHire
HDR is a shader technique, not a hardware function. If you were patient enough to wait a few seconds per frame, it could even be done in software. 😉
Not in FC 1.3's case. The FP blending is NV4x-specific ATM, and I believe this is what FC's HDR uses because a) it's so slow, and b) you can't use AA with it.

There's some demo with a long weird name that runs on any DX9-capable card (FX5200+, 9500+) that showcases HDR effects.
RTHDRIBL. HDR, but supposedly coded mainly for ATi cards, and apparently doesn't use the same features as FC 1.3's HDR (FP16 blending).

Edit: Speeling.
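Pete's FP-blending point is worth unpacking, because it explains why this isn't just a shader-model question. A toy sketch (not FC's actual code): when several additive light passes land on the same pixel, an 8-bit render target clamps each blend at "white," while a float target keeps the true sum around for tone mapping.

```python
import numpy as np

passes = [0.7, 0.7, 0.7]          # three additive lighting passes hitting one pixel

# 8-bit-style framebuffer: each blend clamps to 1.0 ("white").
fb8 = 0.0
for p in passes:
    fb8 = min(fb8 + p, 1.0)       # integer targets saturate on every write
# fb8 == 1.0: indistinguishable from a pixel that received ten more passes

# FP16 framebuffer with hardware blending: the true sum survives.
fb16 = np.float16(0.0)
for p in passes:
    fb16 = np.float16(fb16 + p)   # the read-add-write the NV4x does in hardware
# fb16 ~= 2.1: tone mapping can still decide how bright this should look

print(fb8, fb16)
```

Without hardware FP blending you would have to emulate this read-add-write in the shader by ping-ponging between render targets, which is one plausible reason FC 1.3 restricts the effect to NV4x.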
 
To my knowledge, HDR can be run under SM2.0, though SM3.0 will probably be faster. In FC's case, it's using FP blending methods only available on NV4x cards, so it goes beyond the simple question of which shader model you're using.
 
Sure is purdy, but I can't play it 🙁. My P4 2.8 @ 3.4 with a 6800 Ultra is getting 22 fps in a firefight at 900x600 with no AA and no AF! Looks like it'll be fine with an SLI rig, though...
 
*foreheadslap*

Where in any of my posts did I even mention FP blending, since it's not the only way to do HDR, as evidenced by RTHDRIBL? (thanks for the name, Pete) SM3's codepaths "just make the coding (human effort) and calculations (digital effort) easier."

Once again, HDR stands for High Dynamic Range, not Hardware Dependent Rendering. 😛

- M4H
 
Originally posted by: MercenaryForHire
*foreheadslap*

Where in any of my posts did I even mention FP blending, since it's not the only way to do HDR, as evidenced by RTHDRIBL? (thanks for the name, Pete) SM3's codepaths "just make the coding (human effort) and calculations (digital effort) easier."

You *didn't* mention FP blending, which is why your posts confused a lot of people. (That, and I think you were trying to be funny and a lot of people didn't get it. Also, you're not funny. 😛)

You said:

Secondly, yes, HDR will be used in the future. Sucks to be an X800 owner, doesn't it?

Yep. Sucks that ATI parts don't support the often faster and easier to code SM3 codepaths.

Why do you suppose only the SM3-capable 6800s are supported in this patch?

You stuck 'wink' emoticons after these, but I hope you can see how this still might be confusing to someone who doesn't understand how FarCry's HDR works at a low level. Technical discussions are confusing enough without throwing in sarcasm and easily-misconstrued statements.
 
Originally posted by: Matthias99
Originally posted by: MercenaryForHire
*foreheadslap*

Where in any of my posts did I even mention FP blending, since it's not the only way to do HDR, as evidenced by RTHDRIBL? (thanks for the name, Pete) SM3's codepaths "just make the coding (human effort) and calculations (digital effort) easier."

You *didn't* mention FP blending, which is why your posts confused a lot of people. (That, and I think you were trying to be funny and a lot of people didn't get it. Also, you're not funny. 😛)

You said:

Secondly, yes, HDR will be used in the future. Sucks to be an X800 owner, doesn't it?

Yep. Sucks that ATI parts don't support the often faster and easier to code SM3 codepaths.

Why do you suppose only the SM3-capable 6800s are supported in this patch?

You stuck 'wink' emoticons after these, but I hope you can see how this still might be confusing to someone who doesn't understand how FarCry's HDR works at a low level. Technical discussions are confusing enough without throwing in sarcasm and easily-misconstrued statements.

So basically, I didn't dumb it down far enough for the no0bs? 😉 😀

I'll work on that.

...

😉

- M4H
 
No, basically you didn't have a clue what you were talking about. But that happens to us all sometimes, so there's really no harm in admitting it. 😉
 