Technical question about HDR Rendering in Far Cry


Genx87

Lifer
Apr 8, 2002
41,091
513
126
At 1600x1200 I went from a solid 60fps (vsync enabled) to 15fps, hehe.
It is, however, playable at 1280x1024. I get a solid 25fps with occasional bumps into the high 30s.

It is probably not allowed to work with AA for a good reason: the memory bandwidth would not be enough.

I'm thinking we won't see great HDR performance until the NV50 at the earliest. I guess it isn't bad, but nothing spectacular.

BTW, I believe the Unreal 3 engine will be using FP blending for its HDR as well.

 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Genx87
It is probably not allowed to work with AA for a good reason: the memory bandwidth would not be enough.

No; as I mentioned above, the hardware simply isn't physically capable of doing both at the same time.
 

CHfan4ever

Diamond Member
Oct 1, 2004
3,290
0
0
Originally posted by: AnnoyedGrunt
So, what is HDR doing that is so complex? Looking @ the screenshots makes it look like they simply increased the brightness in some areas, and then gradually blended back down to the "standard" brightness in the dark areas. Couldn't a similar look be given to the game by modifying the lighting parameters?

I agree that it looks pretty cool, but I don't really see why it couldn't look the same in the "non-HDR" version, if you wanted it to look that way. Perhaps it is the way the brightness blends from light to dark that is special?

Can anyone clarify what makes HDR different from just "really bright highlights on certain areas"?

Thanks,
D'oh!

Here a "clear" example of what HDR do, i just took it myself:



BEFORE HDR(Its what eveybody play Far Cry with a 6800 card.This is lvl catacombs



AFTER HDR Notice its not just higher brightess, but extreme reflection



OTHER EXAMPLE(And that is the effect that impress me the most )

WITH HDR Enable:


Im comming out of the caves, a dark area.I got the effect of lightning-blind as you see on the screenshot.Notice the blue halo

A closer look and aproche of the light area.I start getting use to the lightning, the halo "blindness" is reducing.

Thats it, my eye got use to the lightning.Now i see the area normally.

The effect also work when u see light and go in the dark.Because your eye was use of the lightning if you go straight into a dark area, youll see nothing,all black. And after 2-3 seconds, your eye will get use to the darkness and you will start seeing some things up.(that my friend is pretty amazing!)

So that it, i think it doesnt need a betetr explaination hehe
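
For what it's worth, that "eyes adjusting" behaviour is typically implemented as exposure adaptation during tone mapping: the engine measures the scene's average luminance and drifts an exposure value toward it over a couple of seconds. A minimal C++ sketch of the idea (my own names and constants, not CryTek's actual code):

#include <cmath>

// The scene's average luminance is measured each frame (e.g. by
// repeatedly downsampling the HDR framebuffer to a 1x1 target).
float g_adaptedLum = 1.0f; // the "eye's" current adaptation level

// Drift the adaptation level toward the scene's actual luminance, so
// walking out of a cave leaves you blinded for a moment.
void UpdateAdaptation(float sceneLum, float dt) {
    const float adaptRate = 1.5f; // how fast the eye adjusts
    g_adaptedLum += (sceneLum - g_adaptedLum) * (1.0f - std::exp(-dt * adaptRate));
}

// Map an HDR value down to the displayable [0,1) range, using the
// adaptation level as exposure (simple Reinhard-style operator).
float ToneMap(float hdrValue) {
    float exposed = hdrValue / g_adaptedLum; // brightness relative to adaptation
    return exposed / (1.0f + exposed);       // compress to [0,1)
}

While you're still adapted to the dark (g_adaptedLum is small), any bright value maps to near-white, which is exactly the blinding halo in those screenshots.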
 
MercenaryForHire

Jan 31, 2002
40,819
2
0
Originally posted by: TheSnowman
No, basically you didn't have a clue what you were talking about. But that happens to us all sometimes, so there is really no harm in admitting it. ;)

I guess you're one of those people who gets confused by sarcasm and humour. It's okay, you can get your learning aide to explain this one to you. ;)

I did not mention the NV40-specific FP blending anywhere, because it's not the only effing way to do HDR. See rthdribl for an example of it done on Ye Olde Hardware.

The points I'm making are as follows:

1) SM3 makes things easier for the coders.
2) The same effect can be done in SM2, but it takes more effort and computing power.
3) Humans by nature are lazy and will take the easier route first.
4) Since the X800 series doesn't support SM3, expect them to lag behind in software support via patching.
5) HDR pwnz you, your mother, and your dog. All at once.

Of course it will all be irrelevant in a generation or two if Moore's Law holds up. Which hopefully it will - refer to Reason #5. :p

- M4H
 
MercenaryForHire

Jan 31, 2002
40,819
2
0
Originally posted by: ifesfor
<snip of the screenshot post quoted above>

What screen is that on? 16:9 gaming is the shiznit. :D

- M4H
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Genx87
At 1600x1200 I went from a solid 60fps (vsync enabled) to 15fps, hehe.
It is, however, playable at 1280x1024. I get a solid 25fps with occasional bumps into the high 30s.

It's funny how 25fps is now apparently 'solid' for some people. Personally, below 60 I start to get annoyed.

It is probably not allowed to work with AA for a good reason: the memory bandwidth would not be enough.

That appears to be the reason. AA would just turn it into an absolute dog of a performer; probably single-digit FPS with 4x AA.

BTW, I believe the Unreal 3 engine will be using FP blending for its HDR as well.

U3 may be using FP blending; it's definitely using 64-bit colour. However, even with the normal maps they plan on using for the game, U3 looks to be a single-digit-FPS performer on current-gen hardware - it's gonna require a HELLUVA video processor to run. But let's stick to the here and now and enjoy our 6800s and X800s before they get humbled. ;)
 

CHfan4ever

Diamond Member
Oct 1, 2004
3,290
0
0
What screen is that on? 16:9 gaming is the shiznit. :D

- M4H

I play all my games in 1280x768 (widescreen format). Any monitor that supports 1280x1024 or more can do it; you just have to stretch your screen "manually" (with your monitor's options) so it has a 16:9 shape. Far Cry can be set directly to 16:9; some games, though (Doom 3, for example), need some options set in doom3.cfg. If you need to know how for Doom 3, let me know and I'll explain.

Yes, on a 17-inch monitor it looks smaller, but god, the resolution makes me shut my mouth. :) I believe there is another 16:9 format (it's 1900x-something, I think; I don't remember).

ATM I play all my games in 16:9 if I can. It's the same thing as with TVs: once you touch widescreen you cannot go back, hehe.
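
For reference, the Doom 3 tweak being alluded to is a handful of console cvars you can put in the config file; the names below are from memory, so treat them as an assumption and double-check them in your own file:

// widescreen via custom resolution (cvar names from memory - verify them)
seta r_mode "-1"           // -1 = use the custom width/height below
seta r_customWidth "1280"
seta r_customHeight "768"
seta r_aspectRatio "1"     // 1 = 16:9 (0 = 4:3, 2 = 16:10)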

 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: jiffylube1024
It is probably not allowed to work with AA for a good reason: the memory bandwidth would not be enough.

That appears to be the reason. AA would just turn it into an absolute dog of a performer; probably single-digit FPS with 4x AA.

For about the fourth time in this thread -- NV4X cannot do FP blending and AA at the same time because of the way they implement AA. Certainly the performance would suck even if it could, but the architecture is physically unable to use both features simultaneously.
 

CHfan4ever

Diamond Member
Oct 1, 2004
3,290
0
0
Originally posted by: Matthias99
Originally posted by: jiffylube1024
It is probably not allowed to work with AA for a good reason: the memory bandwidth would not be enough.

That appears to be the reason. AA would just turn it into an absolute dog of a performer; probably single-digit FPS with 4x AA.

For about the fourth time in this thread -- NV4X cannot do FP blending and AA at the same time because of the way they implement AA. Certainly the performance would suck even if it could, but the architecture is physically unable to use both features simultaneously.


Bah, I don't give a s*** about AA - HDR all the way, lol!
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: MercenaryForHire

I did not mention the NV40-specific FP blending anywhere, because it's not the only effing way to do HDR. See rthdribl for an example of it done on Ye Olde Hardware.

I'm not sure what you are talking about with "Ye Olde Hardware," as I am on my 6800GT right now. But I am sure you don't know what you are talking about, because while rthdribl is HDR, it doesn't use any alpha channels like Far Cry has all over the place, and therefore doesn't require a floating-point framebuffer, which is only available on the GeForce 6 series.

Originally posted by: MercenaryForHire
The points I'm making are as follows:

1) SM3 makes things easier for the coders.

But that point is completely off topic, as SM3 doesn't have anything to do with the HDR in Far Cry, which is what this thread is about.

Originally posted by: MercenaryForHire
2) The same effect can be done in SM2, but it takes more effort and computing power.

Really? What is it in the specific shaders used for HDR that benefits from SM3? I just did a quick test on my 6800GT, and while SM3 does give a much better framerate than SM2 in general, enabling HDR takes about the same percentage of a performance drop either way. So what is it in the specific shaders used for HDR that benefits from SM3? Have you even looked at the shaders, and do you even know anything about coding shaders, or are you just trying to BS us still?


Originally posted by: MercenaryForHire
3) Humans by nature are lazy and will take the easier route first.
4) Since the X800 series doesn't support SM3, expect them to lag behind in software support via patching.
5) HDR pwnz you, your mother, and your dog. All at once.

Of course it will all be irrelevant in a generation or two if Moore's Law holds up. Which hopefully it will - refer to Reason #5. :p

- M4H

Please stop with the BS; you are stinking it up in here. :disgust:
 
MercenaryForHire

Jan 31, 2002
40,819
2
0
Originally posted by: TheSnowman
I'm not sure what you are talking about with "Ye Olde Hardware," as I am on my 6800GT right now. But I am sure you don't know what you are talking about, because while rthdribl is HDR, it doesn't use any alpha channels like Far Cry has all over the place, and therefore doesn't require a floating-point framebuffer, which is only available on the GeForce 6 series.

As you conceded above - it's still HDR, and doable on much lower-end hardware; a Radeon 9500 Pro, for example. Of course, as the almighty owner of a 6800GT, I'm sure it's beneath your station in life to speak to someone with fewer than 16 pixel pipes. :roll:

But that point is completely off topic as SM3 doesn't have anything to do with the HDR in Far Cry which this thread is about.

SM3 makes HDR easier. See below.

Really? What is it in the specific shaders used for HDR that benefits from SM3? I just did a quick test on my 6800GT, and while SM3 does give a much better framerate than SM2 in general, enabling HDR takes about the same percentage of a performance drop either way. So what is it in the specific shaders used for HDR that benefits from SM3? Have you even looked at the shaders, and do you even know anything about coding shaders, or are you just trying to BS us still?

Hi, I'm "Branching." I'm a coding practice you're taught in programming!
Me and my cousins "Looping" and "Conditional Execution" are here to help prevent you from executing entire sections of unneeded code on things that just get culled in the final rendering pass!

Aside from the three adorable little munchkins above, the lengthening of the maximum instruction sets means that developers don't have to slice up code if they want to do complex shading effects. In other words - fewer passes per pixel.
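
To make that concrete, here's a CPU-side C++ caricature of the difference. This isn't real shader code - SM2/SM3 are shader models, and all the names here are mine - it just shows the control-flow idea:

struct Pixel { bool inShadow; /* position, normal, etc. */ };

float ExpensiveLighting(const Pixel&) { return 1.0f; } // stand-in for a long, costly shader path
float CheapAmbient(const Pixel&)      { return 0.1f; } // stand-in for a short, cheap path

// SM2-style: no dynamic branching, so both paths are evaluated for every
// pixel and the unwanted result is thrown away at the end.
float ShadePixelSM2(const Pixel& p) {
    float lit      = ExpensiveLighting(p); // always paid for
    float shadowed = CheapAmbient(p);      // always paid for
    return p.inShadow ? shadowed : lit;    // select after doing both
}

// SM3-style: dynamic branching skips the expensive path entirely for
// pixels that don't need it.
float ShadePixelSM3(const Pixel& p) {
    if (p.inShadow)
        return CheapAmbient(p); // early out - the expensive path never runs
    return ExpensiveLighting(p);
}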

Please stop with the BS, you are stinking it up in here. :disgust:

So you think that (#3) humans are not lazy by nature, (#4) humans will therefore not take the easier path of coding, and (#5) you hate the shiny reflective goodness?

Please die in a finely rendered, HDR-shaded, SM3.0+ fire.

- M4H
 

CHfan4ever

Diamond Member
Oct 1, 2004
3,290
0
0
OK, so if I understand correctly what's being said on this board: ATI users can also use HDR rendering? So Crytek decided to just give an "exclusive" feature to 6800 owners?

 

Shinei

Senior member
Nov 23, 2003
200
0
0
Crytek likely pulled a Valve by making HDR an nVidia-specific feature, along with supporting Shader Model 3.0, but their method of implementing HDR is more precise than other HDR implementations (like that RTFMOMGWTFBBQ or whatever the hell it's named) because it takes advantage of a floating-point framebuffer, rather than an integer-based buffer. It's not technically an nVidia-exclusive design, but it doesn't work as intended on ATI hardware because R3x0/420 don't support floating-point framebuffers. R500 may support floating-point buffers though, so we'll see what happens next spring.
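
The integer-versus-float distinction Shinei mentions is easy to demonstrate; here's a small C++ illustration (the brightness number is made up, it only shows the clamping):

#include <algorithm>
#include <cstdint>
#include <cstdio>

int main() {
    float sunlight = 40.0f; // linear scene brightness, ~40x a white monitor pixel

    // 8-bit integer framebuffer: everything above 1.0 clamps to 255, so
    // "bright" and "blinding" become the same white and the difference
    // is lost before tone mapping ever sees it.
    uint8_t ldr = static_cast<uint8_t>(std::min(sunlight, 1.0f) * 255.0f);

    // A floating-point framebuffer keeps the real value, so a later
    // tone-mapping pass can still decide how bright it should look.
    float hdr = sunlight;

    std::printf("integer buffer: %u/255, float buffer: %.1f\n",
                static_cast<unsigned>(ldr), hdr);
    return 0;
}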
 
MercenaryForHire

Jan 31, 2002
40,819
2
0
Originally posted by: ifesfor
OK, so if I understand correctly what's being said on this board: ATI users can also use HDR rendering? So Crytek decided to just give an "exclusive" feature to 6800 owners?

ATI can use HDR rendering, since there are multiple ways to accomplish it. The R-whatever tech demo showcases it and works on the previous-gen cards as well.

CryTek elected to use a method requiring NV40-specific hardware. I'm guessing they're still working on something that runs "acceptably" on the ATI sets, and it'll be in Far Cry 1.4 :)

- M4H
 
MercenaryForHire

Jan 31, 2002
40,819
2
0
Originally posted by: Shinei
R500 may support floating-point buffers though, so we'll see what happens next spring.

The rumours say it's supposed to - frankly, I think it would be suicide not to at this point.

- M4H
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: ifesfor
OK, so if I understand correctly what's being said on this board: ATI users can also use HDR rendering? So Crytek decided to just give an "exclusive" feature to 6800 owners?

ATI's boards can do HDR, but not FP blending or 64-bit (OpenEXR) rendering, which is exclusive to the GF 6 series. Crytek decided to use OpenEXR/FP blending for Far Cry.
 

imported_kouch

Senior member
Sep 24, 2004
220
0
0
Originally posted by: MercenaryForHire
Originally posted by: ifesfor
OK, so if I understand correctly what's being said on this board: ATI users can also use HDR rendering? So Crytek decided to just give an "exclusive" feature to 6800 owners?

ATI can use HDR rendering, since there are multiple ways to accomplish it. The R-whatever tech demo showcases it and works on the previous-gen cards as well.

CryTek elected to use a method requiring NV40-specific hardware. I'm guessing they're still working on something that runs "acceptably" on the ATI sets, and it'll be in Far Cry 1.4 :)

- M4H

Or maybe they did it with the NV code path because Far Cry is a "The Way It's Meant to Be Played" game. Also, don't hold your breath for an FC 1.4 patch; from what I hear this is probably the last major patch, and they even scrapped the plans for a 64-bit version.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
It's funny how 25fps is now apparently 'solid' for some people. Personally, below 60 I start to get annoyed.

It is funny how fanbois read into things as well.

When I say a "solid" 25fps with occasional bumps into the 30s, that means it is a solid 25fps: it never varies except upwards. I am making no claim as to playability, although I think at 25fps it looks smooth, and that is fine. You may, however, think not.

 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: kouch
Originally posted by: MercenaryForHire
Originally posted by: ifesfor
OK, so if I understand correctly what's being said on this board: ATI users can also use HDR rendering? So Crytek decided to just give an "exclusive" feature to 6800 owners?

ATI can use HDR rendering, since there are multiple ways to accomplish it. The R-whatever tech demo showcases it and works on the previous-gen cards as well.

CryTek elected to use a method requiring NV40-specific hardware. I'm guessing they're still working on something that runs "acceptably" on the ATI sets, and it'll be in Far Cry 1.4 :)

- M4H

Or maybe they did it with the NV code path because Far Cry is a "The Way It's Meant to Be Played" game. Also, don't hold your breath for an FC 1.4 patch; from what I hear this is probably the last major patch, and they even scrapped the plans for a 64-bit version.

No, it is because you can't do HDR through alpha channels without a floating-point framebuffer, and the GeForce 6s are the only cards around that have that feature. MercenaryForHire's babble about "still working on something that runs acceptably" is utter nonsense; other cards simply don't support the feature, just like the GeForce 6s don't support the feature with AA.
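
For anyone following along: the blending in question is the standard "over" operator the hardware applies when writing an alpha-blended pixel to the framebuffer. With a floating-point target, that read-modify-write has to happen in floating point, which is the capability being argued about. A rough C++ sketch of the operation itself (my own naming, just illustrating the math):

struct Color { float r, g, b, a; };

// The per-pixel blend the ROPs perform when alpha blending is enabled:
//   dst = src * src.a + dst * (1 - src.a)
// The pixel shader can't do this itself because it can't read the current
// framebuffer contents, so the fixed-function blend unit has to handle
// float inputs - which is what pre-NV40 hardware (and NV40's AA path) lacks.
Color BlendOver(const Color& src, const Color& dst) {
    float ia = 1.0f - src.a;
    return { src.r * src.a + dst.r * ia,
             src.g * src.a + dst.g * ia,
             src.b * src.a + dst.b * ia,
             src.a         + dst.a * ia };
}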
 
MercenaryForHire

Jan 31, 2002
40,819
2
0
Originally posted by: Genx87
It's funny how 25fps is now apparently 'solid' for some people. Personally, below 60 I start to get annoyed.

It is funny how fanbois read into things as well.

When I say a "solid" 25fps with occasional bumps into the 30s, that means it is a solid 25fps: it never varies except upwards. I am making no claim as to playability, although I think at 25fps it looks smooth, and that is fine. You may, however, think not.

Uh ... a "framerate fanboy"? :confused: Yeah ... that makes sense ... we all hate liquid smooth animation and quick reaction times ...

- M4H
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Originally posted by: Rhagz
Wow, watching that video makes me want to go buy FarCry when my 6800 comes in this week!

Get a copy for free when you order Athlon 64 mobo combos from mwave.com. :)
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Originally posted by: jiffylube1024
Originally posted by: AnnoyedGrunt
So, what is HDR doing that is so complex? Looking @ the screenshots makes it look like they simply increased the brightness in some areas, and then gradually blended back down to the "standard" brightness in the dark areas. Couldn't a similar look be given to the game by modifying the lighting parameters?

I agree that it looks pretty cool, but I don't really see why it couldn't look the same in the "non-HDR" version, if you wanted it to look that way. Perhaps it is the way the brightness blends from light to dark that is special?

Can anyone clarify what makes HDR different from just "really bright highlights on certain areas"?

Thanks,
D'oh!

No, HDR is not just "really brightening the colour"; it is MUCH more than that. HDR is a new way of rendering colour, and its purpose is to eliminate the limitations of a fixed set of colour values (i.e. integers) by replacing them with floating-point values (AFAIK). It doubles the bits per channel: 16 bits for each of the colour channels (red, green, blue, and alpha) versus 32-bit colour's 8 bits per channel, and this uses an insane amount of bandwidth and GPU power.

With HDR, you get an essentially 'unlimited' number of colour values to represent the insane variations in brightness and contrast of light in the real world.

HDR in Far Cry uses the OpenEXR standard, developed by Industrial Light & Magic and currently only supported on the GeForce 6800 series of cards.

Here are a couple of bits of information on Nvidia's HDR in the 6800 series cards: neoseeker and Xbitlabs.

Essentially, this is exactly what John Carmack was talking about when he said that 32-bit colour was insufficient for photorealistic colour in games and that the next step would have to be taken.

HDR cannot be done just by 'adjusting the brightness'; that said, ATI has supported HDR since the 9700 series. ATI's HDR is of lower quality than OpenEXR/64-bit ... 48-bit perhaps (12 per channel?).

It still looks great in the early screenshots of HL2 and in the pixel shader demos (ShaderMark, RTHDRIBL (Real Time HDR IBL), etc.), though, and I'm not exactly sure why Crytek decided on the higher-quality (and worse-performing), bandwidth-hungry 64-bit HDR that only Nvidia supports, instead of HDR that would work on both ATI and Nvidia cards.

One bit of speculation I heard was that the lower-quality HDR didn't look good, or had errors/rendering imperfections compared to 64-bit (in Far Cry at least); although this is just speculation, and the other side of the coin is that Far Cry is a TWIMTBP game, so it would make sense for a proprietary feature to 'sell' Nvidia cards.

Regardless, this is all conjecture. What we know so far is that HDR looks great in the pixel shader demos we've seen, and that it is supported in HL2 (unless the rumours that it was pulled from HL2 at the last minute are true... more speculation). We also know that 64-bit HDR looks incredible in Far Cry. Furthermore, we can see that HDR will require a bit more tweaking from developers, as in some situations in Far Cry the HDR lighting is totally overdone and way brighter than any realistic scenario would allow (such as indoor lights that are blinding).

Finally, the bandwidth required to run HDR (especially 64-bit) is huge - it basically kicks us back a generation or more in terms of GPU rendering performance. However, in my opinion, it will eventually become the standard, since it makes things look so much more realistic, just like shaders did. We're a few GPU generations away from feasible high-performance HDR, it seems.
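
To put rough numbers on that bandwidth point (back-of-the-envelope arithmetic, my own figures):

#include <cstdio>

int main() {
    const int w = 1600, h = 1200;
    const double MB = 1024.0 * 1024.0;

    // 32-bit colour: 4 bytes per pixel (8 bits x RGBA).
    // 64-bit HDR:    8 bytes per pixel (FP16 x RGBA).
    double ldrBytes = double(w) * h * 4;
    double hdrBytes = double(w) * h * 8;

    // Prints roughly 7.3 MB vs 14.6 MB for one full-screen write; alpha
    // blending doubles the traffic again, since every blended pixel must
    // be read back before it is written.
    std::printf("32-bit buffer: %.1f MB, 64-bit buffer: %.1f MB\n",
                ldrBytes / MB, hdrBytes / MB);
    return 0;
}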


Informative post JL, thanks much:)
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Bah Zebo, are you just quoting that misinformation I already corrected and calling it informative to spite me? :confused:
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Originally posted by: TheSnowman
Bah Zebo, are you just quoting that misinformation I already corrected and calling it informative to spite me? :confused:

Where? I don't see you correcting anything JL said; I see a bunch of vague comments. How about laying it out for us laymen, like JL did, if you're so smart? Remember details - I'm 33 going on eight years old. :)