
[H]ard's New Article on FarCry patch 1.3 & SM3.0

Originally posted by: Ackmed
There are different types of HDR. Crytek just picked a way that doesn't work on ATi cards. HL2's HDR does.


Oh? What types of HDR can accomplish the effect through alpha channel textures without a floating point framebuffer?
 
Originally posted by: TheSnowman
Originally posted by: Ackmed
There are different types of HDR. Crytek just picked a way that doesn't work on ATi cards. HL2's HDR does.


Oh? What types of HDR can accomplish the effect through alpha channel textures without a floating point framebuffer?



Snowman, I usually read your posts in earnest, but you've posted this shoddy reply three times. what are you getting at? please stop being a cryptic dillhole! 🙂

are you trying to say that in the case of Farcry, but not HL2 or other examples, HDR would have been impossible through any means on an x800 core? if that is what you are trying to say, please spell it out so we can debunk it.

😉
mario
 
Don't even bother, gururu. I gave up on trying to explain it to him in the other HDR FarCry post, but he seems to think that "SM3 makes complex coding (example HDR) easier" means the same as "OMGWTFBBQ SM3 IS REQUIRED FOR HDR"

- M4H
 
MercenaryForHire, I never claimed that you thought SM3 was required; I just doubted your argument that SM3 helps the HDR, as you were never clear on how you thought this was the case but rather only listed general benefits of SM3, which may or may not apply to the HDR in Far Cry. Regardless, I just checked out the shader files for Far Cry, and all of the HDR shaders are PS2.0; so your whole line of argument is clearly wrong on its face.

Originally posted by: gururu
Originally posted by: TheSnowman
Originally posted by: Ackmed
There are different types of HDR. Crytek just picked a way that doesn't work on ATi cards. HL2's HDR does.


Oh? What types of HDR can accomplish the effect through alpha channel textures without a floating point framebuffer?



Snowman, I usually read your posts in earnest, but you've posted this shoddy reply three times. what are you getting at? please stop being a cryptic dillhole! 🙂

How can you say I am being cryptic when I am simply asking for clarification of your statement? From what I understand, a floating point framebuffer is simply required to carry an HDR effect through alpha channels; and since Far Cry is full of textures with alpha channels, I am at a loss as to what methods you are implying Crytek could have used to make the effect work in their game on cards that do not support a floating point framebuffer.

Originally posted by: gururu
are you trying to say that in the case of Farcry, but not HL2 or other examples, HDR would have been impossible through any means on an x800 core? if that is what you are trying to say, please spell it out so we can debunk it.

I'm not one to say anything is impossible, but I'm not aware of any way the effect could be accomplished in Far Cry or any other situation with transparent textures unless you have a floating point framebuffer. If you are aware of a practical method to accomplish this, please tell us what that method is, as I am sure that Crytek, John Carmack (who requested floating point framebuffer support quite a while back), and many other developers would like to know.

 
Originally posted by: TheSnowman


I'm not one to say anything is impossible, but I'm not aware of any way the effect could be accomplished in Far Cry or any other situation with transparent textures unless you have a floating point framebuffer. If you are aware of a practical method to accomplish this, please tell us what that method is, as I am sure that Crytek, John Carmack (who requested floating point framebuffer support quite a while back), and many other developers would like to know.

thanks, that's a little clearer. all i know is that ATi cards have supported HDR since their first line of DX 9.0 cards hit the market. in fact, they have a demo showing just that. the link in my earlier post also clearly demonstrates how HDR effects can be employed with hardware of that time (which included the R300 core but not the FX series). So I don't know what you are getting at. To be honest, you, Crytek, and Carmack should consult Gabe Newell to determine how to do it.

 
What makes you think Gabe Newell would know how to accomplish HDR through transparent textures? I have seen examples of HDR in HL2, but not through alpha channels.
 
Originally posted by: TheSnowman
What makes you think Gabe Newell would know how to accomplish HDR through transparent textures? I have seen examples of HDR in HL2, but not through alpha channels.


ok, so you aren't arguing that HDR cannot be done on ATI hardware, just that it has to be done differently. That I never disputed. What sort of alpha channels does Farcry employ that HL2 doesn't? I think I might finally be able to learn something here in our discussion.
 
Originally posted by: PrayForDeath
Originally posted by: stnicralisk
Originally posted by: Gamingphreek
I do agree that Crytek is implementing HDR only for Nvidia because of agreements and stuff.
-Kevin

Riiight, so they're working on adding 3Dc support because of agreements too? Your post is crap.

Never heard of 3DC support in FarCry :disgust:

Pwned. Seems like stnicralisk doesn't know what he is talking about, and PrayForDeath and Gamingphreek (myself) do.

-Kevin
 
Originally posted by: gururu
ok, so you aren't arguing that HDR cannot be done on ATI hardware, just that it has to be done differently. That I never disputed. What sort of alpha channels does Farcry employ that HL2 doesn't? I think I might finally be able to learn something here in our discussion.

No, I am saying that HDR cannot be done through an alpha channel's transparency on ATi hardware, or any other hardware that does not support a floating point framebuffer. Look at this pic here and note the vegetation. The vegetation is done with simple polygons textured with transparent areas to give the effect of the little leaves. To carry that HDR through the transparency, the floating point framebuffer is employed; without the floating point framebuffer, the transparent parts of the vegetation textures would look washed out like the left side of the image. So sure, they could let us use HDR in Far Cry on ATi's DX9 cards, but it would look like crap.
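A rough numeric sketch of the wash-out described above (my own illustration with made-up values, not Crytek's actual pipeline): alpha-blending a bright HDR sky through a leaf texture's transparency in an 8-bit framebuffer clamps the sky to 1.0 before the blend, while a floating point framebuffer carries the full value through to tone mapping.

```python
# Illustrative only: blend a bright HDR sky (intensity 4.0) through a leaf
# texture's alpha channel, comparing an 8-bit integer framebuffer (values
# clamped to [0, 1]) against a floating point framebuffer.

def blend(src, alpha, dst):
    """Standard alpha blend: src*alpha + dst*(1-alpha)."""
    return src * alpha + dst * (1.0 - alpha)

def tonemap(x):
    """Simple Reinhard-style operator mapping HDR down into [0, 1)."""
    return x / (1.0 + x)

sky = 4.0        # over-bright HDR sky behind the vegetation
leaf = 0.2       # dark leaf texel
alpha = 0.5      # semi-transparent edge of the leaf texture

# Integer framebuffer: the sky is clamped to 1.0 *before* blending,
# so the transparent edge comes out washed-out grey.
int_result = tonemap(blend(min(sky, 1.0), alpha, leaf))

# Float framebuffer: the full 4.0 survives the blend, and tone mapping
# at the end keeps the bright sky showing through the leaf edge.
fp_result = tonemap(blend(sky, alpha, leaf))

print(int_result, fp_result)  # fp_result is noticeably brighter
```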
 
Originally posted by: Rage187
Originally posted by: PrayForDeath
congrats on your new buy, Rage187 :beer:


THX, this is going in my second comp, my main is running a GT.



And Ronn, after playing Farcry for the first time, yes it does indeed look better than D3.

I think so too... Far Cry is a fantastic game, and the fact that it looks good even on a 9800 Pro and runs well too (unlike Doom 3) makes me praise the Crytek engine... FC is great, and I think they should make more games based on the same engine!!!
 
Half-Life 2 doesn't do HDR, as far as I know. It has some DX9 functionality, but it doesn't execute a realistic HDR representation as FarCry does: there are some light shafts on de_chateau in CS:S that are not HDR, nor does it offer any FarCry-style HDR shifts (that washout-to-clarity example someone gave was brilliant, and I'm sure it looks better in motion than in a few screen grabs). Of course, Valve is a huge ATI flunky, so at most SM2.0b will be used in HL2, while SM3.0 will be utterly ignored and nVidia's floating point framebuffer will be left unused for HDR implementation.
I imagine that nVidia will fix the "no AA with HDR" issue with NV50, since apparently a ton of people hate not having AA while using HDR. Or it may just be the way FarCry's OpenEXR method is coded, since I don't think nVidia would design a system that only works for the barest functionality of a feature; it's wholly possible that the code is merely bloated or inefficient and takes up too much space to write AA samples to the framebuffer while HDR is being executed. Considering the game is DX9-based, I'm inclined to believe the bloated-code explanation is actually true. *hates DX code*
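Whatever the real hardware reason, the framebuffer cost alone is easy to see. A back-of-envelope sketch (my own arithmetic, assuming an FP16 RGBA render target for the OpenEXR-style HDR, which is twice the size per pixel of a normal 8-bit RGBA buffer, and assuming AA is done by storing multiple samples per pixel):

```python
# Rough framebuffer-size arithmetic: an FP16 RGBA HDR target doubles the
# per-pixel cost of an 8-bit RGBA buffer, and multisampling multiplies it
# again per sample. Illustrative numbers only.

def framebuffer_bytes(width, height, bytes_per_pixel, msaa_samples=1):
    return width * height * bytes_per_pixel * msaa_samples

w, h = 1280, 1024
ldr = framebuffer_bytes(w, h, 4)        # 8-bit RGBA, 4 bytes/pixel
hdr = framebuffer_bytes(w, h, 8)        # FP16 RGBA, 8 bytes/pixel
hdr_4x = framebuffer_bytes(w, h, 8, 4)  # FP16 RGBA with 4x multisampling

print(ldr // 2**20, hdr // 2**20, hdr_4x // 2**20)  # MiB: 5 10 40
```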
 
Originally posted by: Shinei
Half-Life 2 doesn't do HDR, as far as I know. It has some DX9 functionality, but it doesn't execute a realistic HDR representation as FarCry does: there are some light shafts on de_chateau in CS:S that are not HDR, nor does it offer any FarCry-style HDR shifts (that washout-to-clarity example someone gave was brilliant, and I'm sure it looks better in motion than in a few screen grabs). Of course, Valve is a huge ATI flunky, so at most SM2.0b will be used in HL2, while SM3.0 will be utterly ignored and nVidia's floating point framebuffer will be left unused for HDR implementation.
I imagine that nVidia will fix the "no AA with HDR" issue with NV50, since apparently a ton of people hate not having AA while using HDR. Or it may just be the way FarCry's OpenEXR method is coded, since I don't think nVidia would design a system that only works for the barest functionality of a feature; it's wholly possible that the code is merely bloated or inefficient and takes up too much space to write AA samples to the framebuffer while HDR is being executed. Considering the game is DX9-based, I'm inclined to believe the bloated-code explanation is actually true. *hates DX code*

If the performance is there for RADEON users, HDR could be a real selling point for ATI, as it's definitely one of those features that you're not going to want to turn off once you've seen it in action. Remember the first time you saw the sun in Gran Turismo 3? To the uninitiated, HDR is similar to that effect, only it's about two times more effective. In addition, the light reflects off of reflective or shiny surfaces. When you combine this with the Half-Life 2 water (which is the most accurate representation of water we've seen in a game to date), can you imagine how good HDR would look at sunset over a large body of water? HDR will also be used for effects like muzzle flashes.
link
 
Shinei, so why is there a 5 minute vid showing you HL2's HDR?

It didn't look as pronounced as Far Cry's, but it looked very good. And that was a year ago; the X800 and 6800 weren't even out then.
 
it amazes me that so many people claim that ATI hardware can't do HDR when ATI hardware was the first to support it.
 
Well, I believe I said that I hadn't heard anything about HDR in HL2, and I was talking about my experience with CS:S (which, being based on HL2, should be a decent indicator of what HL2 will look like). The article you linked still mentions HL2 as the most anticipated game of 2003, so it may be possible that HDR was removed for some reason during a later beta build (and/or may not be present at all in CS:S). Or it just may be that the effect isn't really all that pronounced, since I don't notice half of these little niggling things unless someone specifically points it out to me (like I never noticed mipmap transitions until someone complained and posted a screenshot of the transitions).
 
Originally posted by: CraigRT
Originally posted by: Rage187
Originally posted by: PrayForDeath
congrats on your new buy, Rage187 :beer:


THX, this is going in my second comp, my main is running a GT.



And Ronn, after playing Farcry for the first time, yes it does indeed look better than D3.

I think so too... Far Cry is a fantastic game, and the fact that it looks good even on a 9800 Pro and runs well too (unlike Doom 3) makes me praise the Crytek engine... FC is great, and I think they should make more games based on the same engine!!!

It'd be interesting to see more games built on the SandBox engine (or whatever it's called); has anyone heard of any such upcoming games?
 
Originally posted by: gururu
it amazes me that so many people claim that ATI hardware can't do HDR when ATI hardware was the first to support it.

I don't think I have seen anyone claiming that ATi hardware can't do HDR, but rather that it can't do HDR in a situation that requires a floating point framebuffer, as the hardware doesn't support floating point framebuffers.
 
Also, I thought HDR in HL2 had been shelved, at least temporarily, for whatever reason. This despite the fact that they had solved initial problems with the FX series and got it working with not even floats, but just ints. Odd, if true (both in terms of the range of FX12 HDR, and the ditching of it [perhaps related to translucent textures]).
 
Originally posted by: Shinei
Well, I believe I said that I hadn't heard anything about HDR in HL2, and I was talking about my experience with CS:S (which, being based on HL2, should be a decent indicator of what HL2 will look like). The article you linked still mentions HL2 as the most anticipated game of 2003, so it may be possible that HDR was removed for some reason during a later beta build (and/or may not be present at all in CS:S). Or it just may be that the effect isn't really all that pronounced, since I don't notice half of these little niggling things unless someone specifically points it out to me (like I never noticed mipmap transitions until someone complained and posted a screenshot of the transitions).

CS:S has HDR disabled in the Source engine for performance reasons.

There are mods to enable it.
 
In response to several comments.

Older hardware (9x00s and FX) can do a lesser form of HDR using only ints; this was going into HL2 at one point. However, only the 6x00s can do HDR of the quality implemented in FarCry 1.3. The 6800s won't be able to use multisample anti-aliasing at the same time, because both it and HDR use the same buffer - take your pick. HDR is not part of the SM3.0 spec, but there are image quality enhancing things that are, such as displacement mapping. 3Dc is a normal map compression algorithm that works a little better than (tweaked) DXT5, which is the algorithm used in Doom 3 and added as a beta to FarCry. 3Dc doesn't get used yet because it only works on X800 hardware and isn't in the DirectX spec (DXT5 is).
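The trick behind two-channel normal map compression like 3Dc can be sketched as follows (an illustration of the general idea, not ATI's actual decoder): since a normal map stores unit-length vectors, only X and Y need to be kept and compressed, and the shader reconstructs Z.

```python
# Sketch of why storing only two channels of a normal map works: a unit
# normal's Z component can be reconstructed from X and Y in the shader,
# so the format only needs to compress two channels instead of three.
import math

def decode_normal(x, y):
    """Reconstruct a tangent-space unit normal from its stored X/Y parts."""
    # Tangent-space normals face outward, so Z is taken as non-negative;
    # max() guards against tiny negative values from compression error.
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return (x, y, z)

n = decode_normal(0.6, 0.0)
print(tuple(round(c, 3) for c in n))  # (0.6, 0.0, 0.8)
```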
 
Originally posted by: sbuckler
In response to several comments.

Older hardware (9x00s and FX) can do a lesser form of HDR using only ints; this was going into HL2 at one point. However, only the 6x00s can do HDR of the quality implemented in FarCry 1.3. The 6800s won't be able to use multisample anti-aliasing at the same time, because both it and HDR use the same buffer - take your pick. HDR is not part of the SM3.0 spec, but there are image quality enhancing things that are, such as displacement mapping. 3Dc is a normal map compression algorithm that works a little better than (tweaked) DXT5, which is the algorithm used in Doom 3 and added as a beta to FarCry. 3Dc doesn't get used yet because it only works on X800 hardware and isn't in the DirectX spec (DXT5 is).


that was a very nice description. thanks 🙂
 
Originally posted by: sbuckler
Older hardware (9x00s and FX) can do a lesser form of HDR using only ints; this was going into HL2 at one point.

Both the 9x00s and FXs can do HDR with floating point color calculations, as can be seen in rthdribl, and I'm pretty sure the HL2 HDR example used floating point color calculations on Radeons as well. It is the use of the floating point framebuffer that makes the HDR in Far Cry different.
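That distinction can be sketched numerically (my own illustration, not any vendor's actual pipeline): a shader can do its color math in floating point, but if the result is written to an 8-bit framebuffer it gets clamped and quantized to 256 steps, so everything above 1.0 is lost. A floating point framebuffer keeps those values for later passes.

```python
# Sketch of float shader math vs. float framebuffer storage: the HDR range
# computed by the shader survives only if the framebuffer itself can store
# values above 1.0.

def write_8bit(value):
    """Store a color channel in an 8-bit integer framebuffer."""
    clamped = min(max(value, 0.0), 1.0)  # hardware clamps on write
    return round(clamped * 255)          # what actually lands in memory

def read_8bit(stored):
    return stored / 255.0

hdr_value = 2.5                          # computed in float by the shader
roundtrip = read_8bit(write_8bit(hdr_value))
print(roundtrip)  # 1.0 -- the "high" part of the dynamic range is gone
```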
 