7900 won't do HDR+AA, GeForce 8 will


darXoul

Senior member
Jan 15, 2004
702
0
0
Predictable.

Two things I wanna say:

1. I don't care much about HDR, I don't really like the effect, at least in some games.

2. Given the performance of the current video cards, it doesn't really matter since HDR + AA usually produces abysmal frame rates anyway, at least in the resolutions I'm interested in.
 

Crescent13

Diamond Member
Jan 12, 2005
4,793
1
0
This is something new? I ran HL2 Lost Coast with 4xAA and HDR on my 6600GT a while back, and I can do it at 16x TSAA and HDR with my 7800GT.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: Genx87
I find it interesting that a game like HL can combine AA + HDR, push it through the shaders, and get great performance. But a game like Far Cry, which is supposed to be using a version of HDR that has less overhead, goes belly up with their HDR implementation.

I know people have said the makers of Far Cry are supposed to be using the better of the two implementations. But I think HL2 looks better than Far Cry and it performs much better as well.

The FP blending/OpenEXR implementation used in Far Cry allows for much more precision, which results in higher IQ. They chose it for two reasons:

1. They partnered with Nvidia, and at the time only Nvidia hardware was capable of the required FP16 blending.
2. Better IQ.

Valve, on the other hand, chose to roll their own proprietary implementation because they are partnered with ATI, whose cards don't have FP16 blending (with the exception of the X1xxx series and up). Also, IMO, because the engine is newer and offers arguably better IQ overall, they can afford slightly lower IQ when rendering HDR lighting.

HL2 may look better simply because its rendering engine is newer. Remember, Far Cry came out ~1 year prior to HL2.

-Kevin
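
For context on the capability split being described: under Direct3D 9 a game can query at startup whether the hardware supports blending into an FP16 render target, and separately whether that target can be multisampled. A minimal sketch of those two checks (my own illustration, not Crytek's actual code; the function names are invented):

// Sketch of the two D3D9 capability checks behind FP16 (OpenEXR-style) HDR.
#include <d3d9.h>

// Can the hardware blend into a 16-bit-per-channel float render target?
// This is what FP-blended HDR needs (NV4x/G7x and R5xx: yes; R3xx/R4xx: no).
bool SupportsFp16Blending(IDirect3D9* d3d, UINT adapter, D3DFORMAT displayFmt)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(
        adapter, D3DDEVTYPE_HAL, displayFmt,
        D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_POSTPIXELSHADER_BLENDING,
        D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F));
}

// Can that same FP16 surface also be multisampled? On GeForce 6/7 this
// check fails, which is why those cards give you FP16 HDR *or* AA, not both.
bool SupportsFp16Msaa(IDirect3D9* d3d, UINT adapter)
{
    DWORD quality = 0;
    return SUCCEEDED(d3d->CheckDeviceMultiSampleType(
        adapter, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
        FALSE, D3DMULTISAMPLE_4_SAMPLES, &quality));
}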
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Gamingphreek
Originally posted by: munky
Originally posted by: Elfear
I think AOE3 has HDR+AA too. We'll have to see if newer games take advantage of this feature or not.

Yes, and it actually works on Nv cards too, through SSAA. It takes a major performance hit, but that's one way to get HDR+AA working on Nv cards.

No it doesn't. Nvidia cards do not have the required onboard logic to run FSAA (Full Scene Anti Aliasing) with HDR that uses FP Blending. AFAIK, HL2, which uses the Pixel Shaders to render High Dynamic Range Lighting, is the only game that allows any card to run HDR (Provided DX9 and PS2.0 support are present) and any card to run HDR+AA.

-Kevin

No, actually it does, but not the way regular AA works. I know the G70 can't do MSAA+HDR, but for AOE3 the devs manually coded SSAA for Nv cards.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: Crescent13
This is something new? I ran HL2 Lost Coast with 4xAA and HDR on my 6600GT a while back, and I can do it at 16x TSAA and HDR with my 7800GT.

Read what I wrote.

HL2 uses the pixel shaders to render HDR due to Valve's proprietary engine.

Far Cry uses OpenEXR/FP blending to render HDR.

Option 1 allows for HDR+AA since you are going through the pixel shaders (lower IQ, but allows for AA).

Option 2 only allows for HDR OR AA because you are rendering with floating-point precision (Pete or Matthias99 would be able to explain this MUCH better than I) (higher IQ, but doesn't allow for AA).

-Kevin
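
To illustrate the difference in plainer terms: with the shader route, the pixel shader compresses the wide luminance range down to ordinary 0-255 values before anything is written out, so the backbuffer stays an integer format the AA hardware already knows how to resolve. A toy example of that idea (a generic Reinhard-style operator, not Valve's actual one):

// Toy illustration of shader-style HDR: compress the range in the shader,
// then quantize to 8 bits, so the render target stays plain RGBA8.
#include <algorithm>
#include <cstdint>
#include <cstdio>

// Simple Reinhard-style tone map: maps [0, inf) to [0, 1).
float ToneMap(float hdrValue, float exposure)
{
    float v = hdrValue * exposure;
    return v / (1.0f + v);
}

uint8_t QuantizeTo8Bit(float ldr)
{
    return static_cast<uint8_t>(std::min(ldr, 1.0f) * 255.0f + 0.5f);
}

int main()
{
    // A bright HDR sample (e.g. sunlight) and a dim one, same exposure.
    std::printf("bright: %u\n", (unsigned)QuantizeTo8Bit(ToneMap(16.0f, 0.5f))); // ~227
    std::printf("dim:    %u\n", (unsigned)QuantizeTo8Bit(ToneMap(0.25f, 0.5f))); // ~28
    return 0;
}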
 

CP5670

Diamond Member
Jun 24, 2004
5,697
798
126
No, actually it does, but not the way regular AA works. I know the G70 can't do MSAA+HDR, but for AOE3 the devs manually coded SSAA for Nv cards.

That might be worth trying with other games, since it's possible to force SSAA modes through nHancer. Although the framerates would go through the floor in most cases.

I don't like that weird blooming effect that some HDR games use, but the lighting and colors look a lot better with it.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: munky
Originally posted by: Gamingphreek
Originally posted by: munky
Originally posted by: Elfear
I think AOE3 has HDR+AA too. We'll have to see if newer games take advantage of this feature or not.

Yes, and it actually works on Nv cards too, through SSAA. It takes a major performance hit, but that's one way to get HDR+AA working on Nv cards.

No it doesn't. Nvidia cards do not have the required onboard logic to run FSAA (Full Scene Anti Aliasing) with HDR that uses FP Blending. AFAIK, HL2, which uses the Pixel Shaders to render High Dynamic Range Lighting, is the only game that allows any card to run HDR (Provided DX9 and PS2.0 support are present) and any card to run HDR+AA.

-Kevin

No, actually it does, but not the way regular AA works. I know the G70 can't do MSAA+HDR, but for AOE3 the devs manually coded SSAA for Nv cards.

Hmm interesting, do you have a link to this?

-Kevin
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: CP5670
That might be worth trying with other games, since it's possible to force SSAA through nHancer. Although the framerates would go through the floor in most cases.

That is ONLY for AOEIII (which, as I just found out, apparently has special code to do this??). You cannot do this with Far Cry or SC:CT etc...

-Kevin
 

Elfear

Diamond Member
May 30, 2004
7,169
829
126
Originally posted by: darXoul
Predictable.

Two things I wanna say:

1. I don't care much about HDR, I don't really like the effect, at least in some games.

2. Given the performance of the current video cards, it doesn't really matter since HDR + AA usually produces abysmal frame rates anyway, at least in the resolutions I'm interested in.


Actually, if you look at the benches from this review, the X1900XTX does a pretty good job with HDR+AA, even at 1920x1200. Only Far Cry and SC:CT dip into the low 30s at that res.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: Elfear
Originally posted by: darXoul
Predictable.

Two things I wanna say:

1. I don't care much about HDR, I don't really like the effect, at least in some games.

2. Given the performance of the current video cards, it doesn't really matter since HDR + AA usually produces abysmal frame rates anyway, at least in the resolutions I'm interested in.


Actually, if you look at the benches from this review, the X1900XTX does a pretty good job with HDR+AA, even at 1920x1200. Only Far Cry and SC:CT dip into the low 30s at that res.

Far Cry is the only game they tested that uses FP blending with AA, aside from AOEIII which is, apparently, specially coded for SSAA. At any rate, 30 FPS, while respectable, varies in game. I don't want my max fps hanging around there when playing a game.

-Kevin
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Gamingphreek
Originally posted by: munky
Originally posted by: Gamingphreek
Originally posted by: munky
Originally posted by: Elfear
I think AOE3 has HDR+AA too. We'll have to see if newer games take advantage of this feature or not.

Yes, and it actually works on Nv cards too, through SSAA. It takes a major performance hit, but that's one way to get HDR+AA working on Nv cards.

No it doesn't. Nvidia cards do not have the required onboard logic to run FSAA (Full Scene Anti Aliasing) with HDR that uses FP Blending. AFAIK, HL2, which uses the Pixel Shaders to render High Dynamic Range Lighting, is the only game that allows any card to run HDR (Provided DX9 and PS2.0 support are present) and any card to run HDR+AA.

-Kevin

No, actually it does, but not the way regular AA works. I know the G70 can't do MSAA+HDR, but for AOE3 the devs manually coded SSAA for Nv cards.

Hmm interesting, do you have a link to this?

-Kevin

http://www.hardware.fr/articles/605-10/...xtx-x1900-xt-x1900-crossfire-test.html

If you can read French, it says at the bottom of the page that the game engine implements an effective 2.25x SSAA by rendering the scene internally at a higher resolution.
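
For anyone wondering where the 2.25x figure comes from: rendering internally at 1.5x the display resolution in each dimension gives 1.5 x 1.5 = 2.25 times as many shaded pixels, which are then filtered back down. Quick arithmetic below (the 1600x1200 resolution is just an example):

// 2.25x SSAA as described: render internally at 1.5x per axis, then downfilter.
#include <cstdio>

int main()
{
    const int screenW = 1600, screenH = 1200;
    const double scale = 1.5;                                  // per-axis factor
    const int internalW = static_cast<int>(screenW * scale);   // 2400
    const int internalH = static_cast<int>(screenH * scale);   // 1800
    const double samplesPerPixel =
        (double)internalW * internalH / ((double)screenW * screenH); // 2.25
    std::printf("%dx%d -> %dx%d = %.2fx the pixels shaded\n",
                screenW, screenH, internalW, internalH, samplesPerPixel);
    return 0;
}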
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
If the game uses HDR bloom only, I'll probably not be too thrilled, but if it incorporates radiance mapping and all that good stuff then I'll likely use it.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
And on a side note, I tried Far Cry with HDR+AA using the pre-release 1.4 patch and the 64-bit content pack, and it was perfectly playable at 1280x960 with all settings maxed out; average fps was somewhere around 50-80. But in a more demanding game at higher resolutions, it looks like we'll need an even more powerful card to run it smoothly with EXR HDR and MSAA.
 

Elfear

Diamond Member
May 30, 2004
7,169
829
126
Originally posted by: Gamingphreek

Far Cry is the only game they tested that uses FP blending with AA, aside from AOEIII which is, apparently, specially coded for SSAA. At any rate, 30 FPS, while respectable, varies in game. I don't want my max fps hanging around there when playing a game.

-Kevin

I understand; I was just replying to darXoul's comment about HDR+AA performance in general being "abysmal". It doesn't look that smooth in Far Cry or SC:CT at 1920x1200, but for guys that game at 1600x1200 or below it looks like it would be playable.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Why would it be different, since we are talking about the same marchitecture standards? :confused:
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Nothing surprising here. I never expected more than refined AF, AA tweaks (to the existing modes), and more pipelines.

HDR+AA just isn't that important yet - it's too slow to be useful; better to stick with just plain old HDR in that case. When nVidia does implement HDR+AA, I expect it will be FP16 + AA, not ATi's cheesy FX10 "HDR" mode.
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Originally posted by: Gstanfor
Nothing surprising here. I never expected more than refined AF, AA tweaks (to the existing modes), and more pipelines.

HDR+AA just isn't that important yet - it's too slow to be useful; better to stick with just plain old HDR in that case. When nVidia does implement HDR+AA, I expect it will be FP16 + AA, not ATi's cheesy FX10 "HDR" mode.

http://www.beyond3d.com/reviews/ati/r520/index.php?p=21
"Here we can see that the size of the FP16 buffer and 4x FSAA pushes some data out to system RAM reducing the fill-rate performance at 1600x1200"

So Ati cards aren't doing FP16+AA in FarCry and Dave Baumann is lying???
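
For a rough sense of why that buffer spills to system RAM: back-of-the-envelope sizes for FP16 color plus 4x multisampling at 1600x1200 (ignoring whatever compression the hardware applies, so these are upper-bound figures):

// Rough buffer sizes for FP16 HDR + 4x MSAA at 1600x1200. Real drivers
// compress and tile these surfaces, so treat the numbers as estimates.
#include <cstdio>

int main()
{
    const double pixels    = 1600.0 * 1200.0;
    const double mb        = 1024.0 * 1024.0;
    const double fp16Color = pixels * 8.0;        // RGBA, 16 bits per channel
    const double msaaColor = fp16Color * 4.0;     // 4 color samples per pixel
    const double msaaDepth = pixels * 4.0 * 4.0;  // 32-bit depth/stencil x 4
    std::printf("FP16 color (1 sample): %5.1f MB\n", fp16Color / mb);  // ~14.6
    std::printf("FP16 color (4x MSAA):  %5.1f MB\n", msaaColor / mb);  // ~58.6
    std::printf("Depth (4x MSAA):       %5.1f MB\n", msaaDepth / mb);  // ~29.3
    std::printf("Framebuffer total:     %5.1f MB\n",
                (msaaColor + msaaDepth) / mb);                         // ~87.9
    return 0;
}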
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
IMO ATi is simply ahead of the game in a lot of cases and puts out better cards. Not to say I don't like nVidia; if I get a new PC I'll get a 7800GT. After all, a 7800GT would be too weak to do HDR+AA anyway, so I shouldn't worry.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: RobertR1
Originally posted by: Gstanfor
Nothing surprising here. I never expected more than refined AF, AA tweaks (to the existing modes), and more pipelines.

HDR+AA just isn't that important yet - it's too slow to be useful; better to stick with just plain old HDR in that case. When nVidia does implement HDR+AA, I expect it will be FP16 + AA, not ATi's cheesy FX10 "HDR" mode.

http://www.beyond3d.com/reviews/ati/r520/index.php?p=21
"Here we can see that the size of the FP16 buffer and 4x FSAA pushes some data out to system RAM reducing the fill-rate performance at 1600x1200"

So Ati cards aren't doing FP16+AA in FarCry and Dave Baumann is lying???

Somewhere on the B3D forums, Baumann admits he stuffed up the Far Cry tests with the beta patch.

ATi hardware is incapable of doing HDR + AA at FP16. They use the fixed-point (integer-based) FX10 format instead.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: RobertR1
Originally posted by: Gstanfor
Nothing surprising here. I never expected more than refined AF, AA tweaks (to the existing modes), and more pipelines.

HDR+AA just isn't that important yet - it's too slow to be useful; better to stick with just plain old HDR in that case. When nVidia does implement HDR+AA, I expect it will be FP16 + AA, not ATi's cheesy FX10 "HDR" mode.

http://www.beyond3d.com/reviews/ati/r520/index.php?p=21
"Here we can see that the size of the FP16 buffer and 4x FSAA pushes some data out to system RAM reducing the fill-rate performance at 1600x1200"

So Ati cards aren't doing FP16+AA in FarCry and Dave Baumann is lying???

ATI can indeed do FP16+AA, but in order to use HDR and keep the game playable at high res, possibly with AA, they use the FX10 method.
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Gstanfor
Nothing surprising here. I never expected more than refined AF, AA tweaks (to the existing modes), and more pipelines.

HDR+AA just isn't that important yet - it's too slow to be useful; better to stick with just plain old HDR in that case. When nVidia does implement HDR+AA, I expect it will be FP16 + AA, not ATi's cheesy FX10 "HDR" mode.

As a few people have already mentioned in this thread, the performance of HDR+AA is perfectly acceptable in the games where it's available. And your statement about ATI only doing FX10 HDR is just plain misinformed.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: M0RPH
Originally posted by: Gstanfor
Nothing surprising here. I never expected more than refined AF, AA tweaks (to the existing modes), and more pipelines.

HDR+AA just isn't that important yet - it's too slow to be useful; better to stick with just plain old HDR in that case. When nVidia does implement HDR+AA, I expect it will be FP16 + AA, not ATi's cheesy FX10 "HDR" mode.

As a few people have already mentioned in this thread, the performance of HDR+AA is perfectly acceptable in the games where it's available. And your statement about ATI only doing FX10 HDR is just plain misinformed.

Then you will have no trouble pointing to examples of fully hardware-accelerated (i.e. non-software-based) ATi HDR+AA in a mode greater than FX10, will you, m0rph? ;)
 

crazydingo

Golden Member
May 15, 2005
1,134
0
0
Originally posted by: Gstanfor
HDR+AA just isn't that important yet - it's too slow to be useful; better to stick with just plain old HDR in that case.
Gee .. the same people who used to hype up SM3 .. :D

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: crazydingo
Originally posted by: Gstanfor
HDR+AA just isn't that important yet - it's too slow to be useful; better to stick with just plain old HDR in that case.
Gee .. the same people who used to hype up SM3 .. :D

Huh?!? What does HDR have to do with SM3.0? And how is your ATi card performing in AOE3? :p