7900 won't do HDR+AA, GeForce 8 will


DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Originally posted by: crazydingo
Originally posted by: Gstanfor
HDR+AA just isn't that important yet - it's too slow to be useful, better to stick with just plain old HDR in that case.
Gee .. the same people who used to hype up SM3 .. :D

People hyped SM3 before it became widespread; now the people complaining about that do the same with HDR+AA. It's still in its infancy, and FP16+AA may not even render properly on G80/R600.

I personally will turn off HDR in all games until it provides an enhanced gameplay experience rather than just glaring at people.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Gstanfor
Originally posted by: M0RPH
Originally posted by: Gstanfor
Nothing surprising here. I never expected more than refined AF, AA tweaks (to the existing modes) and more pipelines.

HDR+AA just isn't that important yet - it's too slow to be useful, better to stick with just plain old HDR in that case. When nVidia does implement HDR+AA I expect it will be FP16 + AA, not ATi's cheesy FX10 "HDR" mode.

As a few people have already mentioned in this thread, the performance of HDR+AA is perfectly acceptable in the games where it's available. And your statement about ATI only doing FX10 HDR is just plain misinformed.

Then you will have no trouble pointing to examples of fully hardware-accelerated (ie: non-software-based) ATi HDR+AA in a mode greater than FX10, will you m0rph? ;)

I don't know if you're just ignoring the facts or spreading misinformation, but FX10 HDR is only used in AOE3. FP16 is used in both Farcry and SS2, and works with AA also. There are several HDR formats, like FX10, FX16, FP16, and all of them can be used with MSAA on the r5xx cards.

http://www.hardware.fr/articles/605-7/r...xtx-x1900-xt-x1900-crossfire-test.html

*edit: more info
http://www.beyond3d.com/reviews/ati/r520/index.php?p=06
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Gstanfor

Then you will have no trouble pointing to examples of fully hardware-accelerated (ie: non-software-based) ATi HDR+AA in a mode greater than FX10, will you m0rph? ;)

You've already been set straight in this thread that ATI is capable of FP16 just as well as Nvidia, and it's just an issue of what the game developer chooses to use. I'm not going to go researching which games use which types of HDR because it doesn't interest me that much. Even if all the games out there are using FX10, I'd still rather have AA with FX10 HDR than AA with no HDR at all. Also, if it helps with performance and makes HDR+AA playable where it wouldn't be with FP16, then so be it... that's a smart move
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Originally posted by: Gstanfor
Originally posted by: crazydingo
Originally posted by: Gstanfor
HDR+AA just isn't that important yet - it's too slow to be useful, better to stick with just plain old HDR in that case.
Gee .. the same people who used to hype up SM3 .. :D

Huh?!? What does HDR have to do with SM3.0? And how is your ATi card performing in AOE3? :p

http://www.hardware.fr/articles/605-10/...xtx-x1900-xt-x1900-crossfire-test.html
Does alright.......
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
AOEIII is definitely a game where HDR adds to the graphics, IMO. Assuming you want to play the game with HDR+AA, let's look at your choices:

1600x1200, HDR+4xAA (fps):

1900XTX 54.2
1900XT 51.1
7800GTX512 32.9
7800GTX 25.3

Hmm, I wonder which card you should choose.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: M0RPH
Originally posted by: Gstanfor

Then you will have no trouble pointing to examples of fully hardware-accelerated (ie: non-software-based) ATi HDR+AA in a mode greater than FX10, will you m0rph? ;)

You've already been set straight in this thread that ATI is capable of FP16 just as well as Nvidia, and it's just an issue of what the game developer chooses to use. I'm not going to go researching which games use which types of HDR because it doesn't interest me that much. Even if all the games out there are using FX10, I'd still rather have AA with FX10 HDR than AA with no HDR at all. Also, if it helps with performance and makes HDR+AA playable where it wouldn't be with FP16, then so be it... that's a smart move

Uhhh... keep dreaming/smoking the whoopie weed... Here is what I wrote in that thread (note that NOWHERE did I say that ATi was capable of FP16 HDR or FP16 HDR + AA):

Gstanfor
Golden Member

Posts: 1345
Joined: 10/19/1999
quote:

--------------------------------------------------------------------------------
So HDR + AA doesn't look that out there, especially when using ATI's new (and so far exclusive) FX10 mode
--------------------------------------------------------------------------------



This is the most amusing part of the whole "ATi has HDR and AA" argument to me. Where are the ATi precision trolls now? It's less than FP32 (ATi likes to trumpet that r5xx features FP32 throughout the entire pipeline...), it's less than FP24 (the previous "golden standard" according to ATi and the fanATics), it's less than FP16 (nVidia/S3's partial precision, not good enough/not part of the DX9 spec according to ATi and the fanATics), and it's even less precision than the FX12 mode found in nV30! It isn't an industry-defined standard that has trickled down to consumers, either.

Of course, now that ATi's latest and greatest GPU uses it, none of that matters to the fanATics...
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: M0RPH
AOEIII is definitely a game where HDR adds to the graphics, IMO. Assuming you want to play the game with HDR+AA, let's look at your choices:

1600x1200, HDR+4xAA (fps):

1900XTX 54.2
1900XT 51.1
7800GTX512 32.9
7800GTX 25.3

Hmm, I wonder which card you should choose.

Well, maybe you should take into consideration that the ATI cards are doing significantly less work. Why? Because while they are cruising along on MSAA, the Nvidia cards are using SSAA... that's just the way it works with HDR + AA in AOEIII.
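The arithmetic behind the "less work" point is easy to sketch (back-of-the-envelope only; it ignores edge coverage, bandwidth, and resolve costs): with 4x supersampling the pixel shader runs for every sample, while 4x multisampling shades each pixel once and only replicates coverage and depth.

\[
\text{4xSSAA: } 1600 \times 1200 \times 4 = 7{,}680{,}000 \text{ shader invocations per frame}
\]
\[
\text{4xMSAA: } 1600 \times 1200 = 1{,}920{,}000 \text{ shader invocations per frame}
\]

So the supersampled path pays roughly 4x the shading and texturing cost for the same output resolution.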

M0RPH, you still didn't answer the question posted for you earlier. Do you even know ANYTHING about video card architecture and algorithms, or are you just going to blindly recommend one brand of cards for personal reasons?

-Kevin
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: M0RPH
AOEIII is definitely a game where HDR adds to the graphics, IMO. Assuming you want to play the game with HDR+AA, let's look at your choices:

1600x1200, HDR+4xAA (fps):

1900XTX 54.2
1900XT 51.1
7800GTX512 32.9
7800GTX 25.3

Hmm, I wonder which card you should choose.

One that doesn't "optimize" the game behind your back? :p
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Originally posted by: Gstanfor
One that doesn't "optimize" the game behind your back? :p

Do you have any proof that ATI cards are using IQ reducing optimizations or are you just assuming it?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: fierydemise
Originally posted by: Gstanfor
One that doesn't "optimize" the game behind your back? :p

Do you have any proof that ATI cards are using IQ reducing optimizations or are you just assuming it?

see my signature, read the B3D thread.
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: Gstanfor
Originally posted by: fierydemise
Originally posted by: Gstanfor
One that doesn't "optimize" the game behind your back? :p

Do you have any proof that ATI cards are using IQ reducing optimizations or are you just assuming it?

see my signature, read the B3D thread.

You haven't answered his question. Do you have any proof that there are IQ-reducing optimizations? And how about a link for this B3D thread you speak of?
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Originally posted by: Gstanfor
see my signature, read the B3D thread.

Can you provide a link to this thread? All your sig tells me is that there are optimizations, not whether or not they have an adverse effect on IQ.
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
It seems unlikely that G71 is going to have 32 pixel shader pipelines for a refresh. More likely: 8 vertex shader pipes, 16 ROPs (not a bottleneck anyway), and still 24 pixel shader pipes, probably with an extra shader unit or two per pipe, a die shrink, and increased GPU/memory speeds.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
ATi hardware is incapable of doing HDR + AA @ FP16. They use the fixed-point (integer-based) FX10 format instead.
The usual BS and misinformation we've come to expect from Gstanfor.

Tom's Hardware
ATI offers HDR with AA in one of three modes: Multisampling AA (MSAA), Temporal AA and ATI's new feature, Adaptive AA (AAA), which is detailed below. ATI states that the X1000 series can handle a wide array of HDR formats with the fastest speed being 32-bit (Integer 10), the maximum value of 64-bit (Floating Point 16 and Integer 10) and other custom combinations.

Tech Report
HDR support with AA? R520 and the gang can do filtering and blending of 16-bit per color channel floating-point texture formats, allowing for easier use of high-dynamic-range lighting effects, just like the GeForce 6 and 7 series. Unlike NVIDIA's GPUs, the R500 series can also do multisampled antialiasing at the same time, complete with gamma-correct blends. ATI's new chips also support a 10:10:10:2 integer buffer format that offers increased color precision at the expense of alpha transparency precision. This is a smaller 32-bit format, but it should still be suitable for high-dynamic-range rendering.

Beyond3D.
With the entire range of chips using the R520 architecture ATI will now support HDR blending, but will do so under a number of formats:

FP16 - 64-bit floating point
Int16 - 64-bit integer
Int10 - 32-bit 10-10-10-2
Custom formats (eg Int10+L16)

Although ATI have provided these HDR blending capabilities in the R520 architecture they haven't removed any orthogonality, meaning that all modes of operation that run through the ROPs work with one another; the net result is that all of the FSAA options that are provided under standard blending buffers are equally supported under any of the HDR blending modes.
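To put that orthogonality claim in concrete terms: under D3D9 a developer can probe whether a render-target format also supports multisampling before enabling an HDR+AA path. The sketch below uses real D3D9 format and capability calls; the function itself and its parameter choices (HAL device, 4x, X8R8G8B8 desktop format) are illustrative assumptions, not code from any of the games discussed.

```cpp
// Sketch: probing D3D9 for render-target formats that also support MSAA,
// assuming an already-created IDirect3D9* (error handling trimmed).
#include <d3d9.h>

bool SupportsRtWithMsaa(IDirect3D9* d3d, D3DFORMAT fmt)
{
    // 1. Can the HAL device render to this surface format at all?
    if (FAILED(d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                      D3DFMT_X8R8G8B8,       // adapter/display format
                                      D3DUSAGE_RENDERTARGET,
                                      D3DRTYPE_SURFACE, fmt)))
        return false;

    // 2. Can that render-target format also be multisampled (4x here)?
    return SUCCEEDED(d3d->CheckDeviceMultiSampleType(D3DADAPTER_DEFAULT,
                                                     D3DDEVTYPE_HAL, fmt,
                                                     FALSE,  // fullscreen
                                                     D3DMULTISAMPLE_4_SAMPLES,
                                                     NULL));
}

// FP16 (64-bit float) vs. the 10-10-10-2 integer format from the quote:
//   SupportsRtWithMsaa(d3d, D3DFMT_A16B16G16R16F);  // reportedly fails on GeForce 6/7
//   SupportsRtWithMsaa(d3d, D3DFMT_A2R10G10B10);
```

The thread's whole argument reduces to which of those two probes succeeds on a given chip.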
But hey, don't let reality get in the way of your delusions.

Then you will have no trouble pointing to examples of fully hardware-accelerated (ie: non-software-based) ATi HDR+AA in a mode greater than FX10, will you m0rph?
He won't when developers start taking advantage of it. Or in your little land does a feature not exist unless developers take advantage of it?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Well maybe you should take into consideration that the ATI cards are doing significantly less work. Why? Because while they are cruising along on MSAA the Nvidia cards are using SSAA...thats just the way it works with HDR + AA in AOEIII
That may be so but the article states this hacked implementation has lower IQ than ATi's 4xAA mode.
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
I was never expecting HDR+AA on the G71, so I'm not disappointed. Plus, with some of the games coming out, HDR+AA is just murder on your system at higher resolutions. I don't plan on upgrading till the G80 (hopefully), and hell, I might even go ATI.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
First off, I never specified what optimizations (IQ etc) may be present, mainly because Eric never did either, so why are people asking me about reduced IQ?

B3D thread 1 extra reading

About FP16 & AA on R52x: as far as I was aware, it was unavailable. If it is available, then I apologize and herewith retract the negative comments directly related to its lack of availability.

However, IF it is available, why isn't it being used instead of the FX10 version (which is seeing use)? If it is present, my guess would be that it is too slow to be useful.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: Gstanfor
However, IF it is available, why isn't it being used instead of the FX10 version (which is seeing use)? If it is present, my guess would be that it is too slow to be useful.

Where did you get the idea that FX10 was being used?!
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: xtknight
Originally posted by: Gstanfor
However, IF it is available, why isn't it being used instead of the FX10 version (which is seeing use)? If it is present, my guess would be that it is too slow to be useful.

Where did you get the idea that FX10 was being used?!

Too slow on NV cards maybe?
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Far Cry and Splinter Cell CT both use FP16 HDR. Now, if ATi supports FP16 HDR and AA is truly orthogonal to that, then both titles should "just work" with HDR & AA turned on.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Few things:

That Hardware.fr article everyone's referencing has been available in English at their sister site, BeHardware.com, for a little while, so we can all give Babelfish a rest. ;) You can read about HDR + AA here.

Kevin, you can check out that link to see that AoE3 uses SSAA with "HDR" for NV cards. I'm not sure if the fact that ATI cards are doing "less" work matters as much as the fact that they appear to look the same as NV cards--well, I haven't seen a site saying there was an apparent difference, but I may be wrong. You can probably chalk the "look as good" part up to AoE3's HDR apparently being just bloom (much like SS2's HDR, judging from screenshots).
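For reference, a generic bloom post-process needs no HDR buffer at all: a bright-pass filter, a blur, and an additive blend. The threshold t and weight k below are illustrative parameters, not anything AoE3 is documented to use:

\[
\mathrm{out}(p) = \mathrm{scene}(p) + k \cdot \mathrm{blur}\!\big(\max(\mathrm{scene}(p) - t,\ 0)\big)
\]

Since every term fits in a plain FX8 buffer, identical-looking "HDR" screenshots across vendors wouldn't be surprising.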

Far Cry's HDR involves FP16 blends and buffers, not FP32. If ppl think FP16 buffers require a lot of bandwidth and FP16 + MSAA requires too much bandwidth (check out benchmark scores and note that R5x0 is apparently limited to 4xMSAA with FP16 buffers), imagine what FP32 would do.
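The buffer sizes make the bandwidth argument easy to see (back-of-the-envelope, ignoring any framebuffer compression): at 1600x1200, an RGBA FP16 target is twice the size of a 32-bit one, and storing 4 MSAA samples per pixel multiplies that again.

\[
1600 \times 1200 \times 8\,\mathrm{B} \approx 14.6\,\mathrm{MB}\ \text{(FP16)}, \qquad \times\,4\ \text{samples} \approx 58.6\,\mathrm{MB}
\]
\[
1600 \times 1200 \times 16\,\mathrm{B} \approx 29.3\,\mathrm{MB}\ \text{(FP32)}, \qquad \times\,4\ \text{samples} \approx 117\,\mathrm{MB}
\]

Every blend and resolve has to read and write those samples, which is where the bandwidth goes.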

"ATi hardware is incapable of doing HRD + AA @FP16. The use the fixed point (integer based) FX10 format instead."
Greg, how do you know ATI can't MSAA FP buffers? You and Rollo are both convinced of this, yet we see multiple reviews benching R5x0 AAing FP16 buffers in both Far Cry and Serious Sam 2. They don't report IQ differences that are so obvious when comparing SC:CT's FP16 vs. FX16 "HDR." Are you getting word from NV that ATI can't apply MSAA to FP16 buffers? If not, then from who? It's not impossible for ATI to be lying or skirting the truth, but then multiple reviewers have supposedly seen it in action with Far Cry and even benched it. Are they lying, too, or just suckers with bad eyesight?

As for HDR+AA with higher-precision buffers than FX10, well, SC:CT supposedly uses FX16 for "HDR" with ATI cards.... But you have to think that simple bandwidth concerns are going to make lower-precision buffers worthwhile for performance reasons, even though they may not look as good as FP16 buffers. Heck, both the 360 and PS3 are packing 128-bit buses. Can we expect FP16 blending and buffering on that kind of bandwidth? Even tho Xenon has EDRAM, if regular "FX8" 720p buffers have trouble fitting into its 10MB for just 2xAA, FP16 buffers are right out. FX10 supposedly fits into FX8's footprint but allows for slightly higher precision. I'd like to see more proof before I call it "cheesy."
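The EDRAM numbers check out on a napkin (assuming 32-bit color plus 32-bit Z/stencil; sizes approximate):

\[
1280 \times 720 \times (4+4)\,\mathrm{B} \approx 7.0\,\mathrm{MB}\ \text{(FX8/FX10 color + Z, no AA)}
\]
\[
\times\,2\ \text{samples} \approx 14.1\,\mathrm{MB} > 10\,\mathrm{MB\ EDRAM}
\]
\[
1280 \times 720 \times (8+4)\,\mathrm{B} \approx 10.5\,\mathrm{MB}\ \text{(FP16 color + Z, no AA)}
\]

So FP16 blows the 10MB budget before AA even enters the picture, while FX10, at 4 bytes per pixel, has exactly the FX8 footprint.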

As for optimizing "behind your back," I've not seen one site note an IQ difference b/w ATI's and NV's "HDR" (looks more like simply bloom) paths in AoE3.

"When nVidia does impliment HDR+AA I expect it will be FP16 + AA, not ATi's cheesy FX10 "HDR" mode."
When NV implements HDR+AA (as FP16 + MSAA) with G80, there'll be more than enough bandwidth and fillrate to make it very playable. I'm not sure how that relates to ATI having it now at decent framerates, or to the fact that ATI will still have it then with R600 and at likely similar framerates to G80 (assuming similar bandwidth).

You keep trashing FX10 as not HDR, which is true. But why are you mocking its existence, given the reality of bandwidth and ROP and RAM limitations of most (SM3) hardware? It's at least a step beyond FX8, however small. And ye olde "FX8" buffer doesn't seem to hurt Source's "HDR."

"HDR+AA just isn't that important yet - it's too slow to be useful, better to stick with just plain old HDR in that case." Shades of V3's 22-bit color. :p I thought we've established that giving us the choice is preferable to having it made for us?

BFG, I'm pretty sure that TR quote is wrong in that R520 & co. doesn't offer "hardware" FP16 filtering. If devs want FP16 filtering with R520+, they'll have to implement it themselves (as FutureMark did with 3DM06: Ctrl-F for Shader emulated floating-point filtering results).
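As for what "shader-emulated" filtering entails (a generic bilinear reconstruction, not FutureMark's exact code): the shader takes the four nearest point samples and blends them itself using the fractional texel offsets, roughly quadrupling the FP16 texture fetches per filtered lookup:

\[
c(u,v) = (1-f_y)\big[(1-f_x)\,c_{00} + f_x\,c_{10}\big] + f_y\big[(1-f_x)\,c_{01} + f_x\,c_{11}\big]
\]

That extra fetch and ALU cost is why "no hardware FP16 filtering" matters for performance even when the result looks identical.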
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Excellent post!!! (As always)

Just out of curiosity, I know you told me this before when I made the thread on it, but why is the Nvidia hardware incapable of running FSAA + HDR at the same time? What are they missing/what would they have to implement to get it to work?

Thanks,
-Kevin
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Pete, it all comes back to precision and effects really.

Everyone wants pretty DX9 games, yes? All the fanatics scream any precision less than FP24 isn't enough. They grudgingly accept FP16 for HDR. FX10 isn't even enough precision for ATi's DX8-gen SM1.4 (which requires FP12).

So, if you are going to use FX10 for HDR, how are you going to hang on to your much-vaunted precision for rendering effects correctly? That's the question the fanatics need to answer - they pooh-pooh'd low precision in the past in favor of FP24, now suddenly FX10 is just peachy? :roll: :shocked: :disgust:
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: Gstanfor
Pete, it all comes back to precision and effects really.

Everyone wants pretty DX9 games, yes? All the fanatics scream any precision less than FP24 isn't enough. They grudgingly accept FP16 for HDR. FX10 isn't even enough precision for ATi's DX8-gen SM1.4 (which requires FP12).

So, if you are going to use FX10 for HDR, how are you going to hang on to your much-vaunted precision for rendering effects correctly? That's the question the fanatics need to answer - they pooh-pooh'd low precision in the past in favor of FP24, now suddenly FX10 is just peachy? :roll: :shocked: :disgust:

I think you are trying to compare apples to oranges here. Rendering the image with floating-point numbers is one thing. However, rendering high-dynamic-range lighting and applying 32-bit precision buffers along with AA is completely different.

I'm not 100% sure though, I am still learning this stuff.

-Kevin
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Kevin, I honestly have no idea. I guess it involves transistors b/w the ROPs and the memory controller. Maybe nano-leprechauns, too.

Greg,

Everyone wants pretty DX9 games, yes?
Oh God, yes.

All the [kids say] any precision less than FP24 isn't enough. They grudgingly accept FP16 for HDR. FX10 isn't even enough precision for ATi's DX8-gen SM1.4 (which requires FP12).
Er, wait ... what? You're mixing terms here, and so confusing me. (And no one's screaming anything. Leave "fanatics" out of this, FFS. If you think I'm a fanatic, then use it in singular form. Otherwise, let's stick to the ppl actively involved in this discussion, please.)

FP16, as I understand it (as s10e5), has less precision than both FX12 (NV) and FX16 (ATI). It has greater precision and range than FX8, though. The question of FP24 (s16e7?) or FP32 (s23e8?) precision in intermediate shader calculations seems unrelated to that of high (dynamic) range buffers, where I believe we've been stuck at FX8 since roughly forever (TNT?). In the case of buffers, FX10 is an improvement, and FP16 a further one. It does seem fair to call FX10 more MDR and FP16 HDR, though even I'm not crazy enough to try and convince marketers what's proper.
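The bit layouts back this up (standard s10e5 half-float encoding for normal values; the FX10 description assumes the usual unsigned-normalized 10-bit channel of a 10-10-10-2 buffer):

\[
\text{FP16 }(s10e5):\quad v = (-1)^s \cdot 2^{\,e-15}\!\left(1 + \frac{m}{2^{10}}\right), \qquad v_{\max} = 2^{15}\cdot\frac{2047}{1024} = 65504
\]
\[
\text{FX10:}\quad v = \frac{i}{1023},\quad i \in \{0,\dots,1023\}, \qquad \text{range } [0,1],\ \text{uniform step} \approx 0.001
\]

Both carry roughly 10-11 bits of precision per channel, but FP16's exponent buys orders of magnitude more representable range, which is the "dynamic" part of HDR.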

That's news to me that PS1.4 requires FP12. I thought a) R300 was the first gen to offer FP in the fragment shaders, b) R200 was FX16, and c) NV2x was FX12. Either you've mixed a), b), and c) to come up with FP12, or I'm the one who's mixed up.

So, if you are going to use FX10 for HDR, how are you going to hang on to your much-vaunted precision for rendering effects correctly?
I dunno. FX10 seems to have the same precision as FP16 but not the same range. I don't know how the dynamic part of high dynamic range plays into this, but Source seems to be doing a decent job with what I believe are FX8 buffers. Then again, maybe what I'm seeing with my 9800 in CSS maps like Nuke and Militia is better classified as bloom. In any case, yeah, minimum FP16 straight through would be ideal, but is it currently feasible? That might be answered by the fact that ATI is "compromising" FP16 to allow for AA.

That's the question the [kids] need to answer - they pooh-pooh'd low precision in the past in favor of FP24, now suddenly FX10 is just peachy?
Apples and oranges? Eh, we'll see what future games do with "HDR." Maybe the upcoming Unreal Performance Test will shed some light on this.

---

Thanks, Apple, much appreciated. I'm not sure you can read too much into an instantaneous FRAPS grab, but you can see some FC HDR benches here that show a X1800XT with 4xAA is basically twice as fast as a 6800GT without, at least at 1600x1200. The X1900XT/X are a good bit faster w/o AA, but it seems FP16 + AA is just throttled by bandwidth, as the gap b/w the X1900 and X1800 narrows. GDDR4 might blow this whole FP16+AA thing wide open, even for cards on 128-bit buses.

Cool. Just a few posts down in that NVN thread, 5'8 also compares "HDR" on a X1800 and 6800 in AoE3. Aside from some odd details (missing/reduced shadows, lighter static shadows in water, less pronounced bump mapping), it doesn't look like FX10 is noticeably inferior to FP16, at least in brightness intensity. Then again, I'm not sure if I'm seeing anything more than a touch o' bloom. The lighter static shadows in the water could be reduced precision at work.