7900 won't do HDR+AA, GeForce 8 will


bjc112

Lifer
Dec 23, 2000
11,460
0
76
I personally don't think it is that big of a deal.. AA + AF + HDR is nice, but I can go with max AA and no HDR on other computers, and it doesn't bother me.. I don't think nVidia is losing too much based on this.

I do like this 1900xtx though..

:D
 
Jun 14, 2003
10,442
0
0
Originally posted by: Gstanfor
Originally posted by: M0RPH
Originally posted by: Gstanfor

Then you will have no trouble pointing to examples of fully hardware accelerated (ie: non software based) ATi HDR + AA, in a mode greater than FX10, will you m0rph? ;)

You've already been set straight in this thread that ATI is capable of FP16 just as well as Nvidia, and it's just an issue of what the game developer chooses to use. I'm not going to go researching which games use which types of HDR because it doesn't interest me that much. Even if all the games out there are using FX10, I'd still rather have AA with FX10 HDR than AA with no HDR at all. Also, if it helps with performance and makes HDR+AA playable where it wouldn't be with FP16, then so be it... that's a smart move

Uhhh... keep dreaming/smoking the whoopie weed... Here is what I wrote in that thread (note that NOWHERE did I say that ATi was capable of FP16 HDR or FP16 HDR + AA):

Gstanfor
Golden Member

Posts: 1345
Joined: 10/19/1999
quote:

--------------------------------------------------------------------------------
So HDR + AA doesn't look that out there, especially when using ATI's new (and so far exclusive) FX10 mode
--------------------------------------------------------------------------------

This is the most amusing part of the whole "ATi has HDR and AA" argument to me. Where are the ATi precision trolls now? It's less than FP32 (ATi likes to trumpet that r5xx feature of FP32 throughout the entire pipeline...), it's less than FP24 (the previous "golden standard" according to ATi and the fanATics), it's less than FP16 (nVidia/S3's partial precision, and not good enough/not part of the DX9 spec according to ATi and the fanATics), and it's even less precision than the FX12 mode found in nV30! It isn't an industry-defined standard that has trickled down to consumers either.

Of course, now that ATi's latest and greatest GPU uses it, none of that matters to the fanATics...


Hell, if FX10 works, who gives a stuff?
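The quoted post is essentially ranking the formats by bits of precision. As a rough worked comparison (assuming "FX10" here means a 10-bit-per-channel fixed-point format normalized to [0,1], and FP16 the usual half float with a 10-bit mantissa), the step size near 1.0 is actually similar for the two; what the float format buys is dynamic range, which is the part that matters most for HDR. A small illustrative sketch:

```cpp
// Illustrative only: rough precision/range comparison between a 10-bit
// fixed-point channel (the assumption here for "FX10", normalized to [0,1])
// and an FP16 half-float channel (1 sign, 5 exponent, 10 mantissa bits).
#include <cmath>
#include <cstdio>

int main() {
    // 10-bit fixed point over [0,1]: 1024 evenly spaced levels.
    const double fx10_step = 1.0 / 1023.0;               // ~0.00098 everywhere
    const double fx10_max  = 1.0;                         // cannot store values above 1.0

    // FP16 in [1,2): spacing is 2^-10, comparable to the fixed format...
    const double fp16_step_near_one = std::ldexp(1.0, -10);  // ~0.00098
    // ...but the exponent gives it a far larger representable range.
    const double fp16_max = 65504.0;                       // largest finite half value

    std::printf("FX10  step ~%.6f, max %.1f\n", fx10_step, fx10_max);
    std::printf("FP16  step near 1.0 ~%.6f, max %.1f\n", fp16_step_near_one, fp16_max);
    // The practical difference for HDR is how bright a value the buffer can
    // hold, not just the step size near 1.0.
    return 0;
}
```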
 

darXoul

Senior member
Jan 15, 2004
702
0
0
Originally posted by: Elfear
Actually, if you look at the benches from this review, the X1900XTX does a pretty good job with HDR+AA, even at 1920x1200. Only Far Cry and SC:CT dip into the low 30s at that res.

I'm familiar with this review. I'm a pretty fps-sensitive FPS gamer (no pun intended) so pretty much in all games where I see HDR+AA results, we are on the verge of playability. I know that some people might even consider the performance in Far Cry comfortable enough (3x fps) but I do not.

 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
Originally posted by: darXoul

I'm familiar with this review. I'm a pretty fps-sensitive FPS gamer (no pun intended) so pretty much in all games where I see HDR+AA results, we are on the verge of playability. I know that some people might even consider the performance in Far Cry comfortable enough (3x fps) but I do not.

That's because you're a super-human mutant with vision that refreshes 300 times per second, so a game running at 80fps just looks like a slide show to you. :roll:

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
but why is the Nvidia hardware incapable of running FSAA + HDR at the same time?
It appears the ROPs lack the capability to take AA samples when rendering through an FP buffer.

If you want to know the exact units that are missing you probably need to ask a hardware engineer.
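For what it's worth, this limitation is visible from the application side too: under Direct3D 9 a game can ask the driver up front whether a multisampled floating-point render target is supported at all, which is roughly the combination HDR+AA needs. A minimal sketch (build against d3d9.lib; error handling kept to a minimum):

```cpp
// Minimal Direct3D 9 sketch: ask the driver whether 4x multisampling is
// supported on an FP16 (A16B16G16R16F) render target. On hardware whose ROPs
// cannot take AA samples from an FP buffer this check is expected to fail.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    DWORD qualityLevels = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,       // 64-bit floating-point render target format
        TRUE,                        // windowed
        D3DMULTISAMPLE_4_SAMPLES,
        &qualityLevels);

    if (SUCCEEDED(hr))
        std::printf("4x MSAA on an FP16 target reported as supported (%lu quality levels)\n",
                    static_cast<unsigned long>(qualityLevels));
    else
        std::printf("4x MSAA on an FP16 target reported as not supported\n");

    d3d->Release();
    return 0;
}
```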
 

darXoul

Senior member
Jan 15, 2004
702
0
0
Originally posted by: M0RPH
Originally posted by: darXoul

I'm familiar with this review. I'm a pretty fps-sensitive FPS gamer (no pun intended) so pretty much in all games where I see HDR+AA results, we are on the verge of playability. I know that some people might even consider the performance in Far Cry comfortable enough (3x fps) but I do not.

That's because you're a super-human mutant with vision that refreshes 300 times per second, so a game running at 80fps just looks like a slide show to you. :roll:

OMG! You blew my cover! First, the AEG affair and now THIS! Call the Police! No, wait. Call the android hunters!

On a serious note, 35-50 fps is not a convincing fps average for fast shooters in my book.
 

Jeff7181

Lifer
Aug 21, 2002
18,368
11
81
If Valve can do "HDR" and AA with Lost Coast, Day of Defeat: Source, cs_militia, etc. on current hardware why can't other game developers use similar methods? Regardless of whether it's true HDR or not, it works on current hardware, so why is Valve the only one doing it?

On a serious note, 35-50 fps is not a convincing fps average for fast shooters in my book.

I completely agree. 30 FPS is fine for RPG's and junk like that, but it just doesn't cut it for fast paced first person shooters like Counter-Strike: Source. Even in slower paced ones like FEAR it's fine when you're walking around and getting 40 FPS, but when you're trying to shoot things and they're shooting back at you, dips into the 30's and 40's are very noticeable.
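On the Valve question above: one plausible reading (an assumption here, not a description of how the Source engine actually does it) is that if the high-range lighting is tone-mapped down into the 0-1 range inside the pixel shader and written to an ordinary 8-bit-per-channel target, the normal MSAA path never touches a floating-point buffer, so AA keeps working on hardware that can't multisample FP targets. A rough sketch of that idea:

```cpp
// Rough sketch of the "tone map before writing to a plain 8-bit target" idea.
// This is NOT Valve's actual implementation -- just an illustration of how an
// HDR-looking result can end up in a buffer that ordinary MSAA can resolve.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>

// Simple Reinhard-style operator: maps [0, inf) into [0, 1).
static double toneMap(double hdrValue, double exposure) {
    const double v = hdrValue * exposure;
    return v / (1.0 + v);
}

// Quantize the tone-mapped value into one 8-bit channel, as a normal
// RGBA8 render target would store it.
static std::uint8_t toByte(double x) {
    return static_cast<std::uint8_t>(std::lround(std::clamp(x, 0.0, 1.0) * 255.0));
}

int main() {
    const double exposure = 0.5;
    // Scene luminances above 1.0 stand in for "HDR" brightness.
    for (double hdr : {0.25, 1.0, 4.0, 16.0}) {
        std::printf("scene %6.2f -> stored byte %3u\n",
                    hdr, static_cast<unsigned>(toByte(toneMap(hdr, exposure))));
    }
    return 0;
}
```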
 

SpeedZealot369

Platinum Member
Feb 5, 2006
2,778
1
81
Originally posted by: Jeff7181
If Valve can do "HDR" and AA with Lost Coast, Day of Defeat: Source, cs_militia, etc. on current hardware why can't other game developers use similar methods? Regardless of whether it's true HDR or not, it works on current hardware, so why is Valve the only one doing it?

On a serious note, 35-50 fps is not a convincing fps average for fast shooters in my book.

I completely agree. 30 FPS is fine for RPG's and junk like that, but it just doesn't cut it for fast paced first person shooters like Counter-Strike: Source. Even in slower paced ones like FEAR it's fine when you're walking around and getting 40 FPS, but when you're trying to shoot things and they're shooting back at you, dips into the 30's and 40's are very noticeable.

FEAR is such a hardware hog...
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
Originally posted by: darXoul

I'm familiar with this review. I'm a pretty fps-sensitive FPS gamer (no pun intended) so pretty much in all games where I see HDR+AA results, we are on the verge of playability. I know that some people might even consider the performance in Far Cry comfortable enough (3x fps) but I do not.

I agree with you that fast-paced FPS's need a lot higher than 30fps to be butter-smooth, but for single player action it's not that bad at all. Most of the games in the article are based on single player action, for which HDR+AA would be great. If the game has multiplayer capability, then just switch off the HDR since you're not really looking at the pretty effects anyway.
 

TraumaRN

Diamond Member
Jun 5, 2005
6,893
63
91
Originally posted by: Jeff7181
If Valve can do "HDR" and AA with Lost Coast, Day of Defeat: Source, cs_militia, etc. on current hardware why can't other game developers use similar methods? Regardless of whether it's true HDR or not, it works on current hardware, so why is Valve the only one doing it?

On a serious note, 35-50 fps is not a convincing fps average for fast shooters in my book.

I completely agree. 30 FPS is fine for RPG's and junk like that, but it just doesn't cut it for fast paced first person shooters like Counter-Strike: Source. Even in slower paced ones like FEAR it's fine when you're walking around and getting 40 FPS, but when you're trying to shoot things and they're shooting back at you, dips into the 30's and 40's are very noticeable.

Personally I wish they did it like HL; it looks more realistic to me. Far Cry, even when you turn the HDR down, still blooms too much for me, and things that shouldn't be (i.e. skin) seem overly bright.

On the other hand, big deal that it doesn't support AA + HDR. Like people have been saying, if you are playing an FPS online you want as many FPS as possible, so your eye candy is probably turned down anyway.
 

darXoul

Senior member
Jan 15, 2004
702
0
0
Originally posted by: Elfear
I agree with you that fast-paced FPS's need a lot higher than 30fps to be butter-smooth, but for single player action it's not that bad at all. Most of the games in the article are based on single player action, for which HDR+AA would be great. If the game has multiplayer capability, then just switch off the HDR since you're not really looking at the pretty effects anyway.

True that, although even in single player, I need more than 35-40 frames to truly enjoy a game. I'm picky, I know.

As for multiplayer, you're perfectly right. Q3 (which, of course, looks dated by now anyway) seems like a game from a different era on my comp ;) Everything turned off, virtually no textures due to high picmip, vertex lighting, etc. I don't look at graphics, aim efficiency is all that counts :)
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
I don't really consider 30 anywhere near acceptable either, at least in FPS games. I really like to have the minimums at 60 and am usually ready to turn down some settings for that, although it's very hard to get in some modern games so I settle for 45 or 50. I don't care about the averages at all, although they generally need to be over 100. For multiplayer games (I mostly play the UT games) I usually turn down the settings a lot so it stays around 80 or 100; you don't really notice the graphical quality when playing online anyway.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
For single player I can handle 30+, but for multiplayer I crank down the settings so I can get at least 50.