Firingsquad Publishes HDR + AA Performance Tests


Sonikku

Lifer
Jun 23, 2005
15,882
4,882
136
Originally posted by: josh6079
I've learned that it doesn't pay to argue with Nvidiots. If Wreckage and Crusader want to think that HDR+AA is still unplayable, that's for them to decide. Kudos for everyone else who actually gets to enjoy it.


A very sensible post. Kudos.
 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
Will G80 support HDR+AA? NVIDIA just can't make the same mistake twice. Oh and for all you idiots attacking NVIDIA, remember when the 6800 series came out? Don't tell me you sprang for the X-series... I rest my case...
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: Crusader
I agree you can't compare different timedemos; however, ST's FRAPS numbers stand on their own as pretty good evidence that Oblivion HDR+AA is not really "playable" at 19x12 on a single X1900.

However you want to spin this, the fact of the matter is that HDR+AA is still likely limited to big-bucks CrossFire rigs, and the issues with that far overshadow the benefit of HDR+AA in Oblivion and Far Cry.

No, it isn't. I borrowed an XTX to test out and found it played Oblivion great at my res w/ HDR+AA (1680x1050), so I bought one of my own.

You're a serial liar. Stop posting.
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
Originally posted by: Nightmare225
Oh and for all you idiots attacking NVIDIA, remember when the 6800 series came out? Don't tell me you sprang for the X-series... I rest my case...

Not sure what you mean by this, but I tried both series of cards and found the X*** series to be faster.
 
Jun 14, 2003
10,442
0
0
Originally posted by: thilan29
Originally posted by: Wreckage
Actually, I would consider FS the most red-biased site on the net. It's no surprise that the red fanboys here always quote that site above all others.

You're joking, right?? Now FS is the most red-biased?? I thought it was supposedly B3D that was so red-biased?? Next it'll be AT that's the most red-biased?? I guess any site that paints ATI in a good light is biased, huh?? So I guess any site that reviewed the 7950 and said it performs well is biased too, right??

And actually, no, you're generalizing when you say it is quoted above all others. You pulled that statement out of your a$$.

And what's been stated in the review has been stated by several people on this forum who have played Oblivion with HDR+AA. It's just that some fanboys thought they were lying. If you had an ATI card maybe you could test it out too.


Everyone should take every single review site with a pinch of salt anyway.

They give more roundabout answers than definite answers.
 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
Originally posted by: Nightmare225
The 6800 series was superior to the X800 series in both capabilities and speed.

Did you own both series? Did you ever look at any benchmarks comparing the two cards? If by capabilities you mean SM3.0, then you are correct, but at the time (and currently, for the most part) it was an almost useless feature on the 6800 series.

I went from a 6800GT at Ultra speeds (probably a tad faster than an actual Ultra because of tighter RAM timings) to an X800XT PE and there was a noticeable difference in speed. I remember Doom 3 ran better on the Nvidia cards, but the majority of games ran better on the X800XT. I have pages of my own benchmarks comparing the two, and the X800XT PE came out on top 90% of the time. The X850XT PE was faster still.
 

fierydemise

Platinum Member
Apr 16, 2005
2,056
2
81
Originally posted by: Nightmare225
The 6800 series was superior to the X800 series in both capabilities and speed.
Performance was just about equal (depending on the game), and price/performance could go either way depending on the price point. The 6-series did have more features, but most were arguably unimportant because of a lack of support or insufficient performance to take advantage of those capabilities.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
For the 6800 vs. X800 series, both parties had their pros and cons.

Nvidia

Pros
Shader Model 3.0 support, OpenEXR FP16 HDR support, excellent OpenGL performance, great DirectX performance, SLI support with the introduction of PCI-E

Cons
Higher power consumption (~80W), dual-slot solution (6800 Ultra), a tad slower in Direct3D games, poor availability on the 6800 Ultra initially

ATI
Pros
Lower power consumption (~65-70W), great Direct3D performance, shorter PCB

Cons
Lack of Shader Model 3.0 and OpenEXR FP16 HDR, slower in OpenGL games, very bad availability on the X800 XT PE for some time initially, no CrossFire until very late in the life cycle.

After Nvidia screwed up and ATI got its act together, it seems both companies have done well and not made any more dramatic mistakes. They are quite competitive, with arguments that could be made for each side depending on which angle you look at it from.
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: josh6079
I agree that one X1900XT(X) is probably not enough to have a decent gaming experience at 19x12 with max settings and 4xAA+HDR. However, I haven't seen many people who have a 24" monitor and a single X1900. I know that my single XTX at 1680x1050 with 4xAA+HDR, with visual-enhancing mods to boot, is doing very, very well. I wouldn't want to bump up the resolution any more than where I have it though, since--like you say--it probably isn't going to give me that great of a gaming experience.

Do you think that 19x12 is the "default" resolution that one needs in order to have a decent gaming experience?

Don't listen to his ignorance. Oblivion doesn't even use timedemos. Before, he posted that even Crossfire rigs couldn't get playable frames with HDR+AA at 1920x1200. Now he's backtracking to a single card. The simple fact is, ATi can do HDR+AA in Oblivion, and NV can't. And that's why he's upset and tries to belittle or "spin" HDR+AA.

FS's numbers speak for themselves. HDR+AA is playable on various ATi cards.

 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
I'll still be happy with my 7950GX2, right? I heard it's a great card, despite all of this HDR+AA talk...
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: Nightmare225
I'll still be happy with my 7950GX2, right? I heard it's a great card, despite all of this HDR+AA talk...


Sure, the GX2 IS a great card, as long as you only want raw performance and don't care about image quality. If you do, then go ATI.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Ackmed
Don't listen to his ignorance. Oblivion doesn't even use timedemos. Before, he posted that even Crossfire rigs couldn't get playable frames with HDR+AA at 1920x1200. Now he's backtracking to a single card. The simple fact is, ATi can do HDR+AA in Oblivion, and NV can't. And that's why he's upset and tries to belittle or "spin" HDR+AA.

FS's numbers speak for themselves. HDR+AA is playable on various ATi cards.

He is right to a certain extent. The X1900 isn't God's card, unstoppable no matter what resolution you put to it. It has a limit, just like every other piece of hardware made. I myself haven't tried a 1920x1200 LCD with max Oblivion settings, so I am of course just speculating. I think that 19x12 is just a difficult resolution for gaming, since single cards are stretched a little thin when presented with demanding games. I could be wrong of course, but my 1680x1050 with all of my settings gets me playable frames. I just think that if I were to increase the resolution, those frames would drop more than what I--or anyone else--would want them to.

I am still curious though, Crusader: do you think that 1920x1200 is the default resolution one needs to get a decent gaming experience?
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Sure, it's not going to be playable to some people, yet it is to others. He is basing it on pure speculation, with no review to back him up. First, he claimed no ATi user could get playable frames at that res and those settings. Now it's back down to one card. It's laughable, backtracking to try and save face. People are posting that they get playable frames. FS has even posted an article, with great frames at 1600x1200. He has not a clue, and doesn't even have a card or monitor to run those settings. Calling HDR+AA useless is more ignorance, which he is full of. Please don't quote him... I can't skip his avatar when he gets quoted.

The whole point of this thread is, ATi can play Oblivion and Far Cry with HDR+AA and get very playable frames. With various cards. That's the bottom line, and it doesn't sit well with NV "fans". So what do they do? Try to belittle it, which is pretty sad.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Nightmare225
I'll still be happy with my 7950GX2, right? i heard it's a great card, despite all of this HDR+AA talk...

It's the best single card on the market. The ATI-only HDR+AA isn't even officially supported in Oblivion. (You can use bloom in Oblivion.) Other than that, very few games support it. You can still see HDR+AA on your card in Day of Defeat: Source and HL2: Episode One. You can use Transparency AA and SLI AA for superior image quality.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Ackmed

The whole point of this thread is, ATi can play Oblivion and Far Cry with HDR+AA and get very playable frames. With various cards. That's the bottom line, and it doesn't sit well with NV "fans". So what do they do? Try to belittle it, which is pretty sad.
So an X1600 will do HDR+AA at 1920x1200 at 100fps....wow :roll:

I don't disagree that it's playable on a high-end CrossFire system or with an XTX and a $1000 CPU. However, with the average system, I doubt it. I also don't see anybody spending several thousand dollars to play at 1280x1024.

True, I would like to see results from other websites, and it's good that ATI got the HDR+AA ball rolling. However, it's not even close to mainstream.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Are you guys paid to consistently lie/obfuscate, or what?


I can play Oblivion with great success with 2xAA or 4xAA and HDR plus a bunch of visual mods, with shadows enabled and everything on high, at 1680x1050.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
For what it's worth, I have an X1900 but I doubt I'll ever use the HDR+AA feature. I tried it out in Far Cry once and was finding that it worked okay at 1280x960 but was too slow at any higher resolution. This is more due to the Far Cry HDR itself, which causes a large performance hit particularly on ATI cards. The additional drop from AA is actually pretty minimal. I don't have Oblivion and have no plans to buy it, but from the benchmarks I have seen I would probably end up running it around 800x600 and use neither HDR nor AA in order to get decent performance. Although if SCCT supported HDR+AA I would have probably used it there, as the HDR and AA by themselves don't hurt performance much on the X1900 cards.

It's a little pointless to argue about what kind of performance is "playable," as pretty much any framerate is playable if you're used to it. I used to play an old Mac 3D racing game called Vette many years ago and the framerate I got was generally between 2 and 5, but I didn't care one bit. :D These days I rarely tolerate any minimum point under 50fps before I drop down some settings and see if it gets any better (I can play with less if I have to, but I look for rather more than playable performance when buying expensive video cards). It's certainly better to have the HDR+AA feature than not in any case and obviously many people do find the Far Cry and Oblivion framerates quite acceptable.
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,254
126
Originally posted by: CP5670
For what it's worth, I have an X1900 but I doubt I'll ever use the HDR+AA feature. I tried it out in Far Cry once and was finding that it worked okay at 1280x960 but was too slow at any higher resolution. This is more due to the Far Cry HDR itself, which causes a large performance hit particularly on ATI cards. The additional drop from AA is actually pretty minimal. I don't have Oblivion and have no plans to buy it, but from the benchmarks I have seen I would probably end up running it around 800x600 and use neither HDR nor AA in order to get decent performance. Although if SCCT supported HDR+AA I would have probably used it there, as the HDR and AA by themselves don't hurt performance much on the X1900 cards.

It's a little pointless to argue about what kind of performance is "playable," as pretty much any framerate is playable if you're used to it. I used to play an old Mac 3D racing game called Vette many years ago and the framerate I got was generally between 2 and 5, but I didn't care one bit. :D These days I rarely tolerate any minimum point under 50fps before I drop down some settings and see if it gets any better (I can play with less if I have to, but I look for rather more than playable performance when buying expensive video cards). It's certainly better to have the HDR+AA feature than not in any case and obviously many people do find the Far Cry and Oblivion framerates quite acceptable.

I didn't go below 25fps with HDR on at 1280x1024 with my lowly X1800XL... I DOUBT you would have to go to 800x600 to get playable framerates with an X1900... that's being a bit dramatic.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
That's being ridiculous. With a ton of visual mods and all settings on high at 1680x1050, I get 30fps outdoors and 60-80+ in the other parts.

800x600, give me a break


My last card, a 7800GTX @ 490/1380, would run Oblivion reasonably well at 4xAA with bloom, or HDR and no AA, with 8x anisotropic filtering. The only main problem with it was that the minimum frames were low, particularly when fighting outdoors.
 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
Well, 30fps is far below my acceptable threshold. Although you're right, it looks like I didn't read that Firingsquad review properly. For some reason I thought the minimum scores shown at the bottom were for 1024x768. It looks like 1280x960 should in fact give pretty decent performance, although with HDR alone rather than HDR+AA.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: Wreckage
Originally posted by: Nightmare225
I'll still be happy with my 7950GX2, right? I heard it's a great card, despite all of this HDR+AA talk...

It's the best single card on the market. The ATI-only HDR+AA isn't even officially supported in Oblivion. (You can use bloom in Oblivion.) Other than that, very few games support it. You can still see HDR+AA on your card in Day of Defeat: Source and HL2: Episode One. You can use Transparency AA and SLI AA for superior image quality.

Come on man, you must work for Nvidia.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: CP5670
Well, 30fps is far below my acceptable threshold. Although you're right, it looks like I didn't read that Firingsquad review properly. For some reason I thought the minimum scores shown at the bottom were for 1024x768. It looks like 1280x960 should in fact give pretty decent performance, although not with HDR+AA.

It's different with Oblivion though. It's smooth at 20+ fps. You'd have to play it to understand; it's not like BF2 or HL2, etc. Plenty here will agree.