FiringSquad Publishes HDR + AA Performance Tests


imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Crusader
No thanks RobertR1, but it's interesting you call men "sweetie" and want them to "sit on your lap".

It's also interesting you played 75 hours of Oblivion at a setting HardOCP deemed "unplayable". ;)

http://enthusiast.hardocp.com/article.html?art=MTA4Myw1LCxoZW50aHVzaWFzdA

Originally posted by: ST
In my testing of an x1900xt, I found a discernible difference when enabling HDR + AA. These numbers were Frapped directly from a set course I used in Oblivion outdoors (10 min run) at 1920x1080p (my LCDTV's native resolution) w/ all image settings maxed:

HDR w/ No AA 8x AF :

ATI Stock 621MHz Core - 720MHz Mem (HQAF)
Frames, Time (ms), Min, Max, Avg
13968, 543034, 2, 57, 25.722

ATI OC 655MHz Core - 792MHz Mem (HQAF)
Frames, Time (ms), Min, Max, Avg
14778, 540610, 16, 58, 27.336


HDR-4X AA 8x HQAF :

ATI Stock 621MHz Core - 720MHz Mem
Frames, Time (ms), Min, Max, Avg
10539, 562581, 0, 36, 18.733

ATI OC 655MHz Core - 792MHz Mem
Frames, Time (ms), Min, Max, Avg
11492, 549567, 12, 45, 20.911


At higher resolutions, you will see an even steeper drop in framerate. Note that I'm not knocking the X1900s, as I prefer the superior HDR+AA features, but some folks are really exaggerating the performance delta. I would assume the next gen of vid cards (x1950xt?) w/ the higher clock rates will rectify this though, as evidenced by the OC fps numbers.
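A quick way to sanity-check these FRAPS summaries: the Avg column is just total frames divided by total time in seconds. A minimal Python sketch of that arithmetic, using the figures above; the run labels and the 4xAA percentage are my own working, not ST's:

# Average fps = frames / (time in ms / 1000), from ST's FRAPS summaries above.
runs = {
    "stock 621/720, HDR, no AA":  (13968, 543034),
    "OC 655/792, HDR, no AA":     (14778, 540610),
    "stock 621/720, HDR + 4xAA":  (10539, 562581),
    "OC 655/792, HDR + 4xAA":     (11492, 549567),
}
for label, (frames, ms) in runs.items():
    print(f"{label}: {frames / (ms / 1000.0):.2f} fps avg")
# -> 25.72, 27.34, 18.73, 20.91, matching the Avg column above.

# Cost of adding 4xAA on top of HDR at stock clocks:
no_aa, with_aa = 13968 / 543.034, 10539 / 562.581
print(f"4xAA hit at stock: {(1 - with_aa / no_aa) * 100:.1f}% lower average")  # ~27%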

Hey, it's not just me and HardOCP saying it. Our own ST has shown us how much of a slide show Oblivion is at those settings on a X1900XT clocked at 655/792.

Don't they warn kids with epilepsy about flashing strobe lights like that?
 
spank

Apr 6, 2006
32
0
0
Originally posted by: Crusader
Hey, it's not just me and HardOCP saying it. Our own ST has shown us how much of a slide show Oblivion is at those settings on a X1900XT clocked at 655/792.

Don't they warn kids with epilepsy about flashing strobe lights like that?

okies, oblivion with HDR+AA is a slideshow at those settings, but they are also a slideshow on a nvidia card with those settings but without AA.
 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
The thing is that Oblivion is barely playable with max settings above 1280x1024. You and I know it. That's why you need Crossfire/SLI. Of course, since Oblivion is an extremely demanding game, you can cut current-gen cards some slack. I get 40-60 FPS (vsync + triple buffering) at 1680x1050 with HDR+4xAA and 16xAF plus 4096x4096 LOD texture mods. From my personal experience, the game is mostly CPU bound. Bring on Conroe! BTW, if you disagree with me you are wrong. If I weren't so lazy, I'd prove my performance. Will do if requested by many.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: spank
Originally posted by: Crusader
Hey, it's not just me and HardOCP saying it. Our own ST has shown us how much of a slide show Oblivion is at those settings on a X1900XT clocked at 655/792.

Don't they warn kids with epilepsy about flashing strobe lights like that?

okies, oblivion with HDR+AA is a slideshow at those settings, but they are also a slideshow on a nvidia card with those settings but without AA.

Spank,
The stock 7950 GX2 gets over 20% higher framerates at 19x12 HDR with 0xAA/16xAF in the most grueling Oblivion test:

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page10.asp

In a standard outdoor scene the GX2 gets almost triple the HDR+AA average of the OC'd XT that ST was kind enough to bench for us:

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page9.asp
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: Crusader
Originally posted by: spank
Originally posted by: Crusader
Hey, it's not just me and HardOCP saying it. Our own ST has shown us how much of a slide show Oblivion is at those settings on a X1900XT clocked at 655/792.

Don't they warn kids with epilepsy about flashing strobe lights like that?

okies, oblivion with HDR+AA is a slideshow at those settings, but they are also a slideshow on a nvidia card with those settings but without AA.

Spank,
The stock 7950 GX2 gets over 20% higher at 19X12 HDR 0X16X at the most grueling Oblivion test:

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page10.asp

What's it like to be you I wonder. I hope you are paid for this crap.


The 7950GX2 gets 20% higher framerates with HDR and no AA than an XTX gets WITH both AA and HDR. That makes the XTX look good or the GX2 look bad.

Glad I went with an XTX, knowing what kind of performance hit the 7900/7800 series takes with AA enabled. If it could even enable it, that is.

 

LittleNemoNES

Diamond Member
Oct 7, 2005
4,142
0
0
Originally posted by: Frackal
Originally posted by: Crusader
Originally posted by: spank
Originally posted by: Crusader
Hey, it's not just me and HardOCP saying it. Our own ST has shown us how much of a slide show Oblivion is at those settings on a X1900XT clocked at 655/792.

Don't they warn kids with epilepsy about flashing strobe lights like that?

okies, oblivion with HDR+AA is a slideshow at those settings, but they are also a slideshow on a nvidia card with those settings but without AA.

Spank,
The stock 7950 GX2 gets over 20% higher at 19X12 HDR 0X16X at the most grueling Oblivion test:

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page10.asp

What's it like to be you I wonder. I hope you are paid for this crap.


7950GX2 get's 20% higher framerates with no AA/HDR than an XTX gets WITH AA and HDR. That makes the XTX look good or the Gx2 look bad

Glad I went with an XTX knowing what kind of performance hit the 79/7800 series takes with AA enabled. If it could that is

I think the bigger point is that one card with built-in SLI vs. a single card is cheap. Not fair to compare the x1900xtx vs the 7950.

x1900xtx vs 7900GTX = Fair
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Crusader
Originally posted by: spank
Originally posted by: Crusader
Hey, it's not just me and HardOCP saying it. Our own ST has shown us how much of a slide show Oblivion is at those settings on a X1900XT clocked at 655/792.

Don't they warn kids with epilepsy about flashing strobe lights like that?

okies, oblivion with HDR+AA is a slideshow at those settings, but they are also a slideshow on a nvidia card with those settings but without AA.

Spank,
The stock 7950 GX2 gets over 20% higher at 19X12 HDR 0X16X at the most grueling Oblivion test:

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page10.asp

At a standard outdoor scene the GX2 gets almost triple the average of HDR+ AA on the OCd XT that ST was kind enough to bench for us:

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page9.asp

Now, are you just pretending that a single x1900xtx is as slow in Oblivion as a single 7900gtx or is there some magic trick I gotta do to show the scores for an xtx in the benches you linked?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: gersson

I think the bigger point is 1 card with built in SLi vs a single card is teh cheap. Not fair to compare x1900xtx vs 7950

Comparing NVIDIA's top card to ATI's top card is not fair? That's kinda funny.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Wreckage
Originally posted by: gersson

I think the bigger point is 1 card with built in SLi vs a single card is teh cheap. Not fair to compare x1900xtx vs 7950

Comparing NVIDIA's top card to ATI's top card is not fair? That's kinda funny.

It isn't when you take into account the price points. The 7950GX2 costs more than the X1900XTX, so one would hope that the more expensive product would at least perform better, since we all know it doesn't look better. Granted, ATI has yet to release a dual-GPU product like the 7950GX2. I personally would have gone with the GX2 had it brought any different features to the table. In short, it's a 7800 on steroids with another 7800 on steroids spotting it.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: gersson
Originally posted by: Frackal
Originally posted by: Crusader
Originally posted by: spank
Originally posted by: Crusader
Hey, it's not just me and HardOCP saying it. Our own ST has shown us how much of a slide show Oblivion is at those settings on a X1900XT clocked at 655/792.

Don't they warn kids with epilepsy about flashing strobe lights like that?

okies, oblivion with HDR+AA is a slideshow at those settings, but they are also a slideshow on a nvidia card with those settings but without AA.

Spank,
The stock 7950 GX2 gets over 20% higher at 19X12 HDR 0X16X at the most grueling Oblivion test:

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page10.asp

What's it like to be you I wonder. I hope you are paid for this crap.


7950GX2 get's 20% higher framerates with no AA/HDR than an XTX gets WITH AA and HDR. That makes the XTX look good or the Gx2 look bad

Glad I went with an XTX knowing what kind of performance hit the 79/7800 series takes with AA enabled. If it could that is

I think the bigger point is 1 card with built in SLi vs a single card is teh cheap. Not fair to compare x1900xtx vs 7950

x1900xtx vs 7900GTX = Fair

I wasn't trying to imply anything about the ATI card, Gersson. It's a fine piece of forward-looking engineering with some very nice features.

I was merely pointing out, in response to Spank's assertion that none could, that an nVidia card can run those settings at better framerates. The XFX XXX would likely be highly playable at those settings, given its huge stock OC.
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Originally posted by: Crusader
Originally posted by: RobertR1
I played through 75+ hours of Oblivion using 4xAA + 8xHQAF at 1920x1200. Works fine.

Crusader, sweetie, wanna come sit on my lap while daddy drives the mouse and keyboard so you can see the pretty picture???

No thanks RobertR1, but it's interesting you call men "sweetie" and want them to "sit on your lap".

It's also interesting you played 75 hours of Oblivion at a setting HardOCP deemed "unplayable". ;)

http://enthusiast.hardocp.com/article.html?art=MTA4Myw1LCxoZW50aHVzaWFzdA

Yes, because my life is based on what HardOCP tells me.
Wink away, princess. For a guy who believes Bethesda should determine how the game is played, why are you even concerned with HDR+AA, since it's not officially supported?


 

Tig Ol Bitties

Senior member
Feb 16, 2006
305
0
0
*sigh* another NV vs. ATI thread; might as well participate, since this is apparently the only issue that keeps the video forum alive... at least it's somewhat civil. I'm just wondering how many of the "Nvidiots" have actually used a 1900 series card instead of just relying on other websites' benchmarks as facts. I find Ackmed, Frackal, hell even Joker (being that they are ATI fanboys, Frackal less so) have more credibility in their views of both camps because, quite simply, they've owned cards on both sides. I'm pretty certain Crusader has never owned an ATI card after that retarded post (it truly was) he made about being un-American for choosing ATI. Not so sure about Wreckage, but he seems pretty certain of his findings strictly through websites, which could ALL be considered biased by his own definition of the term. My point: I take the ATI fanboys' views more seriously than Crusader's or Wreckage's because they are actual witnesses to both sides and they give far more mature responses.

But hey, I'm not so blinded that I just rely on their take on ATI's better IQ and the whole HDR+AA shebang. I helped my cousin build his rig (got a 1900xtx) and he passed off to me his 7800 gtx to mate with my first one. Granted I get more fps in most games, but I certainly do notice the superior IQ and sharp colors of ATI (of course this can be rather subjective to others). HDR+AA ran just fine in Oblivion at 4x16x @1600x1200 with a tweaked .ini file...rarely did it ever dip below 25fps...maxed settings except shadows, which were either low or off. What may seem unplayable to HardOCP may be very playable to others and besides, each computer setup is different...I certainly don't run what HardOCP runs in their test setup and my numbers are better than theirs. I love my cards, but I have to admit ATI has the edge with the 1900s...if I were buying a GPU today with nothing in my rig, I'd go for the X1900XT solution; there are ways to get around the heat and noise.

So Wreckage and Crusader can go on and on linking those sites as backup all day if they want; I'll still know that first-hand use, which they likely have none of with ATI, wins over anything they have to say, as I'm sure most of us realize. Just as ATI's 1900s have an edge over NV's 7900s, the ATI fanboys have an edge in credibility over the Nvidiots in this thread IMO.

Oh, and comparing the 7950 to the 1900xtx is a weak point/argument... that's basically two cards in SLI against a single card, and I'm more impressed with ATI's performance in that comparison. Good job :thumbsdown:

Edited for missed content :eek:
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: munky
Originally posted by: Crusader
Originally posted by: spank
Originally posted by: Crusader
Hey, it's not just me and HardOCP saying it. Our own ST has shown us how much of a slide show Oblivion is at those settings on a X1900XT clocked at 655/792.

Don't they warn kids with epilepsy about flashing strobe lights like that?

okies, oblivion with HDR+AA is a slideshow at those settings, but they are also a slideshow on a nvidia card with those settings but without AA.

Spank,
The stock 7950 GX2 gets over 20% higher at 19X12 HDR 0X16X at the most grueling Oblivion test:

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page10.asp

At a standard outdoor scene the GX2 gets almost triple the average of HDR+ AA on the OCd XT that ST was kind enough to bench for us:

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page9.asp

Now, are you just pretending that a single x1900xtx is as slow in Oblivion as a single 7900gtx or is there some magic trick I gotta do to show the scores for an xtx in the benches you linked?

Nice job comparing a dual card using HDR only to a single card using both HDR + AA... let me enlighten you:

1600x1200

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page10.asp

7950 GX2 - 32.4

http://www.firingsquad.com/hardware/hdr_aa_ati_radeon_x1k/page7.asp

x1900 - 30.2

Wow, some TRIPLE gain huh? oh wait... cherry picking rules

:roll:
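(Working that out from the two quoted FiringSquad numbers: (32.4 - 30.2) / 30.2 ≈ 7%, nowhere near triple.)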
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Tig Ol Bitties,

In their heads, Oblivion must be played with the exact settings and tweaks that make them look right.

 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Tig Ol Bitties
*sigh* another NV vs. ATI thread, might as well participate since this is the only issue that keeps video alive apparently...at least it's somewhat civil. I'm just wondering how many of the "Nvidiots" have actually used the 1900 series card instead of just relying on other website benchmarks as facts. I find Ackmed, Frackal, hell even Joker (being that they are ATI fanboys, Frackal less so) have more credibility in their views of both camps because quite simply they've owned cards on both sides. I'm pretty certain Crusader has never owned a ATI card after that retarded post (it truly was) he made about being un-American for choosing ATI. Not so sure about Wreckage, but he seems pretty certain of his findings strictly through websites which can be considered ALL biased based on his definition of the term. My point, I take the ATI fanboys views more seriously than Crusader or Wreckage because they are actually witnesses to both sides and they have far more mature responses.

But hey, I'm not so blindsided to just rely on their take of ATI's better IQ and the whole HDR+AA shabang. I helped my cousin build his rig (got a 1900xtx) and he passed off to me his 7800 gtx to mate with my first one. Granted I get more fps in most games, but I certainly do notice the superior IQ and sharp colors of ATI(of course this can be rather subjective to others). HDR+AA ran just fine on Oblivion 4x16x @1600x1200 with a tweaked .ini file...rarely did it ever dip below 25fps...maxed settings except shadows, which were either low or off. What may seem unplayable to HardOCP may be very playable to others and besides, each computer setup is different...I certainly don't run what HardOCP runs in their test setup and my numbers are better than theirs. I love my cards, but I have to admit ATI has the edge with the 1900's...if I was buying a GPU today with nothing in my rig, I'd go for the X1900XT solution, there are ways to get around the heat and noise.

So Wreckage and Crusader can go on and on linking those sites as backup all day if they want, I'll still know that 1st-hand-use, which they likely have none of with ATI, wins over anything they have to say, as I'm sure most of us realize. Just as ATI's 1900s have an edge over NV's 7900s, the ATI fanboys have an edge in credibility over the Nvidiots in this thread IMO.

Oh, and comparing the 7950 to 1900xtx is weak point/argument...comparing two cards pretty much in SLi against a single card, I'm more impressed with ATI's performance in the comparison. Good job :thumbsdown:

Edited for missed content :eek:

WHS
 

thilanliyan

Lifer
Jun 21, 2005
12,040
2,255
126
Originally posted by: Tig Ol Bitties
HDR+AA ran just fine on Oblivion 4x16x @1600x1200 with a tweaked .ini file...rarely did it ever dip below 25fps...maxed settings except shadows, which were either low or off. What may seem unplayable to HardOCP may be very playable to others and besides, each computer setup is different...I certainly don't run what HardOCP runs in their test setup and my numbers are better than theirs.

About the shadows, I also noticed a large performance hit with them on. I turned them completely off and my fps also rarely dropped below 25fps.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: ShadowOfMyself
Originally posted by: munky
Originally posted by: Crusader
Originally posted by: spank
Originally posted by: Crusader
Hey, it's not just me and HardOCP saying it. Our own ST has shown us how much of a slide show Oblivion is at those settings on a X1900XT clocked at 655/792.

Don't they warn kids with epilepsy about flashing strobe lights like that?

okies, oblivion with HDR+AA is a slideshow at those settings, but they are also a slideshow on a nvidia card with those settings but without AA.

Spank,
The stock 7950 GX2 gets over 20% higher at 19X12 HDR 0X16X at the most grueling Oblivion test:

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page10.asp

At a standard outdoor scene the GX2 gets almost triple the average of HDR+ AA on the OCd XT that ST was kind enough to bench for us:

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page9.asp

Now, are you just pretending that a single x1900xtx is as slow in Oblivion as a single 7900gtx or is there some magic trick I gotta do to show the scores for an xtx in the benches you linked?

Nice job on comparing a dual card using HDR only to a single card using both hdr + aa... let me enlighten you

1600x1200

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page10.asp

7950 GX2 - 32.4

http://www.firingsquad.com/hardware/hdr_aa_ati_radeon_x1k/page7.asp

x1900 - 30.2

Wow, some TRIPLE gain huh? oh wait... cherry picking rules

:roll:

ShadowofMyself,
Please re-read the thread. I was responding to Spank's assertion that no nVidia card could run Oblivion at 19x12 HDR 16xAF; obviously the GX2 can.

The "triple framerates" I was referring to are between ST's 20fps average on an outdoor scene and FS's 54.4 fps average on the Mountain demo.

No one is talking about 16x12 anything; we were talking about a single ATI card's inability to run HDR+AA at 19x12, the default resolution for people with decent gaming monitors these days.

(and RobertR1's jailhouse-shower-talk-laced assertion that he can indeed run Oblivion at those settings, contrary to what FS, HardOCP, and our own ST have found)
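(For reference, the arithmetic behind that claim, using the two averages cited in this thread: 54.4 / 20.9 ≈ 2.6x. As the replies note, the two figures come from different scenes, systems, and timedemos, so they aren't directly comparable.)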
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Crusader
No one is talking about 16X12 anything, we were talking about a single ATI card's inability to run HDR+AA at 19X12, the default resolution for people with decent gaming monitors these days.

I don't know; the NEC 20WMGX2, ViewSonic VX2025, Dell 2005wfp's such as yours, etc. seem to be more popular than the 24" monitors. Just look at xtknight's LCD guide: there isn't a single gaming monitor his sticky recommends that can do 1920x1200. If you think 1920x1200 is the "default" resolution needed for a "decent" gaming experience, perhaps you yourself should invest in one, seeing as how yours is like most others' here.
 

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
Originally posted by: Crusader
Originally posted by: ShadowOfMyself
Originally posted by: munky
Originally posted by: Crusader
Originally posted by: spank
Originally posted by: Crusader
Hey, it's not just me and HardOCP saying it. Our own ST has shown us how much of a slide show Oblivion is at those settings on a X1900XT clocked at 655/792.

Don't they warn kids with epilepsy about flashing strobe lights like that?

okies, oblivion with HDR+AA is a slideshow at those settings, but they are also a slideshow on a nvidia card with those settings but without AA.

Spank,
The stock 7950 GX2 gets over 20% higher at 19X12 HDR 0X16X at the most grueling Oblivion test:

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page10.asp

At a standard outdoor scene the GX2 gets almost triple the average of HDR+ AA on the OCd XT that ST was kind enough to bench for us:

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page9.asp

Now, are you just pretending that a single x1900xtx is as slow in Oblivion as a single 7900gtx or is there some magic trick I gotta do to show the scores for an xtx in the benches you linked?

Nice job on comparing a dual card using HDR only to a single card using both hdr + aa... let me enlighten you

1600x1200

http://www.firingsquad.com/hardware/geforce_7950_gx2_preview/page10.asp

7950 GX2 - 32.4

http://www.firingsquad.com/hardware/hdr_aa_ati_radeon_x1k/page7.asp

x1900 - 30.2

Wow, some TRIPLE gain huh? oh wait... cherry picking rules

:roll:

ShadowofMyself,
Please re-read the thread. I was responding to Spank's assertion no nVidia card could run Oblivion at 19X12 HDR 16X, obviously the GX2 can.

The "triple framerates" I was referring to are between STs 20fps average on an outdoor scene and FS's 54.4 fps average on the Mountain demo.

No one is talking about 16X12 anything, we were talking about a single ATI card's inability to run HDR+AA at 19X12, the default resolution for people with decent gaming monitors these days.

(and RobertR1s jailhouse shower talk laced assertion he can indeed run Oblivion at those settings, contrary to what FS, HardOCP, and our own ST have found)


You're totally full of crap.


You're comparing:


A: 1 outdoor scene run on one system by a user here in which the XTX does poorly

with

B: A different outdoor scene run on another system by a benchmarking site in which the GX2 does well


As a direct comparison between the two cards?


You are so obnoxious it's difficult to characterize, you freaking troll.


The gall you're demonstrating is insane. IMO you are either nuts, or a paid agent.

 

Elfear

Diamond Member
May 30, 2004
7,163
819
126
Originally posted by: josh6079

I don't know, the NEC 20WMGX2, ViewSonic VX2025, the Dell 2005wfp's such as yours, etc. seem to be more popular than the 24" monitors. Just look at xtkinght's LCD guide and there isn't a single gaming monitor that his sticky recommends that can do 1920x1200. If you think that 1920x1200 is the "default" resolution in order to have a "decent" gaming resoluiton, perhaps you yourself should invest in one beeings how yours is like most others here.


QFT


Originally posted by: Wreckage

Comparing NVIDIA's top card to ATI's top card is not fair? That's kinda funny.

Lol. Because we all know competition is determined by an arbitrary list and not price point huh?
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
Originally posted by: spank
okies, oblivion with HDR+AA is a slideshow at those settings, but they are also a slideshow on a nvidia card with those settings but without AA.

It's not in Crossfire. Some people just can't accept the facts. Jealousy is a bad thing.

Speaking of frames in Oblivion, X1900's easily have the advantage.

http://www.xbitlabs.com/articles/video/display/asus-eax1900xtx_8.html

1600x1200 HDR + 16xAF (min/avg fps)
Indoor:
X1900XT: 35/54.5
7900GTX: 25/53.7

Outdoor:
X1900XT: 21/35.5
7900GTX: 14/36

Nice, 14fps minimum frames... speaking of what's playable and what's not. Add to that the fact that ATi is running with higher IQ settings in the drivers.
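(Working those quoted numbers out: the X1900XT's minimums are 35 vs 25 indoors, about 40% higher, and 21 vs 14 outdoors, about 50% higher, while the averages are within roughly 1.5% of each other either way.)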

 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
I agree you can't compare different timedemos; however, ST's FRAPS run stands on its own as pretty good evidence that Oblivion HDR+AA is not really "playable" at 19x12 on a single X1900.

Any way you guys want to spin this, the fact of the matter is that HDR+AA is still likely limited to big-bucks Crossfire rigs, and the issues with that far overshadow the benefit of HDR+AA in Oblivion and Far Cry.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Crusader
I agree you can't compare different timedemos; however, STs FRAPs stands on it's own as pretty good evidence Oblivion HDR+AA is not really "playable" at 19X12 on a single X1900.

Anyway you guys want to spin this, the fact of the matter is HDR+AA is still likely limited to big bucks Crossfire rigs, and the issues with that far overshadow the benefit of HDR+AA at Oblivion and Far Cry.

I agree that one X1900XT(X) is probably not enough to have a decent gaming experience at 19x12 with max settings and 4xAA+HDR. However, I haven't seen many people who have a 24" monitor and a single X1900. I know that my single XTX at 1680x1050 with 4xAA+HDR, with visual-enhancing mods to boot, is doing very, very well. I wouldn't want to bump up the resolution any higher than I have it, though, since--like you say--it probably isn't going to give me that great of a gaming experience.

Do you think that 19x12 is the "default" resolution that one needs in order to have a decent gaming experience?
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Originally posted by: josh6079
Originally posted by: Crusader
I agree you can't compare different timedemos; however, STs FRAPs stands on it's own as pretty good evidence Oblivion HDR+AA is not really "playable" at 19X12 on a single X1900.

Anyway you guys want to spin this, the fact of the matter is HDR+AA is still likely limited to big bucks Crossfire rigs, and the issues with that far overshadow the benefit of HDR+AA at Oblivion and Far Cry.

I agree that one X1900XT(X) is probably not enough to have a decent gaming experience at 19x12 with max settings and 4xAA+HDR. However, I haven't seen many people who have a 24" monitor and a single X1900. I know that my single XTX at 1680x1050 with 4XAA+HDR with visual enhancing mods to boot is doing very, very well. I wouldn't want to bump up the resolution any more than what I have it at though, since--like you say--it probably isn't going to give me that great of gaming experience.

Do you think that 19x12 is the "default" resolution that one needs in order to have a decent gaming experience?

His gaming experience must really suck then :( "- 2005FPW"


 

Nightmare225

Golden Member
May 20, 2006
1,661
0
0
I own that monitor and I have no problems with gaming... Read Anand's own Article. They said it was superb for gaming. ;)