NV and ATi both have successful launches this week


BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Wrong. As I have said many times before, different games need different fps.
By "different" you mean the games you use HDR+AA in compared to the games I use 8xS in?

I'm glad we cleared that up. :roll:

As I have said many times before, different games need different fps. I've said in Oblivion I don't need 60fps+.
In that case you'd have to admit that 8xS could be useful in modern games that don't need 60 FPS.

In an online twitch shooter such as Q4, 30fps just isn't good enough.
Show me where I said otherwise. For that matter show me where your original claim "8xAA is far from playable to me, in any sort of high res" had a reference to online twitch shooters and excluded single player games.

Yes, you did. You posted the link, with 10+ year old games getting playable frames at 8xAA.
I also posted links to three year old games (Jedi Academy, Call of Duty) and I also explained that I run 2004-2005 titles but they were not benchmarked at the time.

And your link also agrees with me that newer games don't do well with 8xAA for playable frames.
It depends on the game and the resolution. In general I wouldn't say 8xS is much more demanding than HDR+AA, if at all.

It's not bias, it's a different type of game. It's being realistic. I need more frames in online play than in solo play. That's pretty simple to understand.
No, you're shifting the goal-post whenever it suits you.

You made a blanket claim that 8xS wasn't usable, and when you were called out on your HDR+AA double standard you back-pedaled and started talking about online twitch shooters.

Furthermore you keep insinuating that a game needs to be ten years old to use 8xS when I've pointed out multiple times that I run titles as new as 2005 with said feature.

Additionally I don't recall you ever saying ATi's HDR+AA is unplayable in the past but I've seen you have a go at 8xS multiple times despite the performance hit being quite similar.

Like I said, double standards.

It's one thing to have a standard for playable framerate, but when you start chopping and changing that standard depending on whether it's ATi or nVidia is when I take issue.

Do you recall me saying that HDR+AA was playable for me at 1920x1200? Nope, you don't. Don't act as if I did. I also never said that HDR+AA was playable with a single card at my res.
This is a joke, right? You've been championing ATi's HDR+AA since it came out and claiming it's playable on single cards, never mind Crossfire. Don't make me waste time dredging up your quotes.

I mean just in this very thread you said:
Having XT's in CF, HDR+AA was playable for me in Oblivion. And many others here with even a single card.
If you disagree with the single card part it raises the question of why you mentioned it as evidence to back your claims.

Like I said, you chop and change (now you're doing it with others' definitions of playable) whenever it suits your agenda.

Again, don't put words in my mouth.
Where did I state I played Quake 4 online with 8xAA? Don't put words in my mouth.

I was merely disagreeing with your blanket claim that 8xS is not playable by providing examples of HDR+AA being unplayable according to your standards, examples you have never in the past described as too slow.

First off, the ability to do HDR+AA in Far Cry came out long after the game did. Cards were much faster.
Irrelevant; benches with two of ATi's current finest show it's unplayable according to your standards.

50fps is very close to being playable
Too bad your rig can't manage that, which means it's unplayable. So do we have an admission that HDR+AA in Far Cry is currently unplayable on the ATi platform, Ackmed?

BFG10K claimed that Q4 was playable at 8xAA (and I think 1600x1200). A review dropped (one that does "best playable" settings) showing Q4 was not playable to them at 8xAA and 1600x1200.
Single player is quite playable at that setting, unlike your false claim "8xAA is far from playable to me, in any sort of high res".

That depends on the game and res. *If* Oblivion had an online mod, 30fps would still be fine, as would most any slow-paced RPG I would guess. If Q4 had HDR+AA, I doubt I would use it, because any dip, stutter, or anything of the sort in frames at a bad time could be a death for you. I have no problem saying that frames are of a higher importance than IQ for fast-paced online games. Once again, that's my opinion.
Which brings me back to my original point that even modern games can be playable with 8xS, unlike your blanket claim they can't.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
That is why I'd like a save of that location so I can produce a screenshot representative of what an enthusiast with high-end hardware, concerned about IQ, would see on their own screen.
If you're referring to the HL2 railway track screenshot with the blur on the angled wall to the left then I can confirm I can replicate that as I've recently played through HL2 on my 7900 GTX.

It doesn't really concern me though, as it's the only example I've seen in the game, so it's a bit of an exaggeration to expect the whole game to be like that.

Being restricted to pure MSAA like you are on single ATi cards is a far bigger problem.
 

schneiderguy

Lifer
Jun 26, 2006
10,765
52
91
Originally posted by: Ackmed
blah blah blah

CS:Source is playable at 1280*1024 with 8xS AA. So is BF2. And that's on my 7600GT. A 7900GTX is twice the power of a 7600GT. To say 8xS is unplayable in modern games is ridiculous :confused:
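As a rough sanity check on the "twice the power" claim above, here's a back-of-envelope sketch. The pipeline counts and clocks are commonly cited specs recalled from memory, not figures from this thread, so treat the result as approximate:

```python
# Approximate pixel-shader throughput comparison of the two cards named above.
# Specs are commonly cited figures (assumed, not taken from this thread):
#   7600 GT:  12 pixel pipelines at ~560 MHz core
#   7900 GTX: 24 pixel pipelines at ~650 MHz core
cards = {
    "7600 GT": (12, 560),
    "7900 GTX": (24, 650),
}

# Crude proxy for shading power: pipelines * core clock.
throughput = {name: pipes * mhz for name, (pipes, mhz) in cards.items()}
ratio = throughput["7900 GTX"] / throughput["7600 GT"]
print(f"Approximate shader throughput ratio: {ratio:.2f}x")  # ~2.32x
```

By this crude measure the 7900 GTX lands at a bit over 2x, which is at least consistent with the claim for shader-bound workloads.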
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: BFG10K
That is why I'd like a save of that location so I can produce a screenshot representative of what an enthusiast with high-end hardware, concerned about IQ, would see on their own screen.
If you're referring to the HL2 railway track screenshot with the blur on the angled wall to the left then I can confirm I can replicate that as I've recently played through HL2 on my 7900 GTX.

It doesn't really concern me though, as it's the only example I've seen in the game, so it's a bit of an exaggeration to expect the whole game to be like that.

Being restricted to pure MSAA like you are on single ATi cards is a far bigger problem.

No, I'm not referring to that. I'm referring to the HardOCP screenshot Ackmed linked to.

The HL2 railtrack shot isn't exactly a common occurrence in games either. Granted, it's annoying that it occurs, but it isn't exactly the end of the world, and it's not as though all AF filtering performed by nvidia chips ends up looking like this, because it doesn't. It's just the fanatics cherry-picking to find differences that make their GPU seem better.

Like you, I'll take 8xS AA over slightly poor AF every time.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
I feel very sorry for those that have donated their life to nvidia. Must be very boring defending their default IQ. At least reading this repetitive quibble is mega boring. When nv fixes their drivers then this IQ stuff will be less talked about, but as long as they can't seem to get Microsoft certified with decent settings, IQ will be a serious weakness. Anyways I agree with the op, both companies have launched some nice cards and value to the consumer has greatly increased. :beer:
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Except that the only drivers available for the X1950XTX (the 6.9s) don't support the card by default.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Default IQ means absolutely nothing to me. To an average purchaser who doesn't know what a video control panel is, let alone what Anti-Aliasing or Anisotropic Filtering could possibly mean, it *may* be a bigger issue (if they are even aware of IQ in the first place). It's interesting to note that nvidia sells most of its cards and gains most of its profits in the average-purchaser segment, which tells you plenty about what Joe Average thinks about nvidia's IQ (as do his buying habits, nvidia vs ATi, in that segment)...

Oh, and let's not forget: by default, HQ AF is off for ATi...
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Must be very boring defending their default IQ.
Not at all - quite the opposite actually.

I'm one of the most vocal opponents of nVidia's default Quality mode and I've even written an article about it to highlight some of the issues.

While it's true High Quality still has a bit of shimmer in rare places, for the most part IQ is superb. Then we move on to nVidia's better TrAA and better AA modes on single cards, and the net effect is better image quality than the competition.

When nv fixes their drivers then this IQ stuff will be less talked about, but as long as they can't seem to get Microsoft certified with decent settings, IQ will be a serious weakness.
What are you talking about?

I was one of ATi's biggest driver supporters but not anymore. Many games don't work properly with ATi's AAA (unlike nVidia's TrAA which works perfectly in every Direct3D game I've tried). That and ATi's drivers have taken a dump on my X800XL over the last 5 revisions or so in terms of general compatibility problems.

In the past I was considering Crossfire but there's no way I'm going to drop so much cash for such sub-par drivers. I think an 8800GT should be a nice upgrade over my 7900GTX, and if I need more power I'll go SLI.
 

Ackmed

Diamond Member
Oct 1, 2003
8,486
529
126
Originally posted by: Gstanfor
Originally posted by: Ackmed
How are we supposed to know the exact location of HardOCP's pic? Why do you need to take one? They've supplied comparison pics. The facts are the facts: ATi's AF looks better. Get over it.

I have also said RPGs were not my bag. I got Oblivion because of all the hype. I played it for around 30 hours, and did enjoy it. Got over it, and got rid of it. It's not an excuse; I simply don't have the game. I'm sure you can find many others that do have it.

I don't believe that HardOCP's comparison picture accurately portrayed the capabilities of nvidia's AF system at settings most people would use in that game (the size of the screenshot for starters). That is why I'd like a save of that location so I can produce a screenshot representative of what an enthusiast with high-end hardware, concerned about IQ, would see on their own screen.

You (and a heck of a lot of other people on this forum and others) are WAY, WAY too willing to just blindly accept what a review or article tells you about a particular topic. I don't believe anything other than my own firsthand experiences. You might just as well be a flock of sheep: don't think for yourselves, just follow the shepherd / pied piper.


So you're claiming that I should trust you over HardOCP? They simply took a shot in the game, and cropped out the part that shows the difference the most.

It's not being a sheep to believe what is in a trusted review; it's how we learn. I guess you don't think the new Core Duo is better than anything AMD has to offer, because you haven't tested it yourself? You can't test everything yourself.

Originally posted by: BFG10K
Wrong. As I have said many times before, different games need different fps.
By "different" you mean the games you use HDR+AA in compared to the games I use 8xS in?

I'm glad we cleared that up. :roll:

By different, I mean different types of games. I don't know why you can't grasp it. Slower-paced games don't need as many frames as faster-paced games to me.


Originally posted by: BFG10K
As I have said many times before, different games need different fps. I've said in Oblivion I don't need 60fps+.
In that case you'd have to admit that 8xS could be useful in modern games that don't need 60 FPS.

Yep. I wouldn't have a problem with 8xAA in Oblivion, if I could get around 30fps.


Originally posted by: BFG10K
In an online twitch shooter such as Q4, 30fps just isn't good enough.
Show me where I said otherwise. For that matter show me where your original claim "8xAA is far from playable to me, in any sort of high res" had a reference to online twitch shooters and excluded single player games.

You said Q4 was playable with 8xAA at 1600x1200. The link dropped shows that's not the case. You have provided no numbers to support your claim. It doesn't have a reference to only online twitch shooters. I guess I just thought it was common sense that faster-paced games needed more frames.


Originally posted by: BFG10K
Yes, you did. You posted the link, with 10+ year old games getting playable frames at 8xAA.
I also posted links to three year old games (Jedi Academy, Call of Duty) and I also explained that I run 2004-2005 titles but they were not benchmarked at the time.

Yeah, and those are not newer games. At least I don't count them as newer. I have one game installed right now that's over 3 years old: Tribes. It's going on 9 years old, would benefit from 8xAA, and is playable with it. As I said many times before, older games to me benefit the most from 8xAA.


Originally posted by: BFG10K
And your link also agrees with me that newer games don't do well with 8xAA for playable frames.
It depends on the game and the resolution. In general I wouldn't say 8xS is much more demanding than HDR+AA, if at all.

Yes, it does depend. I would, but that's our opinion. I know of no data to back either up. HDR+2xAA at 1920x1200 I think would be much less demanding than 8xAA.
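For what it's worth, a back-of-envelope sketch can put rough numbers on this comparison. It assumes the common description of 8xS on GeForce 6/7 hardware (4x multisampling combined with 2x supersampling, so every pixel is shaded twice) and that plain MSAA, with or without HDR, shades each pixel once; both are assumptions from general knowledge of the era's hardware rather than anything stated in this thread, and bandwidth costs (which HDR inflates) are ignored:

```python
# Shaded pixels per frame for the modes being argued about.
# Assumptions (not from this thread):
#   - 8xS = 4x MSAA + 1x2 supersampling, so 2 shaded samples per pixel.
#   - MSAA (with or without FP16 HDR) shades each pixel once; HDR's extra
#     bandwidth cost is ignored here.

def shaded_samples(width, height, ss_factor):
    """Pixels actually run through the pixel shader per frame."""
    return width * height * ss_factor

modes = {
    "1920x1200, HDR+2xAA (MSAA only)": shaded_samples(1920, 1200, 1),
    "1600x1200, 8xS": shaded_samples(1600, 1200, 2),
    "1920x1440, 8xS": shaded_samples(1920, 1440, 2),
}

for name, samples in modes.items():
    print(f"{name}: {samples / 1e6:.2f}M shaded samples per frame")
```

Under these assumptions HDR+2xAA at 1920x1200 shades about 2.3M samples per frame versus 3.8M for 8xS at 1600x1200, which is at least directionally consistent with the claim above.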


Originally posted by: BFG10K
It's not bias, it's a different type of game. It's being realistic. I need more frames in online play than in solo play. That's pretty simple to understand.
No, you're shifting the goal-post whenever it suits you.

No, I just assumed most people would agree that a slower-paced game doesn't need as many frames. Perhaps I shouldn't have.

Originally posted by: BFG10K
You made a blanket claim that 8xS wasn't usable, and when you were called out on your HDR+AA double standard you back-pedaled and started talking about online twitch shooters.

Furthermore you keep insinuating that a game needs to be ten years old to use 8xS when I've pointed out multiple times that I run titles as new as 2005 with said feature.

Additionally I don't recall you ever saying ATi's HDR+AA is unplayable in the past but I've seen you have a go at 8xS multiple times despite the performance hit being quite similar.

Like I said, double standards.

It's one thing to have a standard for playable framerate, but when you start chopping and changing that standard depending on whether it's ATi or nVidia is when I take issue.

I said several times that it wasn't playable to me. Again, I don't use some sissy res. 1920x1200 is very demanding. It's hard enough to get even playable frames in today's demanding games. For FEAR, I couldn't even use AA at all. 8xAA would be like playing on an Etch A Sketch.

Actually, your link is what I went by, with games being old and using 8xAA. Don't like it? Test new games. Others have, and the games were not playable.

It's not a double standard for playable frames. For the gazillionth time, different types of games do not always need the same frames to me. I don't need 60fps in Oblivion.


Originally posted by: BFG10K
Do you recall me saying that HDR+AA was playable for me at 1920x1200? Nope, you don't. Don't act as if I did. I also never said that HDR+AA was playable with a single card at my res.
This is a joke, right? You've been championing ATi's HDR+AA since it came out and claiming it's playable on single cards, never mind Crossfire. Don't make me waste time dredging up your quotes.

Go ahead, feel free. I have said you can get playable frames, as have many others. I also had a CF setup when I said that. I have also said that people will probably have to lower settings to get them on single cards. The discussion was also about Oblivion, which as you should know by now doesn't need nearly as many fps as online shooters to be playable.


Originally posted by: BFG10K
I mean just in this very thread you said:
Having XT's in CF, HDR+AA was playable for me in Oblivion. And many others here with even a single card.
If you disagree with the single card part it raises the question of why you mentioned it as evidence to back your claims.

Like I said, you chop and change (now you're doing it with others' definitions of playable) whenever it suits your agenda.

I didn't say single cards got playable frames in Oblivion with HDR+AA; others have. They also have a lower res than I do. I can't speak for them.


Originally posted by: BFG10K
Again, don't put words in my mouth.
Where did I state I played Quake 4 online with 8xAA? Don't put words in my mouth.

Sure you did;
Originally posted by: BFG10K

8xAA is far from playable to me, in any sort of high res.
I have a single 7900 GTX and I run many games at the setting (probably more than 4xAA) at 1920x1440.

Even modern games like Doom 3 and Quake 4 run well provided you drop the resolution a bit to 1600x1200.

Needless to say it's a breeze for SLI.

As you can see, you said that you run 8xAA with Q4 at 1600x1200. The link dropped showed it wasn't playable. But again, "playable" is subjective. Even the GX2 didn't get playable frames for them.

Originally posted by: BFG10K
I was merely disagreeing with your blanket claim that 8xS is not playable by providing examples of HDR+AA being unplayable according to your standards, examples in the past you have not stated are too slow.

Once again, my standards are not that every game needs 60fps. Understand this, and it would cut down on half your posts. I've already said that if I was playing an online shooter such as Q4, and it supported HDR+AA, and I got only 30fps, it wouldn't be playable.

Originally posted by: BFG10K
First off, the ability to do HDR+AA in Far Cry came out long after the game did. Cards were much faster.
Irrelevant; benches with two of ATi's current finest show it's unplayable according to your standards.

When did I say it was playable at 1920x1200? Feel free to provide a link. Also, Far Cry has a lot of slow action.

Originally posted by: BFG10K
50fps is very close to being playable
Too bad your rig can't manage that, which means it's unplayable. So do we have an admission that HDR+AA in Far Cry is currently unplayable on the ATi platform, Ackmed?

Wow, you just don't get it. I don't speak for everyone. If someone has a CF setup, or even a single card, and plays at 1280x1024, or even 1600x1200, it's much faster than at 1920x1200. Obviously for some people it would be playable, and for others it wouldn't. Not to mention the game is several years old. And just like 8xAA... it's more playable on older games.

Originally posted by: BFG10K
BFG10K claimed that Q4 was playable at 8xAA (and I think 1600x1200). A review dropped (one that does "best playable" settings) showing Q4 was not playable to them at 8xAA and 1600x1200.
Single player is quite playable at that setting, unlike your false claim "8xAA is far from playable to me, in any sort of high res".

Reviews disagree with you. My claim is not false; it's my opinion. I don't like slideshows, I guess you do. What are your frames at 1600x1200 with 8xAA that you claim is playable? At 2xTRAA, 4xAF, and 1600x1200 the review dropped says it's about 54fps. I can't imagine 8xAA being even around 30fps.

Originally posted by: BFG10K
That depends on the game and res. *If* Oblivion had an online mod, 30fps would still be fine, as would most any slow-paced RPG I would guess. If Q4 had HDR+AA, I doubt I would use it, because any dip, stutter, or anything of the sort in frames at a bad time could be a death for you. I have no problem saying that frames are of a higher importance than IQ for fast-paced online games. Once again, that's my opinion.
Which brings me back to my original point that even modern games can be playable with 8xS, unlike your blanket claim they can't.

It's not a blanket statement; it's my opinion. Sure modern games can be playable, if you drop the res a lot, and depending on the game and person. I didn't say it was never playable. Are you saying that Fear, Oblivion, FL X, Tomb Raider Legend, BF2, etc. are all playable with 8xAA at 1920x1200? How about at 1600x1200? What about with TRAA added on top? I doubt that, and reviews agree with me. Reviews that use "best playable settings" agree with me that it's not playable.

You're obviously not going to agree with reviews that 8xAA isn't playable in new games at higher resolutions. This topic got trolled off topic, and I'm done responding to it. I don't like bumping old threads, but I did anyways. The thread was about both companies having a great launch, and they did. Both got cards out when they said, and in good supply. Prices are already dropping, and that's good for everyone.

 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
(Ackmed) I said several times that it wasn't playable to me. Again, I don't use some sissy res. 1920x1200 is very demanding. It's hard enough to get even playable frames in today's demanding games. For FEAR, I couldn't even use AA at all. 8xAA would be like playing on an Etch A Sketch.

That's ridiculous. True, at 1920x1440 with 8xS FEAR does slow down, but it's certainly orders of magnitude faster than an Etch A Sketch. Of course, I still have a good 19" CRT so I'm not forced into trying to get playable frame rates out of ridiculous resolutions. 1600x1200 is just fine for FEAR as far as I'm concerned.

The following is just a rough guide, since I was running MS Virtual PC fairly intensively at the same time, but you get the gist.

(Screenshots originally linked here: FEAR framerate results at 1920x1440, 1600x1200, and 1280x960, plus the poster's FEAR profile and in-game video settings/effects pages.)

About reviews, previews etc. on the 'net: they are a starting point, that's all, not to be taken literally. Common sense and personal experience tell you if something is obviously wrong or not (like HardOCP's Oblivion comparison pic).
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
The problem, Ackmed, is that you consider something playable only if it runs at your resolution or if an ATI product is running it. I'll compile a list that will serve as some cliff notes so that you can understand what has happened in your own thread:

- You originally said:
NV can't do it, and it sure can't do 8xAA in any sort of playable frames.

- You then claimed that 8xAA is only playable with one of Nvidia's most expensive setups:
What does all of this tell you? That NV's fastest card can only get usable 8xAA on two older games, and that's it.

- You then told us that having one of the most expensive ATI setups gives you playable frames with HDR+AA:
Having XT's in CF, HDR+AA was playable for me in Oblivion.

Both need dual GPUs in order to achieve what you consider playable, and both would need those two GPUs to be recent ones.

Since you haven't claimed that HDR+AA is playable with a single X1900 and I have, let me comment on how playable it is with one card at what you consider a "sissy" resolution of 1680x1050. Borderline. While for the most part it is playable, there are times when it dips down into unplayable frames (e.g. intense vegetation coupled with equally intensive fighting).

- You said that you yourself have a standard for borderline playability:
For me, 54fps is borderline for playable, if you're playing online.

- BFG showed a link showing how Crossfire X1950XTXs only received 51.6 FPS in Oblivion:
So I guess an X1950XT Crossfire with HDR + AA in Oblivion is unplayable since it only scores 51.6 FPS at your chosen resolution.

- You say that 51.6 FPS is more than enough for yourself since Oblivion isn't a twitch shooter:
For a slower game such as Oblivion, it's more than enough for me.

- BFG10K then linked a twitch shooter that was still under your definition of borderline playability using HDR+AA:
Just like HDR+AA can really hamper your game in a twitch shooter like Far Cry, yet I don't recall you ever complaining about it. In fact you're still claiming said feature is playable on single cards when it isn't even playable on X1950 Crossfire according to your standards.

Click.

50.4 FPS on a Crossfire X1950 system. I guess you'd have to admit HDR+AA is an unusable feature in this twitch shooter since not even two of ATi's fastest cards can run it acceptably?

- You then tried to disregard his evidence by saying that Far Cry wasn't a twitch shooter even though you never played it online yourself:
Far Cry is not a twitch shooter... I didn't play Far Cry's multi after it first came out...

- Next, you summarize your situation with cards all while praising ATI's HDR+AA ability even though you consider it to be too slow for what you currently have:
Right now, it probably would be too slow for me to use HDR+AA in Oblivion. I sold my master card, so I'm on a single X1900XT right now... ATi can do HDR+AA, where NV cannot.

Even when the feature of HDR+AA isn't playable, you state the obvious and say Nvidia can't do it, yet leave out the fact that ATI can't do 8xS. That's your bias speaking.

- You defend ATI's ability to utilize HDR+AA in a slow-paced game (Oblivion) but fault Nvidia for not being able to use 8xS in a fast-paced game (Q4). Then you say:
It's not a double standard for playable frames.
Not only that, but you attempt to say that another popular HDR+AA game is a slow-paced one when you said that Far Cry isn't a twitch shooter. Can we therefore assume you're saying that ATI's HDR+AA is only playable with some of the most expensive ATI setups, and in slow-paced games that support it, Ackmed? Nvidia can't do HDR+AA, but their products that are as expensive as your CF X1900s can do 8xS where ATI can't, slow-paced or not.

- You claim that you're not speaking for everyone in what you consider playable:
Once again, I'm not leading you to believe anything. I said several times, it's not playable for me.

- However, you consider something unplayable if it doesn't have a certain frame rate at your resolution of 1920x1200:
Even a link from someone else trying to say it was playable has only two games tested at 8x (best playable settings); one of those was with Q and not HQ settings. And it was at 1600x1200, not my res of 1920x1200.

- You continue to set the standard of playability according to how games perform at your resolution, claiming that all other resolutions are "sissy" when in fact they are simply logical:
Again, I don't use some sissy res. 1920x1200 is very demanding. It's hard enough to get even playable frames in today's demanding games.
I would agree, considering you had CF X1900s and only got playable, authentic HDR+AA in slow-paced games. This is why BFG10K said:
Needless to say it's a breeze for SLI.
when discussing 8xS and modern games.

- You periodically remind everyone that this thread isn't about 8xS vs. HDR+AA even though you are one of the contributors to that very argument.

- You claimed that this thread was about video card launches even though more of your own posts in it are about 8xS vs. HDR+AA.

- You mention how they are great launches despite ATI not providing drivers that work for the card that they launched.

You are biased, Ackmed, and that is alright. Yet when you can't rationalize your bias, you end up with threads like this. I hope you'll learn to think like an adult, let alone post like one. Your credibility is down the toilet, and now that you have been flip-flopping your standards to support either ATI or resolutions of 1920x1200, I would be surprised if anyone here ever takes your opinions/truths/claims seriously.

EDIT: Spelling/grammar
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
By different, I mean different types of games.
No, you don't. By different you mean whenever it suits your pro-ATi agenda.

Yep. I wouldn't have a problem with 8xAA in Oblivion, if I could get around 30fps.
But that wasn't your original claim "8xAA is far from playable to me, in any sort of high res".

You said Q4 was playable with 8xAA at 1600x1200.
Correct.

The link dropped shows that's not the case.
False. They may show that by your standards for online twitch shooters, but then the issue here is your shifting standard whenever it suits you.

That and you weren't originally talking about online twitch shooters to begin with; you've simply added that in because you were caught out.

You have provided no numbers to support your claim.
In actual gameplay the framerate is usually above 40 FPS at that setting, which is okay for single player. It's not great but it's worth it in exchange for getting super-sampling in OpenGL games.

Generally speaking IQ has a higher priority for me than framerate in single player games, especially when it comes to AA.

Yeah, and those are not newer games.
Err, Quake 4 is a 2005 title. Doom 3, Bloodlines and HL2 are 2004 titles.

HDR+2xAA at 1920x1200 I think would be much less demanding than 8xAA.
Probably, given 2xAA is practically useless and offers a negligible increase in image quality. Nice goal-post shift again though, picking the lowest AA level just to prove your point.

Again, I don't use some sissy res. 1920x1200 is very demanding.
But 1920x1200 is a sissy res compared to the likes of 2560x1600. Heck your resolution is sissy compared to my best one of 1920x1440.

So when did we decide your resolution would be the standard? Oh that's right, when you purchased your LCD and were locked into one resolution due to inferior technology.
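The raw pixel counts behind this resolution argument are easy to check. All else being equal, fill-rate and shading demand scale roughly with pixels per frame; that's a simplification that ignores CPU limits and memory bandwidth, so treat it as a rough guide:

```python
# Pixels per frame for the resolutions mentioned in this thread, relative
# to Ackmed's 1920x1200. GPU demand scales roughly linearly with pixel
# count when a game is GPU-limited.
resolutions = {
    "1280x1024": 1280 * 1024,
    "1600x1200": 1600 * 1200,
    "1680x1050": 1680 * 1050,
    "1920x1200": 1920 * 1200,
    "1920x1440": 1920 * 1440,
    "2560x1600": 2560 * 1600,
}

base = resolutions["1920x1200"]
for res, pixels in resolutions.items():
    print(f"{res}: {pixels / 1e6:.2f} MP ({pixels / base:.2f}x vs 1920x1200)")
```

By this measure 1920x1440 works out to about 1.2x the pixels of 1920x1200, and 2560x1600 to about 1.8x.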

I have said you can get playable frames, as have many others.
So in other words you've flip-flopped again, because the topic is HDR+AA. It's really quite hilarious how you think your reasoning is legit and flawless.

I didn't say single cards got playable frames in Oblivion with HDR+AA; others have
But you were only too happy to quote them. Yet when others talked about 8xS being playable, you pointed out that your opinion overruled theirs.

So which is it, Ackmed? Does your opinion of playable overrule others' or not? Or does it only overrule them when the issue is 8xS?

Sure you did;
What the hell are you talking about? Please highlight the bit where I said "online" like you claimed. Go ahead - I don't edit my posts like you.

Put up or retract that lie. Thanks.

Once again, my standards are not that every game needs 60fps. Understand this, and it would cut down on half your posts. I've already said that if I was playing an online shooter such as Q4, and it supported HDR+AA, and I got only 30fps, it wouldn't be playable.
But you weren't talking about online twitch shooters to begin with; again that's just a concoction you've added in after you were caught out. Furthermore you attempted to pin online shooters on me when I never stated I played at those settings online.

When did I say it was playable at 1920x1200? Feel free to provide a link. Also, Far Cry has a lot of slow action.
Ah, I see. Now you're changing the definition of twitch shooter to suit your agenda. I'm sure if Quake 4 had HDR support it too would mysteriously cease to be a twitch shooter. :roll:

Wow, you just don't get it. I don't speak for everyone. If someone has a CF setup, or even a single card, and plays at 1280x1024, or even 1600x1200, it's much faster than at 1920x1200
Yet I don't see you calling their resolutions sissy - I see you quoting their comments and using them to back your HDR+AA claims of single card playability.

What are your frames with 1600x1200, 8xAA that you claim is playable?
Like I said in actual gameplay the framerate is generally above 40 FPS.

Furthermore you should know better than to blindly accept reviewers' numbers as the end-all performance metric since they seldom know how to tweak anything.