Originally posted by: BFG10K
Because nVidia has two modes of TrAA.
So now you're saying TRMSAA = AAA? LOL.
Riiiiight, because we all know resolution has no impact on IQ.:roll:
Sure it does, but not in the context of this thread and the entire point that I've had to reiterate over and over, yet some people still fail to understand something so simple.
Anyone that cares about testing GPUs properly.
What relevance does that resolution have to the average gamer? None.
Oh goodness, not this crap again. "The cards are equal, except for the settings most people don't use, but those don't count so the cards are equal! Haha! You lose!"
Again, resolution has nothing to do with the high IQ settings that I've talked about (TRSSAA/AAA and HQ AF). That's something you decided to throw in there along with midgrade AA/AF and decided to call it the highest IQ possible. Yeah, most people that don't own a $750 card don't use those settings. However, the ones that do pay for it might want to use them, don't ya think?
That's your opinion. What isn't opinion is that low/middling resolutions are generally useless for testing video cards, regardless of what most people can or can't do.
Again, why do you keep harping on resolution? If resolution is all that matters, why bother with AA/AF at all? After all, insanely high resolutions that most gamers don't have access to are the be-all and end-all of IQ according to you.
How the hell can a low resolution like 1280x1024 be "highest IQ settings possible"? And 30 FPS is a slideshow, especially if it's an average.
And that's your opinion. 30 fps is perfectly fine in CoD 2 and is far from a slideshow. What's the point of a higher resolution with midgrade quality settings? Why even bother with a $750 card then?
No it isn't. Your settings aren't even "highest IQ" and even those settings aren't playable.
More opinion.
What it means to most gamers is irrelevant. What it means to testing GPUs accurately is very relevant.
I see, so gamers that are actually going to buy this card and play at the resolution and IQ settings they want don't matter. What matters is that the card scales to a resolution most people can't use? Nice logic.
Your selective benchmarks simply show what you want to see. Look at any website and they'll back Anand's.
They have to be selective since there aren't very many benchmarks that use TRSSAA/AAA and HQ AF. I was very clear about this throughout the thread.
It seems to me your definition of "best quality IQ" is 4xAAA/4xTrAA, which is quite laughable given it isn't even the best AA possible, much less the best possible IQ. You're simply trolling and cherry-picking benchmarks while being blatantly inconsistent with your own standards.
It is the highest IQ setting you can use to compare the two cards, since nVidia lacks 6xTRSSAA. If I demanded the best IQ possible, nVidia would probably lose (performance-wise). If you're going to reply, at least use some logic and common sense. Going to go have dinner, and I'm sure I'll see more circular arguments from you that have no relevance to this thread.