Except I didn't.
You even said that it's not fair to compare nVidia's 8X AA to ATi's 6X AA, because the quality of ATi's is better anyway.
Well, that's true to some degree, I guess, but if both are set to their maximum then that's as fair as the comparison is going to get, much like AF.
Actually, what I really object to is AF IQ comparisons done when the ATi card is left on 8x because nVidia can't go higher. If you want to do IQ comparisons then go as high as each vendor allows instead of crippling one on account of the other.
You used to be pimping the nV25 stuff back in the day as well,
That's because back in the day it was pretty much the best card around. I have no problem admitting to that.
LOL, you haven't changed in years, dude.
What, you mean praising hardware when it deserves praise and pointing out inferiorities when it doesn't? No, I guess I haven't changed that much.
Sure but that's not the point.
Actually it is, because if it wasn't I wouldn't have been looking at shader performance. Or what, do you expect me to argue something and then use totally unrelated evidence to back it up? Sorry, but that's called being illogical.
The point is that you have a very selective set of criteria when it comes to this stuff.
The criteria depend on what I'm trying to argue.
You consider 10X7 with no AA/AF a pointless setting for comparing performance because no one would want to look at "butt ugly settings" like that.
I consider that setting invalid in the majority of games when comparing top-of-the-line $400 cards, yes (unless you're talking about very stressful new games like Halo). Using such settings on those cards and then proclaiming their equality is totally invalid, because they're only "equal" due to CPU limitations, which means you aren't comparing the cards at all.
Or shall I go back to using 320 x 240 and proclaim a GeForce3 is as good as your 5800? Because that's exactly the kind of logic you use.
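To make the CPU-limitation point concrete, here's a toy sketch in Python (the frame times are completely made up, but the principle isn't): a frame can't finish faster than the slower of the CPU's and the GPU's work, so once the CPU is the bottleneck, every card posts the same score.

def fps(cpu_ms, gpu_ms):
    # A frame has to wait for whichever of the CPU or GPU is slower.
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 10.0  # say the CPU needs 10 ms per frame, regardless of resolution

# At 10X7 with no AA/AF, both (hypothetical) cards outrun the CPU,
# so they post identical scores -- you're benchmarking the CPU:
print(fps(cpu_ms, gpu_ms=4.0))   # card A: 100.0 fps
print(fps(cpu_ms, gpu_ms=6.0))   # card B: 100.0 fps

# At 16X12 with AA/AF, the GPU becomes the bottleneck and the
# real difference between the cards finally shows up:
print(fps(cpu_ms, gpu_ms=16.0))  # card A: 62.5 fps
print(fps(cpu_ms, gpu_ms=24.0))  # card B: ~41.7 fps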
Unless of course that's all your beloved ATI could run; then it's just fine.
If that were all ATi could run, then I'd be the first person to run to the store and pick up something better.
The point is you selectively cull minute differences and play them up like deal breakers: "nVidia's AA plainly has 3 more jaggies when you look at this magnified screenshot!" or "Well, nVidia may lead at 10X7 and 12X10 4X8X, but the only benchmark that matters is 16X12, and ATI leads by 5 whole fps there!" etc.
The point is that you ignore all differences and proclaim equality when there is none. You also use invalid scenarios to back your claims.
LOL, you still don't get this. Some people just like to try different hardware, BFG. Believe it or not, something doesn't have to be the very best to be worthwhile. (gasp)
I understand that you like trying hardware, Rollo, and I have no problem at all with that. I take issue with your "equality" and "no difference" arguments.
I can link to reviews where the 5950 is the faster card,
No doubt using old-school games running at the VGA settings you like to run your $400 boards at. Again, I could show you 320 x 240 benchmarks of a GF3 equalling a 5800; does that mean the former card is as good as the latter?
Of course, the games can't have any shaders either, because we know you've magically deemed all shader games invalid.
To sum up your stance: "There is no difference between ATi and nVidia, except when ATi is better, and when that happens, it doesn't count."