Originally posted by: Cookie Monster
Originally posted by: Ackmed
Don't need 6x? Please.
8xAA is far from playable to me, in any sort of high res. There is a HUGE hit on performance. Shimmering cannot be fixed, only reduced, and at a cost in frames. HQ AF was not enabled; it's plainly obvious. You're now claiming that HQ AF is not better than normal AF?
More links that say 4xAA quality is virtually the same:
With 4X anti-aliasing enabled, the "jaggies" in the scene are cleaned up considerably, but it's extremely difficult to perceive any real differences in the images, even while zoomed, although the cables on the crane seem slightly more defined on the Radeon. We've also included a sample using ATI's 6X AA mode, which was the best overall. (NVIDIA doesn't support this mode.)
http://www.hothardware.com/viewarticle.aspx
That's you. Some people here, especially GX2 users, can utilise 8xS in some new games, and in almost all the old ones.
Link
They used 8xTSS on both HL2 and BF2. Not to mention they've used AA on GRAW, unlike the other cards benched.
I think 8xS is quite usable.
I said "to me." I don't try to speak for everyone, unlike some.
They used 8xAA on the fastest NV card, on two older games, to get 8xAA playable. HL2 plays well with virtually everything; BF2 does the same. I tried it with a 7800GTX, and it wasn't even close to playable. It still wasn't in SLI. They didn't use HQ for HL2, because they didn't get playable frames with it. They did use HQ for BF2. They also used 1600x1200. I've been at 1920x1200 for about a year and a half, which is much more demanding.
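Just to put numbers on that (plain arithmetic, not benchmark data):

    # Raw pixel counts per frame at each resolution
    px_1600 = 1600 * 1200    # 1,920,000 pixels
    px_1920 = 1920 * 1200    # 2,304,000 pixels
    print(f"{px_1920 / px_1600 - 1:.0%} more pixels")  # -> 20% more pixels

That's 20% more pixels to shade and sample every single frame, before AA multiplies the work further.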
8xAA on a 7900GT, or even a 7900GTX, would be much, much slower than with that GX2. If you notice, in the link you dropped they didn't use 8xAA one time with the 7900GTX. Why is that? Because they didn't think it was playable. And that's with a very high-end machine. Do you think people on a 7800GT and 2GB will find it playable in any sort of newer game?
Also, that is not AA in GRAW. It's "edge smoothing", just a nitpick. Their engine can't do AA, so they just blur everything. It sucks, and looks pretty bad to me.
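If anyone's curious what that kind of "edge smoothing" boils down to, here's a toy sketch in Python/NumPy (my own illustration of the general technique, not GRAW's actual code): find pixels with a sharp luminance jump to a neighbour, then blur just those. Note it never takes extra samples, which is why it smears detail away instead of resolving it the way real MSAA does.

    import numpy as np

    def edge_smooth(img, threshold=0.1):
        """Toy edge-smoothing pass (illustrative only).
        img: float array, shape (H, W, 3), values in [0, 1]."""
        lum = img.mean(axis=2)  # per-pixel luminance
        # Edge mask: large luminance jump to the right or below
        dx = np.abs(np.diff(lum, axis=1, append=lum[:, -1:]))
        dy = np.abs(np.diff(lum, axis=0, append=lum[-1:, :]))
        edges = (dx > threshold) | (dy > threshold)
        # 3x3 box blur of the whole frame
        blurred = sum(np.roll(np.roll(img, i, axis=0), j, axis=1)
                      for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0
        # Blend the blur in only where edges were detected
        out = img.copy()
        out[edges] = blurred[edges]
        return out

MSAA instead takes several coverage samples per pixel at polygon edges and resolves them, so it recovers edge information rather than throwing it away.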
What does all of this tell you? That NV's fastest card can only get usable 8xAA in two older games, and that's it. One with HQ drivers and one without, at 1600x1200. HardOCP also uses 8xAA when they can get it playable, and that's very few and far between. Again, this is on a top-end machine, with NV's fastest card. 8xAA very usable? That's up to the end users; from when I had the option, it was next to never.
That's not to say I don't like NV's 8xAA. I think it looks very, very good. It just takes far too large a performance hit for me to use in any sort of newer game. To say that NV doesn't "need" 6xAA like ATi, because they have 8xAA, is pretty shortsighted to me. The more options for the consumers, the better.
Originally posted by: Gstanfor
Well, if there truly is a difference, I'm sure ackmed ought to be able to produce a screenshot with HQAF in it for us (while keeping the other parameters the same!!!)
Sure, it's very easy to see. If you've ever used an ATi card, it's pretty easy to spot.
http://enthusiast.hardocp.com/image.htm...E1MzU5NDI5MGdxTGdRV2plUVFfM183X2wucG5n