5150Joker
Diamond Member
Originally posted by: Wreckage
http://www.tech-hounds.com/review18/ReviewsComplete.html
It boils down to this. ATI's adaptive antialiasing is somewhat of a mixed bag. In Performance mode, the image quality it offers is better than NVIDIA's under the same setting. However, this is not the preferred mode of use. In Quality mode, which corresponds to supersampling on the GeForce 7 series, the Radeon's adaptive antialiasing is not as effective. It only works well on objects that are near the camera. There is still very noticeable aliasing (more so in motion) on faraway objects with transparent textures. Gamers looking for the best image quality in games with transparent textures should be happier with the GeForce 7 than the Radeon X1900.
There is virtually no discernible difference between the majority of the screenshots they posted! Bit-tech's review with updated Catalyst drivers comes to the opposite conclusion of this website:
Essentially, both ATI and NVIDIA are doing the same thing and, not surprisingly, the final result is virtually the same too. In a blind taste test, you're not going to be able to tell the difference between the two supersampled alpha-tested texture anti-aliasing techniques. That's good to hear, because we feel fairly confident in saying that if you've got a choice between two video cards and they're both playable with supersampled transparency/adaptive anti-aliasing, you can't go wrong, as either card is more than likely to satisfy your needs.
Then there's HardOCP's examination (older review with dated drivers):
In the third screenshot, we have a comparison with some vegetation and a tree that have alpha-tested textures. With no AA, the tree branches and vegetation are very jagged. Adaptive AA really cleans up the image, producing a very clean picture with 6X Adaptive AA. The GeForce 7800 GT also improves the image quality with a very clean picture with 8xS TR SSAA. However, as we have seen above, the Radeon X1800 XL seems to edge out the 7800 GT slightly by producing better image quality at default settings using anti-aliasing.
Source: http://www.hardocp.com/article.html?art=ODIyLDgsLGhlbnRodXNpYXN0
Then look at tech-hounds's clueless conclusion:
ATI's adaptive antialiasing is not as effective as NVIDIA's transparency anti-aliasing. You don't really gain that much image quality improvement, even with the Quality setting.
Gee, who to believe? An article from a no-name site that provides screenshots showing hardly any difference between the filtering modes and claims quality AAA makes very little difference over performance AAA (they must be blind), or a respected site like bit-tech? Are you sure you're not the one who wrote that article? Nobody with a pair of eyes would agree with its laughable conclusion.
