Originally posted by: pulsedrive
Originally posted by: M0RPH
Neither one. You should wait for ATI's R520. The 7800GT has image quality problems that nobody should be willing to accept from a $400 card.
Uhm, yeah, either back up your claims, or STFU.
I was getting tired of pointing out the links, since nobody around here seems to have the attention span to read any of the discussions anyways. If you actually read this forum you would have seen it by now, but here ya go:
Beyond3D
HardForum
AnandTech
3DCenter
Conclusion
ATI, with its Radeon X800, shows that even with its "optimizations" activated (meaning reduced quality), there are no shimmering textures. While full trilinear filtering is not used, this is not quickly noticeable. Even though ATI's texture filtering hardware does not compute as precisely as a GeForce's, the overall image quality is better, because there are not as many questionable "optimizations." Angle dependency under AF, however, should no longer be considered acceptable in a modern graphics card; ATI's advertising talk of "High Definition" gaming can thus be seen as an unfulfilled promise straight from the marketing department. At least ATI demonstrates that the scene in the video does not have to show texture shimmering.
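For anyone unsure what the "no full trilinear filtering" remark refers to, here is a quick toy model (my own sketch, not from the article): full trilinear blends continuously between two adjacent mip levels, while the "optimized" variant only blends inside a narrow band around the mip transition. The 0.25 band width is an assumption picked for illustration, not a measured driver value.

```python
# Toy model of full trilinear vs. reduced ("brilinear") mip blending.
# The 0.25 band width is an illustration-only assumption.

def trilinear_weight(lod):
    """Full trilinear: the blend weight between the two adjacent mip
    levels is simply the fractional part of the LOD."""
    return lod - int(lod)

def brilinear_weight(lod, band=0.25):
    """Reduced filtering: blend only inside a narrow band around the mip
    transition; everywhere else a single (bilinear) level is used."""
    f = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if f <= lo:
        return 0.0             # finer mip level only
    if f >= hi:
        return 1.0             # coarser mip level only
    return (f - lo) / band     # short blend across the transition

for lod in (3.1, 3.4, 3.5, 3.6, 3.9):
    print(lod, round(trilinear_weight(lod), 2), round(brilinear_weight(lod), 2))
```

The narrow blend band is cheaper because most pixels read only one mip level, and when it is done carefully it is hard to spot, which is the article's point about ATI's variant.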
Nvidia, with its current 7800 series, offers graphics cards that cannot be recommended to lovers of texture quality, even though texel performance has increased by a factor of 2.5 compared to the GeForce FX 5800 Ultra! On top of the angle dependency (inspired by ATI's R300), there is now a tendency toward texture shimmering. The GeForce 6800 (or GeForce 6600) has to be set to "High Quality" to avoid texture shimmering as far as possible. On the 7800 this seems to be useless: even in "High Quality" the new chip tends to shimmer. The old Nvidia GeForce FX, by contrast, shows nearly perfect textures.
The quoted passages from Nvidia's Reviewer's Guide are easily disproved. That means all benchmarks run at default settings, whether GeForce 7800 or 6800, against a Radeon are wrong:
Nvidia currently offers by far the worse AF quality. The Radeon's default settings deliver better image quality than the 6800's defaults, while the 7800's defaults are worse still. Therefore the so-called "performance" should not be compared either. One should also not compare 7800 default vs. 6800 default, or 7800 HQ vs. 6800 HQ, since the 7800's texture quality is lower. Real "performance" includes on-screen image quality. (Otherwise, why bother benchmarking with AF at all? Should an AF implementation merely have minimal impact on rendering speed, or should it result in greatly improved textures?)
What good is 16x AF if you only get at most 2x AF at certain angles and are exposed to texture shimmering while other cards provide flicker-free textures? All benchmarks using the default settings of NV40 and G70 against the Radeon are invalid, because the Nvidia cards apply general undersampling, which can (and does) result in texture shimmering. We know and love the GeForce 7 for its Transparency Antialiasing and its high-performance implementation of SM3 features, but the GeForce 7 series cannot be configured to deliver flicker-free AF textures, while the GeForce 6 series and the Radeon series can (of course) render flicker-free AF quality.
If new driver versions change anything, we will try to keep our readers up to date.
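To make the angle-dependency and undersampling argument concrete, here is a rough Python toy model (my own sketch, not 3DCenter's methodology). The clamp curve, the 0.875 factor, and the 1.5x undersampling factor are made-up illustration values, not measured hardware behaviour; the point is only to show how "16x AF" can collapse to 2x at some angles and how too few samples leads to shimmering.

```python
import math

# Toy model of angle-dependent AF clamping and undersampling.
# The clamp curve and the 1.5x undersampling factor are assumptions
# chosen to mimic the commonly reported behaviour, not driver data.

MAX_AF = 16  # user-selected maximum anisotropy

def ideal_anisotropy(major_axis, minor_axis):
    """Ratio of the pixel footprint's axes in texture space; roughly how
    many samples a full AF implementation should take along the major axis."""
    return min(MAX_AF, major_axis / max(minor_axis, 1e-6))

def angle_clamped_af(requested_af, surface_angle_deg):
    """Hypothetical angle 'optimization': full AF near 0 and 90 degrees,
    clamped down toward 2x near 45 degrees."""
    off_axis = abs(math.sin(math.radians(2 * surface_angle_deg)))  # 0 at 0/90, 1 at 45
    allowed = MAX_AF * (1.0 - 0.875 * off_axis)                    # 16x -> 2x at 45 deg
    return max(2.0, min(requested_af, allowed))

def samples_taken(af_degree, undersampling_factor=1.0):
    """Taking fewer samples than the footprint calls for leaves texture
    frequencies above Nyquist, which shows up as shimmering in motion."""
    return max(1, round(af_degree / undersampling_factor))

# A steep surface tilted 45 degrees away from the texture axes:
ideal = ideal_anisotropy(major_axis=16.0, minor_axis=1.0)    # wants 16x
clamped = angle_clamped_af(ideal, surface_angle_deg=45.0)    # allowed only 2x
print(ideal, clamped, samples_taken(clamped, undersampling_factor=1.5))
```

In this toy model a surface that "wants" 16x AF is allowed only 2x at 45 degrees, and with additional undersampling it ends up with a single sample; that is exactly the kind of situation where textures start to crawl and shimmer the moment the camera moves, which is what the review settings hide.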