This was a problem with older ATI cards, but it seems it was not fixed in the newer X1900 cards or by updates to HL2 and Catalyst.
See for yourself in these screenshots
http://www.hothardware.com/viewarticle.aspx?page=8&articleid=777&cid=2
If you open each of the standard shots individually and skip through them quickly, you're likely to notice a bit more detail in the shots taken with the GeForce 7800 GTX versus those taken with the Radeon using its standard angle-dependent anisotropic filtering mode, disregarding artifacts produced by the JPG compression.
The same seemed to be true when inspecting the 16x aniso images. Of course, image quality analysis is subjective by its nature, but based on these images, we think the GeForce 7800 GTX has the better image quality as it relates to anisotropic filtering when standard "optimized" aniso is used.
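One way to make this kind of eyeball comparison less subjective is to compute a per-pixel difference metric between two screenshots. A minimal sketch of the idea, using synthetic pixel data rather than the actual captures (a real comparison would load the PNG/JPG screenshots with an imaging library first):

```python
# Sketch: quantify the difference between two same-size screenshots with a
# mean-absolute-error metric instead of flipping between them by eye.
# Images here are flat lists of (r, g, b) tuples; the data is made up.

def mean_abs_diff(img_a, img_b):
    """Average per-channel absolute difference between two images.

    Returns 0.0 for identical images, up to 255.0 for maximally
    different ones.
    """
    assert len(img_a) == len(img_b), "images must have the same dimensions"
    total = sum(abs(ca - cb)
                for pa, pb in zip(img_a, img_b)
                for ca, cb in zip(pa, pb))
    return total / (len(img_a) * 3)

# Synthetic 2x2 example: a "reference" shot vs. one with slightly dimmer texels,
# standing in for a full-quality vs. optimized-aniso capture.
reference = [(200, 150, 100), (200, 150, 100), (50, 50, 50), (0, 0, 0)]
filtered  = [(190, 145, 100), (200, 150, 100), (50, 50, 50), (0, 0, 0)]
print(mean_abs_diff(reference, reference))  # 0.0
print(mean_abs_diff(reference, filtered))   # 1.25
```

A nonzero score between two cards' captures of the same scene at the same settings would at least confirm they are not rendering identical output, though it can't say which looks better.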
A possible explanation from Ratchet @ Rage3D
http://www.rage3d.com/board/showpost.php?p=1333964749&postcount=10
If ATI cards keep rendering at lower detail, it would rule out using HL2 as a benchmark and may explain their numbers on the Source engine.