We do enable 8xAF in most tests these days. A lot of games fold AF into
options like "texture filtering" in Battlefield 2 or the "high quality"
mode in Doom 3/Quake 4. In games with an in-game setting, we use that
setting at whatever level the developer chose as their highest -- which
is often 8x but sometimes higher. In fact, there isn't a game we test in
which we enable AF in the driver anymore. If we have to pick the mode
explicitly, we select 8xAF.
Aside from the fact that we've been using 8xAF for a while, I tend to
prefer it when gaming. With 16xAF, the angular dependency becomes much
more apparent in most implementations (ATI's high quality AF mode
definitely looks better, and we hope NVIDIA will follow suit).
Which brings us to the last question ... we are trying to compare apples
to apples as much as possible. At their highest image quality modes, ATI
and NVIDIA don't produce the same image. For example, it's possible to
get better AF out of ATI's X1000 series, but it's possible to get better
AA with NVIDIA's transparency AA than with ATI's adaptive AA.
That being said, you've inspired me to work on an article that compares
the highest quality ATI modes to the highest quality NVIDIA modes in
both image quality and performance. It may take a while to get it done,
but I'll try to get it out before the end of the month.
For most other articles, we'll still stick to no AA/no AF and 4xAA/8xAF.
Midrange and low-end users won't be able to get much out of the higher
quality features anyway, so looking at them is really most useful on the
high end.
Thanks,
Derek Wilson