This came from techreport, but is posted on page 18 of the Beyond3D thread regarding this topic.
-------------------------------------------------------------------------------------------------------------------------
Damn, those translated articles are tough to read, but here's my interpretation (someone correct me if I'm wrong).
1) Using UT2k3, they compared the X800 Pro vs the 9800 XT.
1280x1024 4xAA 16xAF:
Radeon X800 pro 62.26 fps
Radeon 9800 XT 39.89 fps
2) Then they underclocked the X800 to 9800 XT speeds:
1280x1024x32 4xAA 16xAF:
X800 @ 9800XT 49.68 fps
Radeon 9800 XT 39.89 fps
3) Then they used coloured mipmaps:
R360 vs. R420 "speciale" firstcoloredmip 1
1280x1024x32 4xAA 16xAF:
X800 @ 9800 XT ColorMips 1 39.59 fps
9800 XT ColorMips 1 39.67 fps
The contention is that regardless of what setting is selected in the control panel or application, the drivers use 'brilinear'. Only when the driver detects colored mipmaps does it use full trilinear, and then it takes a roughly 20% performance hit, making it the same speed as a 9800 XT clock-for-clock.
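Just as a sanity check on that 20% figure, here's some quick arithmetic using only the numbers quoted above (all figures are from the translated computerbase.de results as reported, not anything I've measured myself):

# Back-of-the-envelope check of the ~20% claim, Python-style.
# All fps values are the computerbase.de numbers quoted above.

x800_downclocked_brilinear = 49.68   # X800 @ 9800 XT clocks, default (brilinear) filtering
x800_downclocked_trilinear = 39.59   # same card once colored mipmaps force trilinear
r9800xt_trilinear          = 39.67   # Radeon 9800 XT with colored mipmaps (trilinear)

# Performance lost when the driver is forced into real trilinear
hit = 1 - x800_downclocked_trilinear / x800_downclocked_brilinear
print(f"Trilinear performance hit: {hit:.1%}")         # ~20.3%

# Clock-for-clock, forced trilinear puts the X800 right on top of the 9800 XT
gap = x800_downclocked_trilinear / r9800xt_trilinear - 1
print(f"X800 vs 9800 XT, both trilinear: {gap:+.1%}")  # ~-0.2%

So the quoted numbers do line up with the claim: drop the shortcut and the downclocked X800 lands within a fraction of a percent of the 9800 XT.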
The differences in image quality may be subtle, but now you have to be careful when comparing benchmarks between different architectures. For example, if a 6800U and X800 show similar performance scaling, but the X800 suddenly jumps ahead when AF is turned on, is it a fair comparison if the 6800U is using trilinear while the X800 is using brilinear?
[edit] I personally think this issue is looking a little shady (though more info may be forthcoming).
If you select trilinear, you should get trilinear.
If ATI has come up with some whizz-bang 'brilinear' method, then fantastic; but put another checkbox in the control panel that users can explicitly enable.
Is trilinear a poor performer and you don't want people to use it in the current drivers? That's fine; grey out the option for trilinear in the current drivers and enable it when it's ready.
------------------------------------------------------------------------------------------------------------------------
We saw it first with the garbage that was spewing out of Driver Heaven a couple of weeks ago against Nvidia, and now we are seeing more pointless garbage pointed towards ATI by computerbase.de.
This is not pointless garbage, especially when you're considering dropping 400-500 dollars on a card that doesn't perform like they claim it does. Findings like this are important for the consumer, whichever vendor they paint in a bad light. Knowing the facts not only makes you wiser, but better at choosing the hardware that will give you the best bang for your buck. If you think it's stupid, then they should just stop benchmarking/testing/reviewing/uncovering everything on the internet and let consumers "guess" what is best for them.