Originally posted by: CP5670
For what it's worth, I have an X1900, but I doubt I'll ever use the HDR+AA feature. I tried it out in Far Cry once and found that it worked okay at 1280x960 but was too slow at any higher resolution. That's mostly down to Far Cry's HDR itself, which causes a large performance hit, particularly on ATI cards; the additional drop from AA is actually pretty minimal. I don't have Oblivion and have no plans to buy it, but from the benchmarks I've seen, I would probably end up running it at around 800x600 with neither HDR nor AA in order to get decent performance. If SCCT had supported HDR+AA, though, I probably would have used it there, as HDR and AA by themselves don't hurt performance much on the X1900 cards.
It's a little pointless to argue about what kind of performance is "playable," since pretty much any framerate is playable if you're used to it. Many years ago I used to play an old Mac 3D racing game called Vette, and the framerate I got was generally between 2 and 5 fps, but I didn't care one bit.

These days I rarely tolerate a minimum framerate under 50 fps before I drop some settings and see if it gets any better (I can play with less if I have to, but I look for rather more than merely playable performance when buying expensive video cards). It's certainly better to have the HDR+AA feature than not in any case, and obviously many people do find the Far Cry and Oblivion framerates quite acceptable.