This time, the x800s are sticking with 24-bit precision while DX9.0c will call for 32-bit. I believe the x800s will incur a performance overhead when raising their precision to the DX9.0c standard in applications using Microsoft's API. This would be similar to the GeForce FX series not adhering to the DX9 path (running 32-bit precision instead of the spec's 24-bit).
XP SP2 (due in the next few weeks) includes DX9.0c.
1. Will the x800s be disadvantaged?
2. Will my 5900XT's performance improve?
Webmal
Edit: Inserted the word "precision" - thanks vshah!
