I've really been wondering why they don't, since they used to (I remember ATi letting the user select any depth buffer format the Rage 128 supported). It would make old games enjoyable again. For some older apps, a 24-bit Z-buffer isn't enough, especially since a lot of those games used a W-buffer or didn't require any particular depth buffer format (meaning a 32-bit floating-point Z-buffer would look better than a 24-bit fixed-point one). Letting the end user force an FP32 complementary Z-buffer at the driver level would work perfectly.
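To put some numbers on that, here's a rough sketch of the precision argument (just the math, nothing driver-specific; the clip planes are arbitrary example values). Complementary Z flips the depth range so distant geometry lands near 0.0, where float32 values are densest:

```python
import numpy as np

near, far = 0.1, 10000.0  # arbitrary example clip planes

def ndc_depth(z_eye):
    # D3D-style projection: maps z_eye = near -> 0, z_eye = far -> 1
    return (far * (z_eye - near)) / (z_eye * (far - near))

def complementary_depth(z_eye):
    # complementary Z flips the range: near -> 1, far -> 0,
    # so distant geometry ends up near 0.0, where floats are densest
    return 1.0 - ndc_depth(z_eye)

# a 24-bit fixed-point buffer has the same step size everywhere
fixed24_step = 1.0 / (2**24 - 1)

# step size (ULP) of a float32 buffer at a point near the far plane,
# which is where Z-fighting is usually worst
z = 9999.0
fp32_step = float(np.spacing(np.float32(complementary_depth(z))))

print(f"24-bit fixed step:            {fixed24_step:.3e}")
print(f"FP32 complementary step here: {fp32_step:.3e}")
```

The FP32 complementary step near the far plane comes out many orders of magnitude smaller than the fixed-point step, which is exactly why forcing that format would help those old games.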
AIDA says that dithering is supported on my GTX460, but 16-bit color still looks as bad as it ever did. If they allowed forcing R8G8B8A8, the banding problems would be gone from games that only support 16-bit color.
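For what it's worth, a toy quantizer shows why dithering matters here (illustrative NumPy only, nothing driver-specific; real hardware would use an ordered or blue-noise pattern rather than plain random noise):

```python
import numpy as np

# a smooth 0..1 gradient, 256 samples across
grad = np.linspace(0.0, 1.0, 256)

def quantize(x, bits):
    levels = 2**bits - 1
    return np.round(x * levels) / levels

# plain 5-bit quantization (like the red channel of R5G6B5):
# the gradient collapses into visible bands
banded = quantize(grad, 5)

# add noise of about one quantization step before rounding; the bands
# break up into noise the eye averages out
rng = np.random.default_rng(0)
dither = (rng.random(grad.shape) - 0.5) / (2**5 - 1)
dithered = quantize(grad + dither, 5)

print(len(np.unique(banded)))  # 32 distinct output levels either way
```

Both outputs still only have 32 levels per channel, which is the point: dithering hides banding but can't add precision, whereas forcing an 8-bit-per-channel buffer actually removes it.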
They could always get around tech support issues by saying that those options aren't supported, couldn't they?