Why don't Nvidia and ATi allow end users to force any RGBA and depth format in hardware?

Anarchist420

Diamond Member
Feb 13, 2010
I've really been wondering why they don't, since they used to (I remember ATi let the user select any depth buffer format the Rage 128 supported). It would make old games enjoyable again. For some older apps, a 24-bit Z-buffer isn't enough, especially since a lot of those games used a W-buffer or didn't require any particular depth buffer format (meaning a 32-bit floating-point Z-buffer would look better than a 24-bit fixed-point one). Letting the end user force an FP32 complementary Z-buffer at the driver level would work perfectly.
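To put a rough number on the precision argument, here's a small C sketch (the near/far plane values are made up, and the mapping is the standard D3D-style 0-to-1 projected depth) comparing the quantization error of a 24-bit fixed-point Z-buffer against storing complementary (1 - z) depth in FP32. The float version's error shrinks toward the far plane, which is exactly where fixed-point Z runs out of precision:

```c
/* Sketch only: compares depth quantization error for 24-bit fixed-point Z
   versus FP32 complementary (reversed) Z. Near/far planes are assumed
   example values, not from any particular game. */
#include <math.h>
#include <stdio.h>

int main(void) {
    const double n = 0.25, f = 10000.0;   /* assumed near/far planes */
    const double codes = 16777215.0;      /* 2^24 - 1 steps in a 24-bit Z */

    for (double z_eye = 1.0; z_eye <= 8192.0; z_eye *= 8.0) {
        /* D3D-style projected depth in [0,1], hyperbolic in eye-space z:
           0 at the near plane, 1 at the far plane. */
        double z = (f / (f - n)) * (1.0 - n / z_eye);

        /* 24-bit fixed point: snap to the nearest of 2^24 codes. */
        double err_fixed = fabs(round(z * codes) / codes - z);

        /* Complementary Z in FP32: store 1 - z, so float precision is
           densest exactly where projected depth bunches up (far away). */
        float  rev     = (float)(1.0 - z);
        double err_rev = fabs((double)rev - (1.0 - z));

        printf("eye z %8.1f  fixed-24 err %.2e  fp32 complementary err %.2e\n",
               z_eye, err_fixed, err_rev);
    }
    return 0;
}
```

Running it shows the fixed-point error staying flat at roughly 3e-8 while the FP32 complementary error keeps dropping as eye-space depth grows.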

AIDA says dithering is supported on my GTX 460, but 16-bit color still looks as bad as it ever did. If they allowed forcing R8G8B8A8, the banding would be gone from games that only support 16-bit color.
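For what it's worth, hiding that banding is all ordered dithering would need to do. A rough sketch, assuming a standard 4x4 Bayer matrix and a 5-bit channel as in R5G6B5 (a real image would index the matrix by both x and y):

```c
/* Sketch only: quantize a smooth 8-bit gradient to 5 bits, with and
   without a 4x4 ordered (Bayer) dither. Plain truncation produces
   8-code-wide bands; the dither trades them for fine noise that
   averages out to the right shade. */
#include <stdio.h>

static const int bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

int main(void) {
    for (int x = 0; x < 64; x++) {
        int v = x;                      /* slow 8-bit gradient */
        int plain = v >> 3;             /* truncate to 5 bits: banding */

        /* Threshold scaled to 0..7, i.e. one 5-bit step of 8 input
           codes. A 2-D image would use bayer4[y & 3][x & 3]. */
        int t = bayer4[0][x & 3] / 2;
        int dith = (v + t > 255) ? 31 : (v + t) >> 3;

        printf("in %3d  plain %2d  dithered %2d\n", v, plain, dith);
    }
    return 0;
}
```

Within each band the dithered output flips between the two nearest 5-bit codes, which is why dithered 16-bit output can look close to true-color from a normal viewing distance.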

They could always get around tech support issues by saying that those options aren't supported, couldn't they?
 

darckhart

Senior member
Jul 6, 2004
I have no good technical info to add, nor any good marketing speculation.

But I have noticed that these newer generations, say the ATI 4000 series and NV GTX 200 series and above, have seen a lot of... hm, how can I put it diplomatically... "streamlining" of features in order to put a... satisfactory... product out there for the masses.

Simply put, why bother expending the energy/resources/etc on features that will only add value for what amounts to a very small segment of the population?