40-bit fixed-point logarithmic z-buffer with (3) stencil buffers: questions

Anarchist420

Diamond Member
Feb 13, 2010
This isn't looking at the big picture, but still... why don't they just do away with all of the other depth formats and go with 40 bits of fixed-point precision for the z-buffer, make it logarithmic, and pair it with a 24-bit stencil buffer that could effectively act as three independent 8-bit stencil buffers?
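A minimal sketch of how such a buffer could be packed in software, assuming one 64-bit word per pixel with the 40-bit logarithmic depth in the high bits and the 24-bit stencil split into three independent 8-bit planes below it; the word layout, the near/far planes, and the exact log mapping are illustrative assumptions, not an existing hardware format.

```c
/* A rough sketch, not an existing hardware format: one way a 40-bit
 * fixed-point logarithmic depth value plus a 24-bit stencil split into
 * three independent 8-bit planes could be packed into a 64-bit word.
 * The word layout, near/far planes and log mapping are all assumptions. */
#include <stdint.h>
#include <stdio.h>
#include <math.h>

#define DEPTH_BITS 40
#define DEPTH_MAX  ((1ULL << DEPTH_BITS) - 1)

/* Map eye-space depth z in [znear, zfar] (both > 0) to a 40-bit code:
 * code = round(DEPTH_MAX * log(z/znear) / log(zfar/znear)).            */
uint64_t encode_log_depth40(double z, double znear, double zfar)
{
    double t = log(z / znear) / log(zfar / znear);  /* 0 at near, 1 at far */
    if (t < 0.0) t = 0.0;
    if (t > 1.0) t = 1.0;
    return (uint64_t)(t * (double)DEPTH_MAX + 0.5);
}

/* Pack depth and three 8-bit stencil planes: [ depth:40 | s2:8 | s1:8 | s0:8 ]. */
uint64_t pack_depth_stencil(uint64_t depth40, uint8_t s0, uint8_t s1, uint8_t s2)
{
    return (depth40 << 24) | ((uint64_t)s2 << 16) | ((uint64_t)s1 << 8) | s0;
}

int main(void)
{
    uint64_t d = encode_log_depth40(1234.5, 0.1, 100000.0);
    uint64_t word = pack_depth_stencil(d, 0x01, 0x80, 0xFF);
    printf("depth code = %llu, packed word = 0x%016llx\n",
           (unsigned long long)d, (unsigned long long)word);
    return 0;
}
```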

I know that early z occlusion culling wouldn't work with it, but what I described would provide more precision than any other mode. It would also save transistors, because the early-z hardware wouldn't need to be there and floating-point z-buffer modes wouldn't be required.
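To put a rough number on the precision claim, here is a small back-of-the-envelope comparison of world-space depth resolution per code step for a conventional 24-bit fixed-point hyperbolic (1/z-style) buffer versus a 40-bit fixed-point logarithmic buffer; the near/far planes and the sample depth are arbitrary assumptions chosen only for illustration.

```c
/* A back-of-the-envelope check on the precision claim, under assumed
 * near/far planes and a sample depth: world units covered by one depth
 * code step, for a conventional 24-bit fixed-point hyperbolic (1/z-style)
 * buffer versus a 40-bit fixed-point logarithmic buffer. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double znear = 0.1, zfar = 100000.0, z = 50000.0;  /* assumptions */
    const double codes24 = (double)((1u   << 24) - 1);
    const double codes40 = (double)((1ULL << 40) - 1);

    /* Hyperbolic mapping d(z) = zfar/(zfar-znear) * (1 - znear/z), so one
     * code step covers (zfar-znear) * z^2 / (codes * zfar * znear) units. */
    double step_hyp24 = (zfar - znear) * z * z / (codes24 * zfar * znear);

    /* Logarithmic mapping d(z) = log(z/znear) / log(zfar/znear), so one
     * code step covers z * log(zfar/znear) / codes units.                */
    double step_log40 = z * log(zfar / znear) / codes40;

    printf("24-bit 1/z buffer: %.3g world units per step at z = %g\n",
           step_hyp24, z);
    printf("40-bit log buffer: %.3g world units per step at z = %g\n",
           step_log40, z);
    return 0;
}
```

With those assumed planes, at z = 50,000 the 24-bit 1/z buffer resolves depth in steps of roughly 1,500 world units, while the 40-bit logarithmic buffer resolves well under a millionth of a unit.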

For the color buffer, they could make it RGBA20FP and emulate the other formats on general-purpose cores with extended (or double-extended) floating-point precision, in the cases where emulation would even be necessary.
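RGBA20FP isn't a standard format, so the sketch below simply assumes a hypothetical 20-bit-per-channel float made from the top 20 bits of an IEEE 754 binary32 value (sign, 8-bit exponent, 11-bit mantissa, truncated toward zero); emulating wider formats would then mean expanding back to binary32 or beyond on the general-purpose cores before blending.

```c
/* "RGBA20FP" isn't a standard format, so this assumes a hypothetical 20-bit
 * float per channel laid out as sign(1)/exponent(8)/mantissa(11), i.e. the
 * top 20 bits of an IEEE 754 binary32 value, truncated toward zero. */
#include <stdint.h>
#include <string.h>
#include <stdio.h>

/* Encode binary32 -> assumed 20-bit float (drop the low 12 mantissa bits). */
uint32_t f32_to_f20(float f)
{
    uint32_t bits;
    memcpy(&bits, &f, sizeof bits);   /* reinterpret without strict-aliasing UB */
    return bits >> 12;
}

/* Decode back to binary32 by zero-filling the dropped mantissa bits. */
float f20_to_f32(uint32_t f20)
{
    uint32_t bits = f20 << 12;
    float f;
    memcpy(&f, &bits, sizeof f);
    return f;
}

int main(void)
{
    float r = 0.7313f;
    printf("%f -> 0x%05x -> %f\n", r, (unsigned)f32_to_f20(r),
           f20_to_f32(f32_to_f20(r)));
    return 0;
}
```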

Do you think what I described above would have been more likely to happen if we had stayed with OpenGL?

I was asking because... games look like crap today because of all the damn lossy compression, all the aliasing (especially in older games, since NVIDIA took away the option to force trilinear mipmapping), and because of low-precision formats (single-precision FP being used for AA or anything else doesn't help either).
 

f1sherman

Platinum Member
Apr 5, 2011
Anarchist420 said:
I was asking because... games look like crap today because of all the damn lossy compression, all the aliasing (especially in older games, since NVIDIA took away the option to force trilinear mipmapping), and because of low-precision formats (single-precision FP being used for AA or anything else doesn't help either).

Not all games. Just AA titles :)
Check Sine Mora, Alan Wake, Deadlight, Trine 2. I don't think games looking bad is a technical limitation; it's more a matter of poor implementation of current techniques plus color blindness.