Is early z really worth as much as its heavy use suggests?

Anarchist420

Diamond Member
Feb 13, 2010
I think it's ridiculous that it matters so much for several reasons:

The best type of z-buffer (32-bit [1] fixed-point logarithmic) can't be used with it, since writing a custom depth value from the shader disables early z.

There will still be a maximum of 4 zixels per clock with a 32-bit depth buffer.

If it could be forced (via a checkbox in the control panel) in place of the w-buffer, that would solve z-fighting issues in games that used the w-buffer.

IIRC, Nvidia made the drivers so that early z occlusion culling couldn't even be disabled in RivaTuner when the GeForce 8 series first came out.
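For what it's worth, the appeal of a logarithmic z-buffer is that it spreads precision evenly in relative terms across the whole view range, instead of piling almost all of it up near the near plane the way a standard 1/z-style buffer does. A minimal sketch of the mapping (the function name and the near/far values are just illustrative, not from any API):

```python
import math

def log_depth(z, near, far):
    """Map eye-space depth z in [near, far] to [0, 1] logarithmically.

    Each doubling of distance consumes the same number of depth codes,
    so relative precision stays constant across the whole range.
    """
    return math.log(z / near) / math.log(far / near)

# With near = 0.1 and far = 10000, points at 1 and 2 units get the
# same code spacing as points at 1000 and 2000 units.
```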

One of the worst things about DX was that it took until DX10 for 32-bit depth buffers to be part of its specification.

Anyway, this post isn't meant to be a rant against DX, but I think it illustrates part of why it sucked: it only required 24-bit depth precision until DX10.
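To put rough numbers on the 24-bit vs. 32-bit difference: with a standard hyperbolic (1/z-style) fixed-point depth buffer, the eye-space distance covered by one depth LSB grows with the square of the distance. A back-of-the-envelope sketch (the function name and the near/far values are mine, purely for illustration):

```python
def depth_step(z, near, far, bits):
    # Eye-space distance covered by one LSB of an N-bit fixed-point
    # hyperbolic z-buffer at eye depth z (derivative of the inverse
    # of d = far/(far - near) * (1 - near/z), divided by 2**bits).
    return z * z * (far - near) / (far * near) / (2 ** bits)

# Example: with near = 0.1 and far = 10000, at z = 5000 the step is
# roughly 15 units with 24 bits but well under 0.1 units with 32 bits.
```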

[1] Actually, 64 bits would be even better, but you get my point. :)