the problem is that they looked better on a geforce 3/4 (xbox)/fx.
the other problem is that i don't see games with things like that anymore, because depth buffers linear in eye space (w-buffers) haven't been used in pc games since the fx era, or if they have, only rarely.
am i the only one who thinks things look better when depth is linear in eye space?
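for anyone who hasn't compared the two, here's a rough c sketch of what i mean (the near/far planes are just made-up numbers for illustration): a standard perspective z-buffer stores roughly 1/z and burns most of its precision right at the near plane, while a w-buffer stores eye-space depth remapped linearly.

#include <stdio.h>

int main(void)
{
    /* hypothetical near/far planes, picked only to show the precision spread */
    const float n = 1.0f, f = 10000.0f;

    /* a few eye-space depths between near and far */
    const float z_eye[] = { 1.0f, 2.0f, 10.0f, 100.0f, 1000.0f, 10000.0f };
    const int count = (int)(sizeof(z_eye) / sizeof(z_eye[0]));

    printf("%10s  %14s  %14s\n", "eye z", "z-buffer (z/w)", "w-buffer");
    for (int i = 0; i < count; i++) {
        const float z = z_eye[i];
        /* standard d3d-style perspective z-buffer value in [0,1]:
           non-linear, nearly all precision ends up right after the near plane */
        const float zbuf = (f / (f - n)) * (1.0f - n / z);
        /* w-buffer value: eye-space depth remapped linearly to [0,1] */
        const float wbuf = (z - n) / (f - n);
        printf("%10.1f  %14.6f  %14.6f\n", z, zbuf, wbuf);
    }
    return 0;
}

run it and the z/w column is already at ~0.9 just ten units past a near plane of 1, while the w column stays evenly spread over the whole range.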
remember how good the original unreal engine games looked? i always thought ut99 looked amazing on a voodoo2, and that look couldn't be preserved even on dx9 hardware: you either needed depth range hacks or had to replicate the w-buffer via pixel shaders, which would've been expensive.
i realize that sm 3.0 brought some useful things, but microsoft's min specs have been a failure IMO because of how long they ignored things like direct depth buffer access and depth buffer precision. maybe those weren't important, but IHVs shouldn't have to comply with any minimum at all. left to themselves, nv might have dropped the w-buffer after the geforce fx anyway, or gone tile based, or done everything in software, or chosen to follow ms's min standards all along, or none of the above.
but that's why minimum specs aren't all that great in my opinion. they make things look crappy, especially since we've had things like TWIMTBP and now we also have GE.
microsoft has been quite the patent-trolling bully for too long, and their min specs pretty much raised ATi from the dead in 2002, since R300's transistor budget was extremely performance-biased in my opinion.
