Anarchist420
Diamond Member
1. Did Flipper actually have a hardware RGBA8 mode that it just couldn't use because of RAM limitations?
2. Did Flipper do T&L at 32-bit precision, or were its calculations 24-bit?
3. How did the PS1 clip things? It didn't have a hardware z-buffer IIRC, but it generally didn't look like there was much precision far away the way there was in 3D Saturn games... I know the Saturn didn't have a hardware depth buffer either, no matter how much it looked like it used a w-buffer.
4. Did most PS2 games use the 32-bit z-buffer mode? I was guessing so, since a lot of them looked like they used logarithmic depth buffering (Spider-Man: Web of Shadows, the part of Lament of Innocence where you walk up to the "throne room", and Devil May Cry 1-3 all looked like they had pretty even depth distribution), although I realize I could be guessing wrong, which is why I'm asking.
5. Did the Dreamcast's infinite clip planes mean there would be disproportionately more precision far away, or did it mean it would look like a log depth buffer was used? I know it didn't use a depth buffer but instead did depth testing, and some games looked pretty even while others looked like there was way more precision far away... none of them looked like they had more precision close up.
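To show what I mean by "even depth distribution", here's a rough Python sketch (my own made-up near/far numbers, not anything measured from actual console hardware) of how big a world-space step one increment of a 24-bit depth buffer covers under the three encodings: standard hyperbolic z, linear w, and log depth.

```python
import math

# Made-up example numbers: near/far plane distances and a 24-bit
# fixed-point depth buffer (one buffer increment = lsb).
n, f = 1.0, 10000.0
lsb = 1.0 / (2**24 - 1)

# World-space depth step covered by one buffer increment at view depth z,
# i.e. lsb divided by the derivative of each encoding.

def z_step(z):
    """Standard z-buffer: d = f(z-n) / (z(f-n)) -- hyperbolic in z."""
    return lsb * z * z * (f - n) / (f * n)

def w_step(z):
    """W-buffer: d = (z-n) / (f-n) -- linear, so the step is constant."""
    return lsb * (f - n)

def log_step(z):
    """Log depth: d = ln(z/n) / ln(f/n) -- step grows only linearly with z."""
    return lsb * z * math.log(f / n)

for z in (2.0, 100.0, 5000.0):
    print(f"z={z:7.1f}  z-buf {z_step(z):.3e}  w-buf {w_step(z):.3e}  log {log_step(z):.3e}")
```

The z-buffer step grows with z squared, so far from the camera it's on the order of a whole world unit with these numbers, while the w-buffer step stays constant and the log step is still tiny. That quadratic blow-up far away is exactly the uneven distribution I'm talking about.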
Anyway, I've generally never been a fan of z-buffers (they've had their uses, and sometimes they looked good enough, like in MDK2, some other OpenGL games, and Serious Sam 2). I love 32-bit fixed-point log depth buffers, though, and games from back in the day that used the w-buffer looked a lot better depth-wise than what DX9 games were limited to... all DX9 games were stuck with partial-precision z-buffers, so nothing looked quite like the original Unreal Engine did.
And of course, 32-bit fixed-point log z-buffers won't be used much in the future either (even though the FPS loss from not being able to use early-z doesn't turn a game from smooth into a slideshow)... The Unreal DX1x renderer only uses a 32-bit float 1 - z buffer, but at least the DLLs are free to the end user (the guy who was kind enough to spend his time making them didn't charge anything 🙂). However, I wish the guy who did the OpenGL renderer would make one final update so it would use a 32-bit fixed-point log depth buffer on modern hardware, but that will probably never happen... I can only remember how sweet the original Unreal and UT99 looked on the Diamond Monster 3D II I had back in the day. Other than that, there was no properly rotated-grid AA, there were some dithering artifacts (due to the lack of a full-precision RGBA buffer), textures were small, and lighting and depth calculations weren't done per pixel when they all should have been.
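For what it's worth, here's a quick sketch (just counting float32 bit patterns in Python, nothing specific to the Unreal renderers) of why a 32-bit float 1 - z / reversed-z buffer ends up with a quasi-logarithmic distribution: float32 has vastly more representable values near 0 than near 1, and reversing z puts the far plane at 0, right where a standard z mapping crams almost all of the scene.

```python
import struct

def float_steps_between(a, b):
    """Count distinct float32 values in [a, b) by comparing bit patterns:
    for non-negative floats, consecutive values have consecutive integer bits."""
    to_bits = lambda x: struct.unpack('<I', struct.pack('<f', x))[0]
    return to_bits(b) - to_bits(a)

# Standard z maps near->0 and far->1; reversed z maps near->1 and far->0.
# Distinct float32 values available in the first vs last 1% of [0, 1]:
print("values in [0.00, 0.01):", float_steps_between(0.0, 0.01))
print("values in [0.99, 1.00):", float_steps_between(0.99, 1.0))
```

The first 1% of the range has thousands of times more representable values than the last 1%, so flipping which end of the range the far plane lands on lets the float exponent do roughly what a log encoding does.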