Originally posted by: ViRGE
At the very least, 128-bit color is really more about internal precision. With only 32 bits, you start accumulating errors as you do functions and blend layers. 128 bits should, for the most part, negate those errors. As for the dithering, I'm not sure what to say. Since monitors are analogue, there's nothing stopping them from just using the whole 128-bit signal, is there?

The DAC on the Radeon 9700 Pro dithers the image to 24-bit or 30-bit color before it is displayed.
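A rough way to see the error accumulation being talked about here: carry one colour channel through a handful of darkening passes and then brighten it back up, once in floating point and once with the value rounded to 8 bits after every pass, the way a 32-bit framebuffer would store it. The pass count and the 0.5/2.0 factors below are made up purely for illustration; this is a sketch of the effect, not how any particular card schedules its math.

#include <stdio.h>
#include <math.h>

/* Round a 0..1 channel value to n-bit precision, like writing it back
   to an n-bit-per-channel framebuffer between passes. */
static double quantize(double v, int bits)
{
    double levels = (double)((1 << bits) - 1);
    return floor(v * levels + 0.5) / levels;
}

int main(void)
{
    /* Darken a channel over several passes, then brighten it back.
       In pure floating point the two cancel; when every intermediate
       result is rounded to 8 bits, the rounding errors accumulate. */
    double x_float = 0.80;   /* starting channel value */
    double x_8bit  = 0.80;

    for (int i = 0; i < 8; i++) {          /* e.g. fog / shadow passes */
        x_float *= 0.5;
        x_8bit   = quantize(x_8bit * 0.5, 8);
    }
    for (int i = 0; i < 8; i++) {          /* e.g. overbright / gamma-up passes */
        x_float *= 2.0;
        x_8bit   = quantize(x_8bit * 2.0, 8);
        if (x_8bit > 1.0) x_8bit = 1.0;    /* channel can't exceed full scale */
    }

    printf("kept in float the whole way : %.4f\n", x_float);
    printf("rounded to 8 bits each pass : %.4f\n", x_8bit);
    return 0;
}

Kept in float, the channel comes back to 0.8 exactly; rounded to 8 bits each pass, it crushes to 1/255 on the way down and clamps to full white on the way back up.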
"Even if the output is ultimately 32-bit, those 32 bits will be much more accurate if they were processed at 128 bits internally."

What exactly is more accurate?
Originally posted by: Vespasian
The DAC on the Radeon 9700 Pro dithers the image to 24-bit or 30-bit color before it is displayed.

Uh, does the Radeon 9700 Pro actually display 30-bit color? I know the Matrox card does in Photoshop with a special plug-in, but I thought ATI's Radeon 9700 Pro only did 24 bits "externally," while doing 128-bit processing internally.
Originally posted by: jliechty
Uh, does the Radeon 9700 Pro actually display 30-bit color? I know the Matrox card does in Photoshop with a special plug-in, but I thought ATI's Radeon 9700 Pro only did 24 bits "externally," while doing 128-bit processing internally.

I'm pretty sure that both the Radeon 9700 and the Parhelia can do 30-bit color and 2-bit alpha.
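For what it's worth, "30-bit color and 2-bit alpha" is a 10:10:10:2 layout, which still fits a pixel into the same 32 bits as ordinary 8:8:8:8. A minimal packing sketch; the field order (alpha in the top two bits) is just one possible convention, not necessarily what either card actually uses:

#include <stdio.h>
#include <stdint.h>

/* Pack 10-bit red/green/blue and 2-bit alpha into one 32-bit word.
   Putting alpha in the top two bits is an arbitrary choice here. */
static uint32_t pack_2_10_10_10(unsigned r, unsigned g, unsigned b, unsigned a)
{
    return ((uint32_t)(a & 0x3)   << 30) |
           ((uint32_t)(r & 0x3FF) << 20) |
           ((uint32_t)(g & 0x3FF) << 10) |
           ((uint32_t)(b & 0x3FF));
}

int main(void)
{
    /* 10 bits per channel gives 1024 shades instead of the 256 you get
       from 8 bits, while the pixel still fits in 32 bits. */
    uint32_t p = pack_2_10_10_10(1023, 512, 0, 3);
    printf("packed pixel: 0x%08X\n", p);
    printf("levels per channel: 8-bit = %d, 10-bit = %d\n", 1 << 8, 1 << 10);
    return 0;
}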
Originally posted by: Vespasian
What exactly is more accurate?

"If you think 16-bit and 32-bit color look the same, no offence, but you're blind."

Oh, I meant 16 bits per color channel, not old-fashioned 16-bit color.
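To make that distinction concrete, here's a quick comparison of how many bits a few layouts spend per pixel. The labels are only illustrative, not any card's actual mode list: "old-fashioned 16-bit color" spreads 16 bits across the whole pixel (5:6:5), 16 bits per channel is a 64-bit pixel, and 128-bit color is 32 bits per channel.

#include <stdio.h>

int main(void)
{
    /* bits per channel (R, G, B, A) for a few pixel layouts */
    struct { const char *name; int r, g, b, a; } fmt[] = {
        { "16-bit colour (5:6:5)",       5,  6,  5,  0 },
        { "32-bit colour (8:8:8:8)",     8,  8,  8,  8 },
        { "16 bits per channel",        16, 16, 16, 16 },
        { "128-bit colour (FP32/chan)", 32, 32, 32, 32 },
    };

    for (int i = 0; i < 4; i++)
        printf("%-28s -> %3d bits per pixel\n",
               fmt[i].name, fmt[i].r + fmt[i].g + fmt[i].b + fmt[i].a);
    return 0;
}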
Originally posted by: ViRGE
AbRASiON, the final, external color scheme won't have any effect on speed. By definition, all of the rendering work is/will be done in 128-bit mode, only downshifting to 32 bits for display once everything is rendered. As such, 128-bit rendering is going to be initially painful, much like 32-bit was on the original TNT.
There aren't enough 128-bit games yet to make a judgement, but it may be another cycle or two before 128-bit color is truly reasonable.
Once that point does arrive, though, expect to see 16-bit rendering disappear completely; in 128-bit mode there's no benefit to outputting in 16 bits, and for cards that don't do 128-bit mode, 32 bits will be mandatory in order to keep the artifacting to a minimum given the additional texture layers.
P.S. On a side note, for those of you who want to look for it, a hardware review site put out a great example of why we need more internal bits for rendering. It shows pictures where the gamma has gone to hell and back, and the results of not having enough bits.
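As a sketch of that final "downshift for display" step: the pixel is carried as four 32-bit floats while rendering and only clamped and rounded to 8 bits per channel at the very end. Purely illustrative; real hardware does this in the output path, not in C, and the sample values are invented.

#include <stdio.h>
#include <stdint.h>
#include <math.h>

/* A "128-bit" pixel: one 32-bit float per channel, nominally 0.0..1.0
   but free to go outside that range while passes are being combined. */
typedef struct { float r, g, b, a; } pixel128;

static uint8_t to_8bit(float v)
{
    if (v < 0.0f) v = 0.0f;          /* clamp the extra range away */
    if (v > 1.0f) v = 1.0f;
    return (uint8_t)floorf(v * 255.0f + 0.5f);   /* round to the nearest 8-bit level */
}

int main(void)
{
    pixel128 p = { 0.731f, 1.42f, -0.05f, 1.0f };   /* overbright green, negative blue */

    printf("display value: R=%u G=%u B=%u A=%u\n",
           to_8bit(p.r), to_8bit(p.g), to_8bit(p.b), to_8bit(p.a));
    return 0;
}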
Originally posted by: AbRASiON
Originally posted by: ViRGE
AbRASiON, the final, external color scheme won't have any effect on speed.

I am aware this shouldn't make much difference - I never claimed it did. However, 3dfx processed 32-bit internally and downsampled to 22-bit before sending the data out on the V3, which resulted in more speed and still decent quality.
Originally posted by: ViRGE
That's not quite true. There is virtually no speed benefit to downsampling at the end of the render process. Since all your time/resources are used in rendering, all downsampling accomplishes is to reduce the number of colors used, which in the case of newer cards is to match what the DACs and LCDs can do.

Well, that's not quite true either. You save bandwidth when dealing with a 16-bit framebuffer and 16-bit textures. Yeah, when dealing with 32-bit textures and a 32-bit front/back buffer, you don't really gain anything by dithering to 16-bit for output, but why would you do that?
Originally posted by: ViRGE
KnightBreed, my point exactly: there's no good reason (speed or otherwise) to downsample to 16 bits at the end. But for 128-bit, you have to reduce it to 32-bit, since that's the highest that's "supported".

In some circumstances, having a 16-bit framebuffer has its advantages: a smaller memory footprint and faster buffer reads/writes.
"What's the benefit when the output is 24-bit (8,8,8) or 30-bit (10,10,10) color?"

Better accuracy and fewer rounding errors on the final colour value, especially with heavy alpha blending and multipass rendering.
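To put a number on what the extra output bits buy (toy arithmetic only): going from 8 to 10 bits per channel quarters the step between adjacent shades, which is what reduces visible banding in slow gradients.

#include <stdio.h>

int main(void)
{
    /* Number of distinct shades per channel and the size of one step,
       for 24-bit (8,8,8) versus 30-bit (10,10,10) output. */
    int bits[] = { 8, 10 };

    for (int i = 0; i < 2; i++) {
        int levels = 1 << bits[i];
        printf("%2d bits/channel: %4d shades, step = %.5f of full scale\n",
               bits[i], levels, 1.0 / (levels - 1));
    }
    return 0;
}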
"I'd much rather have 70 fps in 16-bit than 35 in 32-bit - but it's a long, huge argument."

That argument died in the 1990s when the first 32-bit cards arrived. Nowadays there is simply no point in using 16-bit colour because of all the memory-bandwidth-saving techniques that are in place, both in hardware and in the game engines. In fact, some cards lose their bandwidth-saving techniques in 16-bit mode and can actually end up with worse performance.

"However 3dfx processed 32-bit internally and downsampled to 22-bit before sending the data out on the V3, which resulted in more speed and still decent quality."

No, they didn't, and in fact doing so would make no sense at all: they'd have the performance hit of 32-bit rendering but the colour output of only 22 bits. 3dfx rendered everything in 16-bit mode and then used a postfilter (very similar to FSAA) to upsample the image as it left the framebuffer and went to the monitor. This process had no performance hit at all because it could be done with just a few logic circuits, and it didn't require any core or memory resources.

"16-bit WILL be an output option; it will look even worse, but it will be an output option."

Not always. It'll eventually disappear, just like 8-bit 3D output has largely disappeared. With today's cards you'd be hard-pressed to get any 3D game to run under 8-bit colour, even if you try to force it manually. And when 128-bit rendering becomes standard enough, there will be little point in having 16-bit rendering. All of the rendering pathways will eventually be optimised for full 128-bit rendering, just as they have been optimised for 32-bit output in today's cards.

"However for most games you don't notice it if you're playing rather than observing."

Uh, yes you do. With 32-bit colour you can see artifacts even with something as simple as a spotlight shining through fog. Now imagine something like a flashing campfire inside fog, with multiple spotlights intersecting each other while shining through both the fire and the fog. 32-bit colour simply doesn't have the precision to handle something like that.
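A very loose sketch of the general idea behind that kind of postfilter: keep 16-bit 5:6:5 pixels in the framebuffer, and on the way out expand each one to 8 bits per channel and average it with its horizontal neighbours to smooth the dither pattern. This is only a guess at the technique in general terms; the filter kernel, the sample scanline and the expand() helper below are invented for illustration and are not 3dfx's actual circuit.

#include <stdio.h>
#include <stdint.h>

/* Expand one channel of a 5:6:5 pixel to 8 bits by replicating its top bits. */
static unsigned expand(unsigned v, int bits)
{
    return (v << (8 - bits)) | (v >> (2 * bits - 8));
}

int main(void)
{
    /* A short scanline of 16-bit 5:6:5 pixels (think: a dithered gradient). */
    uint16_t scan[6] = { 0x18E3, 0x2124, 0x2965, 0x31A6, 0x39E7, 0x4228 };

    for (int x = 1; x < 5; x++) {
        unsigned r = 0, g = 0, b = 0;
        /* average the pixel with its left and right neighbours after
           expanding each channel to 8 bits */
        for (int dx = -1; dx <= 1; dx++) {
            uint16_t p = scan[x + dx];
            r += expand((p >> 11) & 0x1F, 5);
            g += expand((p >>  5) & 0x3F, 6);
            b += expand( p        & 0x1F, 5);
        }
        printf("x=%d -> R=%3u G=%3u B=%3u\n", x, r / 3, g / 3, b / 3);
    }
    return 0;
}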
Originally posted by: ViRGE
If you're going from 32 to 16 at the end, there isn't a speed benefit. Otherwise you're basically rendering in 16-bit mode in the first place, which is the reason for the speed boost.

If you have a 32-bit framebuffer and are calculating at 32-bit internally, of course you aren't going to get a speed benefit from dithering to 16-bit. When did I say otherwise? I've been under the impression that GPUs execute code natively in 32-bit. Executing shader or color ops in 16-bit will not give you a speed boost; the speed boost comes from the smaller memory footprint and faster buffer reads from the higher available bandwidth.
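Some back-of-the-envelope numbers for the footprint/bandwidth point. The buffer set (front + back + Z) and the overdraw figure below are assumptions picked just to make the arithmetic concrete; the point is only that halving the bits per pixel halves both the resident size and the bytes moved per frame.

#include <stdio.h>

int main(void)
{
    const int w = 1024, h = 768;
    const int buffers = 3;              /* front + back + Z (assumed) */
    const double overdraw = 2.0;        /* assumed average overdraw    */

    int depths[] = { 16, 32 };          /* bits per pixel */

    for (int i = 0; i < 2; i++) {
        double bytes_pp  = depths[i] / 8.0;
        double footprint = (double)w * h * buffers * bytes_pp / (1024 * 1024);
        /* *2: each overdrawn pixel is roughly one read + one write */
        double per_frame = (double)w * h * overdraw * bytes_pp * 2 / (1024 * 1024);
        printf("%2d-bit buffers: %5.2f MB resident, ~%5.2f MB traffic per frame\n",
               depths[i], footprint, per_frame);
    }
    return 0;
}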