Help me understand 128-bit floating-point color precision...

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
At the very least, 128bit color is really more about internal precision. With only 32bits, you start accumulating errors as you apply functions and blend layers. 128bits should, for the most part, negate those errors. As for the dithering, I'm not sure what to say. Since monitors are analogue, there's nothing stopping them from just using the whole 128bit signal, is there?
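To put rough numbers on that, here's a tiny stand-alone C sketch (made-up values, not any real card's pipeline): one colour channel picks up a small 0.4/255 light contribution over 20 passes, once truncated back to 8 bits per channel after every pass (the 32bit RGBA case) and once kept in floating point the whole way (the 128bit case).

```c
/* Illustration only, not any card's actual pipeline: add a dim 0.4/255
   light contribution to one colour channel 20 times, once truncated to
   8 bits after every pass (32bit RGBA) and once kept in float (128bit). */
#include <stdio.h>

int main(void)
{
    unsigned char c8 = 100;    /* channel truncated to 8 bits every pass */
    float         cf = 100.0f; /* same channel kept in floating point    */

    for (int i = 0; i < 20; i++) {
        c8 = (unsigned char)(c8 + 0.4f); /* 100.4 truncates back to 100 each time */
        cf = cf + 0.4f;                  /* the small contributions survive       */
    }

    printf("8-bit per pass: %d   float per pass: %.1f\n", c8, cf);
    /* prints: 8-bit per pass: 100   float per pass: 108.0 */
    return 0;
}
```

The 8-bit path never moves because every pass rounds the contribution away, while the float path ends up 8 shades brighter - exactly the kind of difference that shows up as banding.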
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
I think 128bit is a waste of time. Keep in mind that it's actually 4x32bit: one 32bit floating point number per channel (red, green, blue, and alpha transparency). And games don't use dithering - they haven't since we left 256 colors behind. I think 16bit color's all we need (my opinion based on little fact). 24bit color was lacking since there were times when color banding was visible due to round-off errors.
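As a size sketch only (not any particular card's memory layout), the "4x32bit" point looks like this in C:

```c
/* Size sketch: a "128bit" pixel is four IEEE single-precision floats,
   one per channel, versus four bytes for ordinary 32bit RGBA. */
#include <stdio.h>

struct rgba32  { unsigned char r, g, b, a; }; /*  4 bytes =  32 bits */
struct rgba128 { float         r, g, b, a; }; /* 16 bytes = 128 bits */

int main(void)
{
    printf("32bit pixel:  %u bytes\n", (unsigned)sizeof(struct rgba32));
    printf("128bit pixel: %u bytes\n", (unsigned)sizeof(struct rgba128));
    return 0;
}
```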
 

SexyK

Golden Member
Jul 30, 2001
1,343
4
76
If you think 16-bit and 32-bit color look the same, no offence, but you're blind. Anyway, the point of that precision, as ViRGE said, is to avoid the accumulated rounding errors internally that can really mess up the output. Even if the output is ultimately 32-bit, those 32 bits will be much more accurate if they were processed at 128 bits internally.

Kramer
 
Aug 10, 2001
10,420
2
0
Originally posted by: ViRGE
At the very least, 128bit color is really more about internal precision. With only 32bits, you start accumulating errors as you apply functions and blend layers. 128bits should, for the most part, negate those errors. As for the dithering, I'm not sure what to say. Since monitors are analogue, there's nothing stopping them from just using the whole 128bit signal, is there?
The DAC on the Radeon 9700 Pro dithers the image to 24-bit or 30-bit color before it is displayed.
 

ProviaFan

Lifer
Mar 17, 2001
14,993
1
0
Originally posted by: Vespasian
Originally posted by: ViRGE
At the very least, 128bit color is really more about internal precision. With only 32bits, you start accumulating errors as you apply functions and blend layers. 128bits should, for the most part, negate those errors. As for the dithering, I'm not sure what to say. Since monitors are analogue, there's nothing stopping them from just using the whole 128bit signal, is there?
The DAC on the Radeon 9700 Pro dithers the image to 24-bit or 30-bit color before it is displayed.
Uh, does the Radeon 9700 Pro actually display 30 bit color? I know the Matrox card does in Photoshop with a special plug-in, but I thought that ATI's Radeon 9700 Pro only did 24 bits "externally," while doing 128 bit processing internally.
 
Kramer

Aug 10, 2001
10,420
2
0
Originally posted by: jliechty
Originally posted by: Vespasian
Originally posted by: ViRGE
At the very least, 128bit color is really more about internal precision. With only 32bits, you start accumulating errors as you apply functions and blend layers. 128bits should, for the most part, negate those errors. As for the dithering, I'm not sure what to say. Since monitors are analogue, there's nothing stopping them from just using the whole 128bit signal, is there?
The DAC on the Radeon 9700 Pro dithers the image to 24-bit or 30-bit color before it is displayed.
Uh, does the Radeon 9700 Pro actually display 30 bit color? I know the Matrox card does in Photoshop with a special plug-in, but I thought that ATI's Radeon 9700 Pro only did 24 bits "externally," while doing 128 bit processing internally.
I'm pretty sure that both the Radeon 9700 and the Parhelia can do 30-bit color and 2-bit alpha.
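For reference, 30-bit colour plus 2-bit alpha still fits in one 32-bit word. A hypothetical 10:10:10:2 packing might look like the following in C - the field order is made up for illustration and isn't a claim about either card's actual format:

```c
/* Hypothetical 10:10:10:2 packing -- three 10-bit colour channels plus a
   2-bit alpha in one 32-bit word.  Field order is illustrative only. */
#include <stdio.h>
#include <stdint.h>

static uint32_t pack_1010102(unsigned r, unsigned g, unsigned b, unsigned a)
{
    return ((uint32_t)(r & 0x3FF))         /* 10 bits red   */
         | ((uint32_t)(g & 0x3FF) << 10)   /* 10 bits green */
         | ((uint32_t)(b & 0x3FF) << 20)   /* 10 bits blue  */
         | ((uint32_t)(a & 0x003) << 30);  /*  2 bits alpha */
}

int main(void)
{
    /* 1023 is full intensity on a 10-bit channel (255 is full on an 8-bit one) */
    printf("0x%08lX\n", (unsigned long)pack_1010102(1023, 512, 0, 3));
    return 0;
}
```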
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: Vespasian
Even if the output is ultimately 32-bit, those 32 bits will be much more accurate if they were processed at 128 bits internally.
What exactly is more accurate?

The color itself is more accurate. Banding and other artifacts are the results of errors from rendering due to a lack of precision. This may make all the difference between RGB 210, 32, 17 and RGB 215, 30, 19, where the second set of values is the accurate one, and the first is the result of so many rounding errors.

Or in even simpler terms: we want to add 1546 and 5263 together. Their sum is precisely 6809, but if we round them first, we get different results. Rounded to the nearest 10, we get 6810; to the nearest 100, 6800; and to the nearest 1000, 7000. The more bits of precision we keep, the closer we stay to the correct answer.
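The same toy arithmetic in C, for anyone who wants to play with the numbers (round_to is just a helper invented for this example):

```c
/* Round the inputs to fewer digits first (less precision) and the sum
   drifts further from the exact answer. */
#include <stdio.h>

static int round_to(int x, int step)  /* round x to the nearest multiple of step */
{
    return ((x + step / 2) / step) * step;
}

int main(void)
{
    int a = 1546, b = 5263;
    printf("exact:            %d\n", a + b);                                 /* 6809 */
    printf("rounded to 10s:   %d\n", round_to(a, 10)   + round_to(b, 10));   /* 6810 */
    printf("rounded to 100s:  %d\n", round_to(a, 100)  + round_to(b, 100));  /* 6800 */
    printf("rounded to 1000s: %d\n", round_to(a, 1000) + round_to(b, 1000)); /* 7000 */
    return 0;
}
```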
 

FishTankX

Platinum Member
Oct 6, 2001
2,738
0
0
Vespasian: The main purpose of the high-precision 128 bit floating point internals is that when you're blending 15 or 16 textures together, you don't get as many rounding errors by the time you blend them all down to 32 bit. These rounding errors are a big part of why graphics today don't look as realistic as they should. They've migrated to 128 bit color not because your eyes can see these colors, but because when you're mixing all of those textures at 128 bit you keep more precision, and thus you don't get the errors from rounding down to 32 bit that you would with 32 bit mixing.
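A made-up example in the same spirit: modulating one channel by a roughly 90% grey texture 16 times, truncating to 8 bits after every layer versus keeping the running value in float and converting only once at the end. Illustrative numbers only, not a real renderer:

```c
/* Modulate one channel by a ~90% grey texture 16 times: truncate to 8 bits
   after every layer vs. keep the running value in float and round once. */
#include <stdio.h>

int main(void)
{
    unsigned char c8  = 200;
    float         cf  = 200.0f;
    const float   tex = 230.0f / 255.0f; /* a ~90% grey texture layer */

    for (int i = 0; i < 16; i++) {
        c8 = (unsigned char)(c8 * tex); /* truncated after every layer   */
        cf = cf * tex;                  /* rounded only once, at the end */
    }

    printf("per-layer 8-bit: %d   float then convert: %d\n",
           c8, (int)(cf + 0.5f));
    /* prints: per-layer 8-bit: 35   float then convert: 38 */
    return 0;
}
```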

 

AbRASiON

Senior member
Oct 10, 1999
861
4
81
16 bits looks fine to me in most games...

but on occasion I can see dithering...

I believe 128bit precision (128 bits total, 32 per channel, or however it's done) - maximum quality while it's being processed - stops dithering internally completely. The final output, though, should only be 32bits max; you just won't see any difference from 32bit -> 48 / 64 / 128 bit in the external image.
HOWEVER, a properly done SIXTEEN BIT external image can still look damned good too, although most disagree.

The final output could be 22bit like 3dfx pushed, or even 16bit, and it would look fine.

The problem with 16bit now is that everything is optimised for 32bit, so it makes 16bit look worse than it is.


I'd much rather have 70fps in 16bit than 35 in 32bit - but it's a long huge argument - if I was running at 75 fps in 32bit and 95 in 16 I'd take 32.
It depends if you have a budget card or what :) - for example I play War 3 in 16bit at 1024 on an ORIGINAL Radeon and it's bloody smooth - I was impressed :)

 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
AbRASiON, the final, external color scheme won't have any effect on speed. By definition, all of the rendering work is/will be done in 128bit mode, only downshifting to 32bits for display once everything is rendered. As such, 128bit rendering is going to be initially painful, much like 32bits was on the original TNT. There aren't enough 128bit games yet to make a judgement, but it may be another cycle or two before 128bit color is truly reasonable. Once that point does happen though, expect to see 16bit rendering disappear completely; in 128bit mode, there's no benefit to outputting in 16bits, and for cards that don't do 128bit mode, 32bits will be mandatory in order to keep the artifacting to a minimum due to the additional texture layers.

PS: On a side note, for those of you who want to look for it, a hardware review site put out a great example of why we need more internal bits for rendering. It shows pictures where the gamma has gone to hell and back, and the results of not having enough bits.
 

AbRASiON

Senior member
Oct 10, 1999
861
4
81
Originally posted by: ViRGE
AbRASiON, the final, external color scheme won't have any effect on speed.


I am aware this shouldn't make much difference - I never claimed it did.
However 3dfx processed 32bit internally and downsampled to 22bit before sending the data out on the V3 which resulted in more speed and still decent quality.

By definition, all of the rendering work is/will be done in 128bit mode, only downshifting to 32bits for display once everything is rendered. As such, 128bit rendering is going to be initially painful, much like 32bits was on the original TNT.

I totally agree - it's going to be crazy painful. In fact I think they will stick to 48 / 64 / 96 bit as much as possible to avoid using full 128bit rendering across the board when it's not needed most of the time (unless you're doing like 27 passes per frame or some such).
SOME games right now would be slightly better with more than 32 bit (War 3's fog effects in undead-affected maps dither quite badly in the fog of war, even in 32bit)
...slightly being the key word.
Oh, and 32bit colour rendering in MY opinion wasn't a viable option until the GeForce 1 DDR - that's when 32bit was truly, completely viable at reasonable speeds and full detail.


There aren't enough 128bit games yet to make a judgement, but it may be another cycle or two before 128bit color is truly reasonable.

I believe possibly 3 or more cycles.
The reason 32bit initially took so long to catch on is that it's literally 2x the work.
The TNT1 / TNT2 couldn't REALLY do 32bits at a decent speed, hence it wasn't until the GF1 DDR that it could be done.
That was 2 cycles back then - and we were at the beginning of the video technology upsurge.
Now we're slowly beginning to reach the "oh dear, we are kind of stuck for squeezing out more power from these things" stage (in relation to memory bandwidth).

Think about this: we've just started reaching the stage where we can render full detail at high resolution (1024 and higher) in 32bits "dead smooth" - in fact we can do a LITTLE bit more, in that we can waste* frames on AA and AF now.
Now, we're about to take 4x the performance away with one simple click - 32bit colour -> 128bit colour = flat on its ass. Even a 9700 would have trouble unless you turn off AA / AF and drop to 1024 or something...

* when I say AA and AF are a waste, that's my opinion; I find the difference negligible for the speed impact on most cards... 9700 excepted (I'd still not turn it on if I owned a 9700, though)



Once that point does happen though, expect to see 16bit rendering disappear completely; in 128bit mode, there's no benefit to outputting in 16bits, and for cards that don't do 128bit mode, 32bits will be mandatory in order to keep the artifacting to a minimum due to the additional texture layers.

Just plain wrong.
16bit WILL be an output option, it will look even worse, but it will be an output option.


PS: On a side note, for those of you who want to look for it, a hardware review site put out a great example of why we need more internal bits for rendering. It shows pictures where the gamma has gone to hell and back, and the results of not having enough bits.


However for most games you don't notice it if you're playing rather than observing.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: AbRASiON
Originally posted by: ViRGE
AbRASiON, the final, external color scheme won't have any effect on speed.


I am aware this shouldn't make much difference - I never claimed it did.
However 3dfx processed 32bit internally and downsampled to 22bit before sending the data out on the V3 which resulted in more speed and still decent quality.

That's not quite true. There is virtually no speed benefit to downsampling at the end of the render process. Since all your time/resources are used in rendering, all downsampling accomplishes is to reduce the number of colors used, which in the case of newer cards is to match what the DACs and LCDs can do.
 
KnightBreed

Jun 18, 2000
11,197
769
126
Originally posted by: ViRGE
Originally posted by: AbRASiON
Originally posted by: ViRGE
AbRASiON, the final, external color scheme won't have any effect on speed.


I am aware this shouldn't make much difference - I never claimed it did.
However 3dfx processed 32bit internally and downsampled to 22bit before sending the data out on the V3 which resulted in more speed and still decent quality.

That's not quite true. There is virtually no speed benefit to downsampling at the end of the render process. Since all your time/resources are used in rendering, all downsampling accomplishes is to reduce the number of colors used, which in the case of newer cards is to match what the DACs and LCDs can do.
Well, that's not quite true either. You save bandwidth when dealing with a 16-bit framebuffer and 16-bit textures. Yeah, when dealing with 32-bit textures and front/back buffer, you don't really gain anything by dithering to 16-bit for output, but why would you do that?
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: KnightBreed
Originally posted by: ViRGE
Originally posted by: AbRASiON
Originally posted by: ViRGE
AbRASiON, the final, external color scheme won't have any effect on speed.


I am aware this shouldn't make much difference - I never claimed it did.
However 3dfx processed 32bit internally and downsampled to 22bit before sending the data out on the V3 which resulted in more speed and still decent quality.

That's not quite true. There is virtually no speed benefit to downsampling at the end of the render process. Since all your time/resources are used in rendering, all downsampling accomplishes is to reduce the number of colors used, which in the case of newer cards is to match what the DACs and LCDs can do.
Well, that's not quite true either. You save bandwidth when dealing with a 16-bit framebuffer and 16-bit textures. Yeah, when dealing with 32-bit textures and front/back buffer, you don't really gain anything by dithering to 16-bit for output, but why would you do that?

KnightBreed, my point exactly: there's no good reason (speed or otherwise) to downsample to 16bits at the end. But for 128bit, you have to reduce it to 32bit, since that's the highest that's "supported".
 
KnightBreed

Jun 18, 2000
11,197
769
126
Originally posted by: ViRGE
Originally posted by: KnightBreed
Originally posted by: ViRGE
Originally posted by: AbRASiON
Originally posted by: ViRGE
AbRASiON, the final, external color scheme won't have any effect on speed.


I am aware this shouldn't make much difference - I never claimed it did.
However 3dfx processed 32bit internally and downsampled to 22bit before sending the data out on the V3 which resulted in more speed and still decent quality.

That's not quite true. There is virtually no speed benefit to downsampling at the end of the render process. Since all your time/resources are used in rendering, all downsampling accomplishes is to reduce the number of colors used, which in the case of newer cards is to match what the DACs and LCDs can do.
Well, that's not quite true either. You save bandwidth when dealing with a 16-bit framebuffer and 16-bit textures. Yeah, when dealing with 32-bit textures and front/back buffer, you don't really gain anything by dithering to 16-bit for output, but why would you do that?

KnightBreed, my point exactly: there's no good reason (speed or otherwise) to downsample to 16bits at the end. But for 128bit, you have to reduce it to 32bit, since that's the highest that's "supported".
In some circumstances, having a 16-bit framebuffer has its advantages. Smaller memory footprint and faster buffer reads/writes.
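Just to put rough numbers on the footprint side (back-of-the-envelope only; it assumes the Z buffer depth matches the colour depth, which is a simplification):

```c
/* Rough footprint comparison at 1024x768 with front, back and Z buffers.
   Assumes the Z buffer matches the colour depth -- a simplification. */
#include <stdio.h>

int main(void)
{
    const long w = 1024, h = 768;
    const long buffers = 3;                    /* front + back + Z            */
    long kb16 = w * h * 2 * buffers / 1024;    /* 2 bytes per pixel (16-bit)  */
    long kb32 = w * h * 4 * buffers / 1024;    /* 4 bytes per pixel (32-bit)  */
    printf("16-bit buffers: %ld KB   32-bit buffers: %ld KB\n", kb16, kb32);
    /* prints: 16-bit buffers: 4608 KB   32-bit buffers: 9216 KB */
    return 0;
}
```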
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
What's the benefit when the output is 24-bit (8,8,8) or 30-bit (10,10,10) color?
Better accuracy and fewer rounding errors on the final colour value, especially for heavy alpha blending and multipass rendering.

I'd much rather have 70fps in 16bit than 35 in 32bit - but it's a long huge argument
That argument was only valid in the 1990s when the first 32 bit cards arrived. Nowadays there is simply no point to using 16 bit colour because of all of the memory bandwidth saving techniques that are in place, both in hardware and in the game engines. In fact some cards lose their bandwidth saving techniques in 16 bit mode and can actually end up having worse performance.

However 3dfx processed 32bit internally and downsampled to 22bit before sending the data out on the V3 which resulted in more speed and still decent quality.
No they didn't and in fact doing so would make no sense at all. They'd have the performance hit from 32 bit rendering but the colour output of only 22 bits.

3dfx rendered everything in 16 bit mode and then used a post-filter (very similar to FSAA) to upsample the image as it left the framebuffer and went to the monitor. This process had no performance hit at all because it could be done with just a few logic circuits and it didn't require any core or memory resources.
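A very loose sketch of the general idea behind a dither-then-filter output stage - this is just the principle, not 3dfx's actual circuit:

```c
/* Very loose sketch of the idea (NOT 3dfx's actual circuit): a shade that
   doesn't exist at 5 bits per channel is ordered-dithered over a 2x2 block,
   and box-filtering that block on the way out lands much closer to the
   original value than plain truncation does. */
#include <stdio.h>

int main(void)
{
    const float shade    = 100.6f;             /* desired value on a 0..255 scale */
    const float scale    = 31.0f / 255.0f;     /* down to 5 bits per channel      */
    const float bayer[4] = { 0.125f, 0.625f, 0.875f, 0.375f }; /* 2x2 thresholds  */

    float filtered = 0.0f;
    for (int i = 0; i < 4; i++) {
        int q = (int)(shade * scale + bayer[i]);  /* dithered 5-bit pixel      */
        filtered += (q / scale) / 4.0f;           /* box-filter the 2x2 block  */
    }

    printf("plain truncation: %.1f\n", (int)(shade * scale) / scale); /* ~98.7  */
    printf("dither + filter:  %.1f\n", filtered);                     /* ~100.8 */
    return 0;
}
```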

16bit WILL be an output option, it will look even worse, but it will be an output option.
Not always. It'll eventually disappear just like 8 bit 3D output has largely disappeared. With today's cards you'd be hard-pressed to get any 3D game to run under 8 bit colour, even if you try to force it manually.

And when 128 bit rendering becomes standard enough there will be little point to having 16 bit rendering. All of the rendering pathways will eventually be optimised for full 128 bit rendering, just like they have been optimised for 32 bit output in today's cards. In fact, it's highly likely that 32 bit colour will eventually disappear too.

However for most games you don't notice it if you're playing rather than observing.
Uh, yes you do. With 32 bit colour you can see artifacts even when you have something simple like a spotlight shining through fog.

Now imagine something like a flashing campfire inside fog with multiple spot lights intersecting each other while shining through both the fire and through the fog. 32 bit colour simply doesn't have the precision to handle something like that.
 

AbRASiON

Senior member
Oct 10, 1999
861
4
81
I'd much rather have 70fps in 16bit than 35 in 32bit - but it's a long huge argument
That argument died in the 1990s when the first 32 bit cards arrived. Nowadays there is simply no point to using 16 bit colour because of all of the memory bandwidth saving techniques that are in place, both in hardware and in the game engines. In fact some cards lose their bandwidth saving techniques in 16 bit mode and can actually end up having worse performance.

Some cards don't perform 2x faster in 16bit anymore, only 30% faster etc.; however, people playing in 16bit normally have slightly older cards.
And the argument certainly didn't "die" in the 1990s when the first 32bit cards arrived - more like in the LATE 1990s when the 3rd gen of 32bit cards arrived.
Just like with the DX9 cards ATI put out 4 months ago: we won't be playing common DX9 (featured) games for at least 18 months.



However 3dfx processed 32bit internally and downsampled to 22bit before sending the data out on the V3 which resulted in more speed and still decent quality.
No they didn't and in fact doing so would make no sense at all. They'd have the performance hit from 32 bit rendering but the colour output of only 22 bits. 3dfx rendered everything in 16 bit mode and then used a postfilter (very similar to FSAA) to upsample the image as it left the framebuffer and went to the monitor. This process had no performance hit at all because it could be done with just a few logic circuits and it didn't require any core or memory resources.

What possible benefit can a "postfilter" provide to INCREASE the image quality of an already heavily dithered image?
That's simply not the case at all - you can't MAKE something out of nothing. That's like saying you can make a 384kbps MP3 from a 128kbps one AND claim you're getting better quality...



16bit WILL be an output option, it will look even worse, but it will be an output option.
Not always. It'll eventually disappear just like 8 bit 3D output has largely disappeared. With today's cards you'd be hard-pressed to get any 3D game to run under 8 bit colour, even if you try to force it manually.

Wake up.
No 3D card has EVER offered 8bit-only colour; it's ALWAYS been 16bit. The Voodoo 1 was 16bit, the Verite 1 was 16bit.
I can assure you that for the next 3+ generations of cards 16bit WILL be an option; the only way of disabling it will be in software - it can still be "done" by the card (re-read his original post - he said it will be outlawed or prohibited or something and completely not possible).




And when 128 bit rendering becomes standard enough there will be little point to having 16 bit rendering. All of the rendering pathways will eventually be optimised for full 128 bit rendering, just like they have been optimised for 32 bit output in today's cards.

Totally true, but it will still be an option - and if it gives even a 15% increase in speed, some people (like me) will use it if they can't afford a top-end card.



However for most games you don't notice it if you're playing rather than observing.
Uh, yes you do. With 32 bit colour you can see artifacts even when you have something simple like a spotlight shining through fog.

I said playing, not observing. *I* don't notice it frequently, and this particular topic is OPINION, so you can't really argue it - there's no fact to be found here.
Some people can't hear the difference between $50,000 home speaker setups and $20,000 home speaker setups.
I *PERSONALLY* rarely notice 16bit - and even if I did on occasion, if I was getting more frames than in 32bit, then I'd use it.


Now imagine something like a flashing campfire inside fog with multiple spot lights intersecting each other while shining through both the fire and through the fog. 32 bit colour simply doesn't have the precision to handle something like that.

Sure don't, ain't gonna stop people using it though.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: KnightBreed

In some circumstances, having a 16-bit framebuffer has its advantages. Smaller memory footprint and faster buffer reads/writes.

If you're going from 32 to 16 at the end, there isn't a speed benefit. Otherwise you're basically rendering in 16 bit mode in the first place, which is the reason for the speed boost.
 
KnightBreed

Jun 18, 2000
11,197
769
126
Originally posted by: ViRGE
Originally posted by: KnightBreed
In some circumstances, having a 16-bit framebuffer has its advantages. Smaller memory footprint and faster buffer reads/writes.
If you're going from 32 to 16 at the end, there isn't a speed benefit.
If you have a 32-bit framebuffer and are calculating at 32-bit internally, of course you aren't going to get a speed benefit from dithering to 16-bit. When did I say otherwise?
Otherwise you're basically rendering in 16 bit mode in the first place, which is the reason for the speed boost.
I've been under the impression that GPUs execute code natively in 32-bit. Executing shader or color ops in 16-bit will not give you a speed boost. The speed boost comes from the smaller memory footprint and faster buffer reads from the higher available bandwidth.

You have 2 textures, A and B. Both are 16-bit.

1) Read in Texel(x,y) from Texture A.
2) Read in Texel(x,y) from Texture B.
3) Combine two texels or execute shader on color values natively in 32-bit.
4) Dither down to 16-bit for framebuffer write.

This will run faster than 32-bit from start->finish and will look better than 16-bit from start->finish.
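A hedged CPU-side sketch of those four steps (the helper functions are hypothetical, not a real graphics API, and the "dither" in step 4 is simplified to plain rounding):

```c
/* CPU-side sketch of the four steps above: 16-bit RGB565 texels are
   expanded, combined at full integer precision, and only the final result
   is quantised back to 16 bits for the framebuffer write. */
#include <stdint.h>
#include <stdio.h>

/* unpack a 5:6:5 texel into 0..255 channel values */
static void unpack565(uint16_t t, int *r, int *g, int *b)
{
    *r = ((t >> 11) & 0x1F) * 255 / 31;
    *g = ((t >>  5) & 0x3F) * 255 / 63;
    *b = ( t        & 0x1F) * 255 / 31;
}

/* pack 0..255 channel values back into 5:6:5 with simple rounding */
static uint16_t pack565(int r, int g, int b)
{
    return (uint16_t)(((r * 31 + 127) / 255 << 11) |
                      ((g * 63 + 127) / 255 <<  5) |
                      ((b * 31 + 127) / 255));
}

int main(void)
{
    uint16_t texA = 0xAD55, texB = 0x5AAB;      /* made-up 16-bit texels      */
    int ra, ga, ba, rb, gb, bb;

    unpack565(texA, &ra, &ga, &ba);             /* 1) read texel from A       */
    unpack565(texB, &rb, &gb, &bb);             /* 2) read texel from B       */
    int r = (ra + rb) / 2, g = (ga + gb) / 2,   /* 3) combine at full         */
        b = (ba + bb) / 2;                      /*    integer precision       */
    uint16_t out = pack565(r, g, b);            /* 4) quantise once for the   */
                                                /*    16-bit framebuffer      */
    printf("result texel: 0x%04X\n", (unsigned)out);
    return 0;
}
```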

Maybe we're arguing about 2 different things, but there's something that needs to be cleared up. The calculations done on a pixel can be in an entirely different precision than how it will be stored in memory. These internal calculations can affect what the final output looks like, regardless of the specified framebuffer precision.

(sometimes I'm not very good at explaining myself, so bear with me)
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
My understanding is that the framebuffer is stored at the same depth as all calculations are done (so if you have a 16bit buffer, it's all 16bit), hence our confusion. Textures can be independent, but the buffer is locked to the rendering depth.