Will we ever have anything higher than 32-bit colour?

Muerto

Golden Member
Dec 26, 1999
1,937
0
0
As far as I know the human eye can't see beyond 32-bit colour. A friend of mine is trying to tell me otherwise. I'm 99% sure I'm right but I just want to be sure. If we could you'd think that we'd have video cards by now that could do 48 or 64-bit colour.
 

Killbat

Diamond Member
Jan 9, 2000
6,641
1
0
There's no point in displaying better than 32bit color, but storing images with greater depth has plenty of usefulness in certain fields, any time stupid detail is important.
 

Finality

Platinum Member
Oct 9, 1999
2,665
0
0
Carmack already stated he wants higher than 32 bit color. I believe he wants 64? Well something like that.

Although I'm not sure if anyone wants to disappoint him. In any case we are a few years away from it.

I want the PC industry to fix the current PC color scheme so it's more like a Mac's :)
 

MGMorden

Diamond Member
Jul 4, 2000
3,348
0
76
I'm pretty sure we can see higher than 32-bit. Biological "devices" work a lot differently than electronic devices. Everything can't always be summed up so nicely. From my understanding the human eye is not limited by a number of colors, but by a given area of the spectrum. Anything in between the bounds of the visible spectrum should be seen. So we should be able to see as many colors as are physically possible (which I'm guessing is a lot more than 32 bits).
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
You don't have to use the extra bits for the colour itself - you can use them for other things. For example, you could use the extra bits for alpha blending and use 32 bits instead of the 8 used now. This would give a much smoother blending result. As for the rest of the bits, you could use them for anything you might fancy (e.g. reflectiveness).

And yes - Carmack wants 64 bit colour so internally the colour is more accurate.
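
Here's a rough C sketch of the idea, just as an illustration - the actual channel order and layout vary by card and API, so don't take this as how any particular card stores it:

#include <stdint.h>
#include <stdio.h>

/* Pack 8-bit red, green, blue and alpha channels into one 32-bit pixel.
   RGBA byte order is assumed here purely for illustration. */
static uint32_t pack_rgba(uint8_t r, uint8_t g, uint8_t b, uint8_t a)
{
    return ((uint32_t)r << 24) | ((uint32_t)g << 16) | ((uint32_t)b << 8) | (uint32_t)a;
}

int main(void)
{
    uint32_t pixel = pack_rgba(255, 128, 0, 200);        /* orange, mostly opaque */
    printf("packed pixel: 0x%08X\n", (unsigned)pixel);   /* prints 0xFF8000C8 */
    printf("alpha: %u\n", (unsigned)(pixel & 0xFFu));    /* the 'extra' 8 bits */
    return 0;
}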
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0


<< As far as I know the human eye can't see beyond 32-bit colour. A friend of mine is trying to tell me otherwise. I'm 99% sure I'm right but I just want to be sure. If we could you'd think that we'd have video cards by now that could do 48 or 64-bit colour. >>



I think you will find the eye can't see the difference between 32-bit shades; by this I mean that the equivalent of a 32-bit palette for a colour such as green is not detectable past about 200,000 variations.

 

bgatot

Senior member
Mar 10, 2000
214
0
0
So, before we need 64 bit color, we first need an upgrade path for our Mk. 1 eyeball, eh? :D
 

cvlegion

Senior member
Jan 5, 2001
223
0
0
Maybe I am blowing smoke, but I remember hearing somewhere that in Windows 2000, with the GeForce 2, there is a registry option to change to 64-bit. I cannot attest to whether it works or if it is true, but I distinctly remember it.
 

Mark R

Diamond Member
Oct 9, 1999
8,513
16
81
No graphics card displays higher than 24 bit colour.

32-bit colour means that 32 bits of graphics card memory is used to store 24 bits of colour data - therefore 'wasting' 8 bits per pixel, although in reality this space isn't wasted; it's used for caching or some other technique. On older cards this was a method of increasing performance, but on modern cards it basically makes no difference.
 

Sunner

Elite Member
Oct 9, 1999
11,641
0
76


<< I think you will find the eye cant see the difference between 32bit shade, by this I mean that the equivalent of a 32bit palette for a colour such as green is not detectable past about 200,000 variations. >>


If it's indeed around 200,000 variations, we have some way to go.
With 32-bit color you have 8888 RGBA, hence you don't get more than 256 shades of every color.
 

MGMorden

Diamond Member
Jul 4, 2000
3,348
0
76
If they do actually break it up into 8 bits per color then that's incredibly stupid. That does come out to 256 each for RGBA, coming to a total of 1024 possible colors, but if they used all 32 bits together and simply "divided" the regions then we'd have 4,294,967,296 possible colors.
 

Mookow

Lifer
Apr 24, 2001
10,162
0
0
I believe both 24 and 32-bit color display more colors than the human eye can differentiate. In other words, you can't tell the difference between shade x and one shade darker, shade y.
 

Goi

Diamond Member
Oct 10, 1999
6,772
7
91
They do actually break it up into 8 bits per color, for red, green and blue, but the total isn't a sum of the 3 but rather a product of the 3, since they are mixed. So in reality we have 2^24 or 16.8 million colors, not 1024.

Also, professional cards in the UNIX world (Sun/SGI systems) have already had 48/64-bit color for years now, but none in the PC market that I know of has implemented this yet. There just hasn't been a need to. Carmack's push for 64-bit color is not for absolute color representation, but rather to reduce the artifacts and inaccuracies that result from the blending operations that occur during rendering. Realistically, all that needs to be 64-bit is the internal rendering. If the final result is rendered at 32-bit, nobody would be able to tell the difference (sort of like a doubled-up version of what 3dfx did with their Voodoo3 - internal 32-bit rendering with final 16-bit output).
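
A quick worked example of the "product, not sum" point in C (the 5-6-5 split shown for 16-bit is the common layout, but not the only one):

#include <stdio.h>

int main(void)
{
    /* 32-bit colour: 8 bits each for red, green and blue (plus 8 bits of alpha) */
    unsigned long shades_per_channel = 1UL << 8;   /* 256 */
    unsigned long total_24bit = 1UL << 24;         /* 256 * 256 * 256 */

    /* 16-bit colour is typically 5-6-5: 32 * 64 * 32 combinations */
    unsigned long total_16bit = (1UL << 5) * (1UL << 6) * (1UL << 5);

    printf("shades per 8-bit channel: %lu\n", shades_per_channel); /* 256 */
    printf("total 24-bit colours:     %lu\n", total_24bit);        /* 16777216 */
    printf("total 16-bit colours:     %lu\n", total_16bit);        /* 65536 */
    return 0;
}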
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,005
126
<< 32-bit colour means that 32 bits of graphics card memory is used to store 24 bits of colour data - therefore 'wasting' 8 bits per pixel, >>

Um, no they aren't. The extra 8 bits are used for alpha blending. It would be pretty retarded to pass around 32 bits worth of info when you're only using 24 of them.
 

DRGrim

Senior member
Aug 20, 2000
459
0
0


<< 8bit-RGB 1024? Try 0xFF ^ 3 = 0xFD02FF colors ~16M >>


WTF? Is that hexadecimal? Can you actually count that way?
 

Noriaki

Lifer
Jun 3, 2000
13,640
1
71
Yes it's Hex and it's really not that hard.

For a simple thing like FF ^ 3...that's just FF*FF*FF

it's really not that hard...just start the way you did with decimal numbers back in elementary school. And with enough practice it will come as naturally as decimal does ;)
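
If you want to check the quoted figure yourself, a couple of lines of C will do the same sum:

#include <stdio.h>

int main(void)
{
    /* FF ^ 3 is just FF * FF * FF, i.e. 255 * 255 * 255 */
    unsigned long n = 0xFFUL * 0xFFUL * 0xFFUL;
    printf("hex: 0x%lX\n", n); /* 0xFD02FF */
    printf("dec: %lu\n", n);   /* 16581375, roughly 16M */
    return 0;
}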



<< Um, no they aren't. The extra 8 bits are used for alpha blending. It would be pretty retarded to pass around 32 bits worth of info when you're only using 24 of them. >>

Actually, there are good reasons for doing that. 24 bits is 3 bytes; if you have 2-byte words then the reads aren't word aligned, which can really slow things down. Depending on how the memory is organized you only hit the odd bytes every other cycle.

So if you take 2x2 words and send those off, you are at the start of another word, which you can read right away.

If you take 3 bytes, you end up at an odd byte, and you may not be able to read it on this pass.

Of course this is all dependent on memory type and organization, and I'm assuming a 2-byte word.

I also don't even know if any of that is true, but it's theoretically possible that using 32bits would be faster than 24.

It wastes space, yes, but it could be faster.

You are correct though, the last 8 bits are used for alpha blending.
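
Here's a rough sketch of the space-versus-alignment trade-off in C - struct padding and the "ideal" access pattern depend entirely on the compiler and the memory system, so treat the numbers as an illustration only:

#include <stdio.h>
#include <stdint.h>

/* A packed 24-bit pixel: 3 bytes, so consecutive pixels start at odd offsets. */
struct pixel24 { uint8_t r, g, b; };

/* A 32-bit pixel: 4 bytes, so every pixel starts on a word-aligned boundary. */
struct pixel32 { uint8_t r, g, b, a; };

int main(void)
{
    printf("sizeof(pixel24) = %zu\n", sizeof(struct pixel24)); /* usually 3 */
    printf("sizeof(pixel32) = %zu\n", sizeof(struct pixel32)); /* 4 */

    /* Where pixel n starts in a row of the framebuffer: 24bpp lands on
       unaligned offsets, 32bpp is always a multiple of 4. */
    for (int n = 0; n < 4; n++)
        printf("pixel %d: 24bpp offset %d, 32bpp offset %d\n", n, n * 3, n * 4);
    return 0;
}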
 

Goi

Diamond Member
Oct 10, 1999
6,772
7
91
DisposableHero, quality is always a subjective thing, hence while the visual quality increase in going from 16bpp to 32bpp may not be that apparent to you, it is to other people. Anyway, as I've mentioned, the absolute number of colors in the final output is not the point of 48/64-bit color. It is the blending operations that occur in the intermediate rendering that cause inaccuracies.

To give an analogy, imagine that you're living in a world with only integers. If you wanna take an average of different numbers, you'd end up with decimal points that get truncated. If you use enough of these truncated results to take more averages, you'd end up with a grossly inaccurate, or even downright wrong, final result. With enough decimal places, you'd be able to reduce the error to the point where it is no longer significant. The same applies to rendering. Today's games may not be doing blending in too many steps, but tomorrow's games will. With enough bits, the error can be reduced to a point where it is not apparent anymore.
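
To put some numbers on that analogy, here's a tiny C example - the 10% blend toward white is just a made-up stand-in for the kind of repeated blending a renderer does, not any real pipeline:

#include <stdio.h>

int main(void)
{
    /* Blend a pixel value 10% of the way toward white (255), over and over:
       once with 8-bit integer truncation, once in floating point
       (standing in for a wider internal format). */
    int   i_val = 0;
    float f_val = 0.0f;

    for (int step = 0; step < 200; step++) {
        i_val = (i_val * 9 + 255) / 10;          /* fractional part truncated */
        f_val = (f_val * 9.0f + 255.0f) / 10.0f;
    }

    printf("8-bit integer result:  %d\n", i_val);    /* gets stuck at 246 */
    printf("high-precision result: %.2f\n", f_val);  /* ~255.00 */
    return 0;
}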