How does Windows support 30-bit color?

Mark R

Diamond Member
Oct 9, 1999
Some graphics cards, like the Matrox Parhelia and the ATi Radeon X1900, support 30-bit color depth (1.07 billion colors).

What sort of support does Windows have for this color depth?

And from a software perspective: will high-end programs (Photoshop, etc.) automatically take advantage of this?

What about software development? Microsoft's SDKs give very little useful information about high-dynamic-range images, apart from DirectX.
 

Mark R

Diamond Member
Oct 9, 1999
Originally posted by: Acanthus
Isn't 32-bit the de facto standard? Why would you want LESS precision in apps?

Because, while 32 bits of memory per pixel are frequently used, in the vast majority of cases only 24 bits actually contain color data; the other 8 bits are either wasted or used for another purpose.

DirectX does support 32-bit HDR textures (30-bit color + 2-bit alpha, e.g. D3DFMT_A2R10G10B10 in Direct3D 9), but how do you get them onto the screen in full detail? According to ATi's specification sheet you can, but that doesn't necessarily mean that you can do it with Windows.
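
For anyone who wants to check their own setup, here's a minimal Direct3D 9 sketch (assumes the DirectX SDK headers and linking against d3d9.lib; error handling trimmed) that asks the runtime whether it will accept a 10-bit back buffer. Full-screen only, since a windowed swap chain has to match the 8-bit desktop, and a success here only means the mode can be created, not that the cable and monitor actually carry 10 bits:

    #include <d3d9.h>
    #include <cstdio>

    // Link with d3d9.lib.
    int main()
    {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        // Ask whether the HAL device will accept a 10:10:10:2 display mode
        // and back buffer. Full-screen only (bWindowed = FALSE): windowed
        // swap chains must match the desktop format.
        HRESULT hr = d3d->CheckDeviceType(D3DADAPTER_DEFAULT,
                                          D3DDEVTYPE_HAL,
                                          D3DFMT_A2R10G10B10,  // display format
                                          D3DFMT_A2R10G10B10,  // back-buffer format
                                          FALSE);

        std::printf("10-bit back buffer %s\n",
                    SUCCEEDED(hr) ? "supported" : "NOT supported");

        d3d->Release();
        return 0;
    }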
 

Goi

Diamond Member
Oct 10, 1999
Good question. I'd like to know as well. What cards/GPUs support 10 bits per primary color though?
 

Lonyo

Lifer
Aug 10, 2002
Matrox cards have 10 bits per colour channel (RGB).
In 32-bit colour you have 8-bit red, 8-bit green, 8-bit blue and an 8-bit alpha channel AFAIK. (Alpha controls transparency, I believe.)
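
To make the two layouts concrete, here's a minimal sketch (the field order follows the D3D-style A8R8G8B8 and A2R10G10B10 packings; actual hardware formats vary) showing how the same 32 bits get carved up in each case:

    #include <cstdint>
    #include <cstdio>

    // Classic 32-bit colour: 8 bits each of alpha, red, green, blue.
    std::uint32_t pack_a8r8g8b8(std::uint32_t a, std::uint32_t r,
                                std::uint32_t g, std::uint32_t b)
    {
        return (a << 24) | (r << 16) | (g << 8) | b;   // each field 0..255
    }

    // 30-bit colour in the same 32 bits: 2-bit alpha, 10 bits per channel.
    std::uint32_t pack_a2r10g10b10(std::uint32_t a, std::uint32_t r,
                                   std::uint32_t g, std::uint32_t b)
    {
        return (a << 30) | (r << 20) | (g << 10) | b;  // colour 0..1023, alpha 0..3
    }

    int main()
    {
        // Mid-grey, fully opaque, in both layouts: 128/255 vs 512/1023.
        std::printf("A8R8G8B8:    %08X\n", pack_a8r8g8b8(255, 128, 128, 128));
        std::printf("A2R10G10B10: %08X\n", pack_a2r10g10b10(3, 512, 512, 512));
        return 0;
    }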
 

SonicIce

Diamond Member
Apr 12, 2004
What's the use of more than 8 bits per channel? More precision for calculations? Or can you really send the extra colors over VGA to a CRT? I don't think LCDs can display more than 8 bits per channel, or maybe this is a limitation of DVI.
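
On the precision point: even if the display itself is 8-bit, doing the intermediate maths at higher precision avoids banding. A minimal sketch (the gamma tweak is hypothetical, purely for illustration) that counts how many distinct output levels survive a brightness adjustment at each precision:

    #include <cmath>
    #include <cstdio>
    #include <set>

    int main()
    {
        std::set<int> out8, out10;
        for (int v = 0; v < 256; ++v) {
            double y = std::pow(v / 255.0, 1.0 / 2.2);     // a brightness curve
            out8.insert(int(std::lround(y * 255.0)));      // quantise at 8 bits
            out10.insert(int(std::lround(y * 1023.0)));    // quantise at 10 bits
        }
        // At 8 bits many inputs collapse onto the same level -> visible banding.
        std::printf("distinct 8-bit outputs:  %zu of 256\n", out8.size());
        std::printf("distinct 10-bit outputs: %zu of 256\n", out10.size());
        return 0;
    }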
 

Golgatha

Lifer
Jul 18, 2003
Originally posted by: Lonyo
Matrox cards have 10 bits per colour channel (RGB).
In 32-bit colour you have 8-bit red, 8-bit green, 8-bit blue and an 8-bit alpha channel AFAIK. (Alpha controls transparency, I believe.)


Yes, 24 bits per pixel for color and an 8-bit alpha channel.

alpha channel

An 8-bit layer in a graphics file format that is used for expressing translucency (transparency). The extra eight bits per pixel serve as a mask and represent 256 levels of translucency. For example, TIFF and PNG are graphics formats that support an 8-bit alpha channel. The GIF89a format supports a 1-bit alpha channel that allows one color on the palette to be transparent.
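
As a concrete illustration of what that mask does, here's a minimal sketch of the usual "over" blend an 8-bit alpha channel drives (the convention shown is generic, not tied to any particular file format):

    #include <cstdint>
    #include <cstdio>

    // Blend one 0..255 colour channel of src over dst using an 8-bit alpha.
    std::uint8_t blend_over(std::uint8_t src, std::uint8_t dst, std::uint8_t alpha)
    {
        return std::uint8_t((src * alpha + dst * (255 - alpha) + 127) / 255);
    }

    int main()
    {
        // A roughly half-opaque white pixel over black lands near mid-grey.
        std::printf("%u\n", blend_over(255, 0, 128));  // prints 128
        return 0;
    }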
 

Gstanfor

Banned
Oct 19, 1999
I think the RAMDAC (or its functional equivalent) in the Radeon 8500 supports 10-bit-per-channel output (not sure, but R300 and above should also support it). That may possibly explain why some people like the color output from Radeons.

Perhaps someone else remembers more about it than I do.