
LCD's 24bit vs 32bit colour

PookyBoy

Senior member
From what I've read, LCDs really only support up to 24-bit colour. So I assume there's no point in setting it to 32-bit in display properties?
 
Most LCDs actually have 16-bit and lower. There will be no difference between 16-bit and 32-bit on an LCD.
 
Originally posted by: dc5
Most LCDs actually have 16-bit and lower. There will be no difference between 16-bit and 32-bit on an LCD.

Lies and blasphemy.

The lowest supported colour depth I've seen on an LCD is 18-bit, and that's on the really crappy/entry-level models. Almost everything is 24-bit nowadays.

Secondly, if you can't see the difference between 16bpp and 32bpp (really 24+8), you need your eyes checked.

- M4H
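To make the 16bpp vs 32bpp difference concrete, here's a minimal sketch (function names are made up for illustration) of quantizing 8-bit-per-channel colour down to the common R5G6B5 16-bit layout. A smooth 256-step red ramp collapses to 32 distinct levels, which is exactly the banding you see at 16-bit:

```python
# Illustrative sketch: why gradients band at 16bpp (R5G6B5).
# Nothing here is from a real API; names are invented for the example.

def pack_565(r, g, b):
    """Pack 8-bit-per-channel RGB into a 16-bit R5G6B5 word."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def unpack_565(p):
    """Expand R5G6B5 back to 8 bits per channel (simple bit replication)."""
    r = (p >> 11) & 0x1F
    g = (p >> 5) & 0x3F
    b = p & 0x1F
    return (r << 3 | r >> 2, g << 2 | g >> 4, b << 3 | b >> 2)

# A smooth 256-step red ramp survives as only 32 distinct levels:
ramp = [unpack_565(pack_565(r, 0, 0))[0] for r in range(256)]
print(len(set(ramp)))  # 32 distinct red values -> visible banding
```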
 
The important difference between 24-bit and 32-bit, regardless of the nature of the display, is in driver performance (to varying degrees, depending on the video card and the drivers) rather than appearance. Software can move 32-bit blocks of data around more efficiently than 24-bit blocks.
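A rough sketch of why that is (purely illustrative arithmetic, not any driver's actual code): at 4 bytes per pixel every pixel starts on a 32-bit word boundary, while at 3 bytes per pixel most pixels straddle word boundaries, so the hardware/software can't just copy whole aligned words.

```python
# Why 32bpp framebuffers are easier to move around than 24bpp ones:
# count how many pixel start offsets land on a 4-byte word boundary.

def pixel_offset(index, bytes_per_pixel):
    """Byte offset of pixel `index` in a packed scanline."""
    return index * bytes_per_pixel

aligned_24 = sum(pixel_offset(i, 3) % 4 == 0 for i in range(1000))
aligned_32 = sum(pixel_offset(i, 4) % 4 == 0 for i in range(1000))

# At 24bpp only every 4th pixel is word-aligned; at 32bpp all of them are.
print(aligned_24, aligned_32)  # 250 1000
```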
 
Yes, there is a difference between 16-bit and 32-bit, but it's mostly gonna be noticeable in 3D games. Neither 16-bit nor 24-bit colour can display translucent things correctly. This type of artifact looks like ass. So it's always best to use 32-bit.

LCDs, whether 18-bit or 24-bit, will use a diffusion technique (I think it's called that) to mimic 32-bit, but the colour won't always be as true as with real 32-bit.
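The technique being described is ordered dithering (the next reply names it). A minimal sketch, assuming a classic 2x2 Bayer threshold matrix (panel controllers use larger matrices and temporal variants, so this is a simplification): an 18-bit panel has only 6 bits per channel, so a position-dependent threshold is added before truncating, and neighbouring pixels average out to the intended 8-bit level.

```python
# Minimal sketch of ordered dithering, as used to fake 8-bit-per-channel
# colour on a 6-bit-per-channel (18-bit) LCD panel. Illustrative only.

BAYER_2X2 = [[0, 2],
             [3, 1]]  # classic 2x2 Bayer threshold matrix

def dither_to_6bit(value8, x, y):
    """Quantize one 8-bit channel value to 6 bits with ordered dithering."""
    threshold = BAYER_2X2[y % 2][x % 2]  # 0..3, spans the two lost bits
    return min((value8 + threshold) >> 2, 63)

# The same 8-bit input lands on different 6-bit codes depending on pixel
# position, so a 2x2 block averages back to the original level:
block = [dither_to_6bit(130, x, y) for y in range(2) for x in range(2)]
print(block)  # [32, 33, 33, 32] -- average 32.5, i.e. 130/4 exactly
```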
 
Originally posted by: VIAN
Yes, there is a difference between 16-bit and 32-bit, but it's mostly gonna be noticeable in 3D games. Neither 16-bit nor 24-bit colour can display translucent things correctly. This type of artifact looks like ass. So it's always best to use 32-bit.

Two words - photo editing. 😉

LCDs, whether 18-bit or 24-bit, will use a diffusion technique (I think it's called that) to mimic 32-bit, but the colour won't always be as true as with real 32-bit.

I believe the word you're looking for is "dithering". 32->24 doesn't need that, as both 24- and 32-bit depths have the same colour addressing (8 bits per channel); 32bpp just has an extra 8-bit alpha channel. The exception is Matrox Gigacolour: 10 bits per channel, 2-bit alpha.

- M4H
 
Erm, monitors don't "support" 32-bit colour. Monitors support 24-bit colour (or less, in the case of very old LCDs). 32-bit, as was previously stated, is 24-bit plus an alpha (translucency) channel. But that is all handled in software, and isn't a property of your monitor, be it CRT or LCD.
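To illustrate "handled through software": the alpha channel is consumed during compositing, before the frame ever reaches the monitor. A minimal sketch of a standard "over" blend per 8-bit channel (a simplified integer version, not any particular library's API):

```python
# How the alpha channel gets used in software: compositing a translucent
# source pixel over a background. The monitor only ever sees the final
# 24-bit (r, g, b) result; alpha never leaves the graphics pipeline.

def blend_over(src, dst, alpha):
    """Composite src over dst; alpha is 0..255, channels are (r, g, b)."""
    return tuple((s * alpha + d * (255 - alpha)) // 255
                 for s, d in zip(src, dst))

# Roughly 50%-translucent white over black comes out mid-grey:
print(blend_over((255, 255, 255), (0, 0, 0), 128))  # (128, 128, 128)
```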
 