LCDs and color limitations

alcoholbob

Diamond Member
May 24, 2005
6,387
465
126
As I understand it, 32-bit color = 4 billion colors.

Old-school LCDs (the Dell 2001FP being one of the last of them) were listed as "10-bit". To my understanding, that would mean they could display a bit over a billion colors.

Since then, 8-bit monitors with fast response times of ~8ms have shown up, which display up to 16.7 million colors.

Now there are 6-bit monitors with 6ms and lower response times, which theoretically would only show 262,144 colors, although supposedly with dithering they can show up to 16.2 million colors.

I assume fewer colors makes it easier to achieve faster response times.

My question is, is there even a point to setting Windows (and games) to 32-bit color? Wouldn't performance be faster with 16-bit color (the difference might be negligible, I don't know), which is 16.7 million colors?

Thanks guys, this is just a headscratcher for me.

EDIT: I have a personal interest in this, since I recently purchased a Dell 1907FP with a Samsung 8-bit LTM190EX panel and wondered if performance gains were possible at a lowered setting with (negligible?) graphical differences.
 

ProviaFan

Lifer
Mar 17, 2001
14,993
1
0
32-bit color is actually 24-bit color with an additional 8 bits for an alpha (transparency) channel. All LCDs are 8-bit or less (excepting certain high-end Eizo and NEC models that take 8-bit-per-channel input and do color correction internally at 10 bits). Many gaming panels are 6-bit, but they can still dither to simulate 8-bit color - while not good enough for the photo editing folks, it's still good enough to warrant sending them 8-bit-per-channel color information.

As you may have seen, when we talk about 8-bit LCDs (or 8-bit RGB files, etc.), we refer to the number of bits in a channel (thus, 24 or 32 bits total).

16-bit color in the graphics card sense (not the Photoshop sense, where it means 16 bits per channel, or 48-bit color) uses two 5-bit channels for red and blue, with a 6-bit channel for green. This is even more limiting than a 6-bit LCD without dithering, and if you run 16-bit color you will see horrible dithering and banding artifacts unless you're nearly blind.
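If it helps to see what those packings look like, here's a rough Python sketch (purely illustrative; the values are just examples):

```python
# Sketch: a desktop pixel at 16 bpp (5 bits red, 6 green, 5 blue)
# versus 32 bpp (8 bits each for alpha, red, green, blue).

def pack_rgb565(r, g, b):
    # Keep only the top 5/6/5 bits of each 8-bit channel.
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def pack_argb8888(a, r, g, b):
    # All four channels keep their full 8 bits.
    return (a << 24) | (r << 16) | (g << 8) | b

def unpack_rgb565(p):
    # Expand back to 8 bits per channel; the discarded low bits
    # are gone for good, which is what causes the banding.
    r, g, b = (p >> 11) & 0x1F, (p >> 5) & 0x3F, p & 0x1F
    return (r << 3, g << 2, b << 3)

# Four consecutive 8-bit gray levels collapse onto one 16-bit value,
# then jump -- that step is the banding you see on smooth gradients.
for v in range(100, 105):
    print(v, unpack_rgb565(pack_rgb565(v, v, v)))
# 100..103 all come back as (96, 100, 96); 104 jumps to (104, 104, 104)
```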

Cliff's Notes: just set your computer to 24- or 32-bit color (they're the same thing as far as end users are concerned), because you won't gain anything noticeable or worthwhile in performance by stepping down, and your display will just look like crap.
 
Mar 19, 2003
18,289
2
71
16-bit color is not 16.7 million colors; it's 2^16, or 65,536. The values usually used to classify the dynamic range of monitors ("6-bit", "8-bit") are per channel, so a "6-bit" panel corresponds to 18 bits total (2^18 representable colors) and an "8-bit" panel to 24 (2^24, or 16.7 million). To be honest, I don't know exactly where 32-bit color came from (over 24-bit), since no monitor (at least no consumer monitor) can actually display 4 billion colors...though I know the extra 8 bits are sometimes used as an alpha/transparency channel in games/textures.
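The arithmetic, spelled out in a trivial Python sketch:

```python
# Total representable colors at the bit depths discussed above.
depths = [
    ("16-bit desktop (5+6+5)",              16),
    ('"6-bit" panel (3 channels x 6)',      18),
    ('"8-bit" panel (3 channels x 8)',      24),
    ("32-bit desktop (24 color + 8 alpha)", 24),
]
for label, bits in depths:
    print(f"{label}: 2^{bits} = {2 ** bits:,} colors")
# 16-bit desktop (5+6+5): 2^16 = 65,536 colors
# "6-bit" panel (3 channels x 6): 2^18 = 262,144 colors
# "8-bit" panel (3 channels x 8): 2^24 = 16,777,216 colors
# 32-bit desktop (24 color + 8 alpha): 2^24 = 16,777,216 colors
```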
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: Astrallite
I have a personal interest in this, since I recently purchased a Dell 1907FP with a Samsung 8-bit LTM190EX panel and wondered if performance gains were possible at a lowered setting with (negligible?) graphical differences.

I was under the impression it used one of AUO's panels. Where did you get the info about it using the Samsung?

The LTM190EX still does dithering to achieve its 16.7 million colors. According to what I've read, its 6-bit (262K-color) driver expands the input colors to 9-bit to increase dynamic range, then outputs 8-bit in the end. With conventional 6-bit+FRC methods, the last few dark tones are unachievable, but the 9-bit method makes the effect of the missing tones less objectionable. You could think of it a bit like high dynamic range rendering, except on a much smaller scale.
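To illustrate the basic 6-bit+FRC idea (a generic sketch, not Samsung's actual 9-bit pipeline, whose details I haven't seen published):

```python
# Sketch of frame rate control (FRC): approximate an 8-bit level on a
# 6-bit panel by alternating between the two nearest 6-bit levels
# across a cycle of frames; the eye averages the flicker into an
# in-between shade.

def frc_cycle(level8, cycle=4):
    lo = level8 >> 2              # nearest 6-bit level below (0..63)
    hi = min(lo + 1, 63)          # nearest 6-bit level above
    frac = level8 & 0b11          # the 2 bits the panel can't show
    # Show `hi` on `frac` of every `cycle` frames, `lo` on the rest.
    return [hi if f < frac else lo for f in range(cycle)]

print(frc_cycle(129))  # [33, 32, 32, 32] -> averages to 32.25, i.e. 129/4
print(frc_cycle(128))  # [32, 32, 32, 32] -> an exact 6-bit level, no flicker
```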

Yes, 16-bit would be faster than 32-bit in every situation you could come up with, but both are light-speed by now in 2D mode. In 3D you may see some performance difference, but the dithering will be quite obvious and will greatly degrade quality.
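A back-of-the-envelope sketch of why 2D is a non-issue (the resolution and refresh rate are just example numbers for the 1907FP):

```python
# Rough framebuffer bandwidth at 1280x1024, 60 Hz.
w, h, hz = 1280, 1024, 60
for bpp in (16, 32):
    frame = w * h * bpp // 8                 # bytes per frame
    print(f"{bpp}-bit: {frame / 2**20:.1f} MiB/frame, "
          f"{frame * hz / 2**20:.0f} MiB/s at {hz} Hz")
# 16-bit: 2.5 MiB/frame, 150 MiB/s at 60 Hz
# 32-bit: 5.0 MiB/frame, 300 MiB/s at 60 Hz
# Either figure is a small fraction of even a 2006-era card's memory
# bandwidth; in 3D, overdraw and texturing multiply the cost, which
# is where 16-bit used to buy real speed.
```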

As for more colors meaning slower response times, I think that is always true. With a wider dynamic range, the crystals need to adjust a lot more, and it takes a lot more time to transition. This is compensated for by kickstarting the crystals with a high voltage and letting them drop, since the fall time is almost always faster. Unfortunately, this can also create ugly "twinkly" effects when applied too aggressively.
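The kickstart (overdrive) logic looks roughly like this; the gain constant here is made up for illustration:

```python
# Sketch of overdrive/response time compensation: command the crystal
# past the target level so it settles faster, at the risk of overshoot
# (the "twinkly" artifacts) when the boost is too aggressive.

def overdrive(current, target, gain=0.5):
    # Boost the commanded level in proportion to the size of the jump.
    boosted = target + gain * (target - current)
    return max(0, min(255, round(boosted)))

print(overdrive(50, 200))   # -> 255: a big rising transition gets a hard kick
print(overdrive(190, 200))  # -> 205: a small transition gets a gentle nudge
```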
 

alcoholbob

Diamond Member
May 24, 2005
6,387
465
126
On the 1907FP, if you turn off the monitor, then hold the menu and + buttons while turning it back on, the next time you enter the OSD it will display whether the panel is an AUO or Samsung (SMG) one.
 

003

Member
Mar 29, 2006
60
0
0
So if I understand correctly, 8-bit LCDs can only display 24-bit color, and not 32-bit? Can CRTs display 32-bit color? My only choices for color are 16-bit and 32-bit.
 

ProviaFan

Lifer
Mar 17, 2001
14,993
1
0
32 bit color, as indicated by your Windows graphics card drivers, is essentially 24 bit color (8 bits per channel), and can be displayed by all modern monitors. Just accept that in terms of quality and speed, 24 bits == 32 bits, and get on with life and enjoy your computer. :)
 

003

Member
Mar 29, 2006
60
0
0
So it is technically 24-bit color with the extra 8 bits used for an alpha channel? Can an 8-bit LCD properly display it like a CRT?
 

ProviaFan

Lifer
Mar 17, 2001
14,993
1
0
Originally posted by: 003
So it is technically 24-bit color with the extra 8 bits used for an alpha channel? Can an 8-bit LCD properly display it like a CRT?
Yes. The alpha channel is almost never used in the 2D environment, and when it is used, the OS does the compositing before sending the graphics to the display, so your display only ever sees 3 bytes per pixel. An 8-bit LCD will behave exactly like a CRT in this regard, with exceptions, of course, for the variations in image quality between CRT and LCD technology.
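For the curious, that compositing step is just a per-pixel blend; a minimal sketch:

```python
# Sketch: the OS composites an alpha'd pixel over the background
# before anything reaches the monitor, so the cable only ever
# carries 3 bytes (R, G, B) per pixel.

def over(src_rgb, src_alpha, dst_rgb):
    a = src_alpha / 255
    return tuple(round(s * a + d * (1 - a))
                 for s, d in zip(src_rgb, dst_rgb))

# A half-transparent red pixel over a white desktop:
print(over((255, 0, 0), 128, (255, 255, 255)))  # -> (255, 127, 127)
```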
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: Astrallite
On the 1907FP, if you turn off the monitor, then hold the menu and + buttons while turning it back on, the next time you enter the OSD it will display whether the panel is an AUO or Samsung (SMG) one.

Ah, thanks for that info.

Originally posted by: 003
So if I understand correctly, 8-bit LCDs can only display 24-bit color, and not 32-bit? Can CRTs display 32-bit color? My only choices for color are 16-bit and 32-bit.

The gamma processor in both CRTs and LCDs supports only 24-bit color. The graphics card handles the 32-bit to 24-bit conversion, multiplying the three R, G, B bytes by the last byte (alpha/255) where blending applies. It then sends the results to the monitor through VGA/DVI.

So, 8-bit/subpixel LCDs can display 24-bit color, and CRTs can display 24-bit as well. There are a couple of monitors with special gamma processors that can work with up to 14 bits/subpixel (42 bits/pixel) internally; they just interpolate to get the extra bits, since what is being sent to them is still 24 bits/pixel.
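To illustrate what those extra internal bits buy (a generic sketch; the gamma value is just an example, not any specific monitor's):

```python
# Sketch: applying a gamma curve and rounding the result to only
# 8 bits collapses many distinct input levels onto the same step;
# rounding to 14 bits internally keeps almost all of them, and the
# panel can then dither the result back down.

def distinct_levels(internal_bits, gamma=2.2):
    scale = (1 << internal_bits) - 1
    return len({round((v / 255) ** gamma * scale) for v in range(256)})

print(distinct_levels(8))    # ~184 of 256 input levels survive
print(distinct_levels(14))   # ~254 of 256 input levels survive
```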

So-called "6-bit" LCDs take the 24-bit (8 bits/subpixel) source data and drive the crystals in a way that approximates it, employing dithering or frame rate control methods to make up for the missing 2 bits/subpixel.

Windows uses the 8-bit alpha channel for the icons in its shell (blending the edges of an icon into the surface behind it) so that they look smoother.