What does 32 vs 24 bit color do for me when talking about an LCD???

Caveman

Platinum Member
Nov 18, 1999
2,537
34
91
Thinking of the Dell 2001FP...

If I understand the specs correctly, it has 24 bit color. What does this mean to me in the following context:

1.) I do a lot of work on my computer at home; spreadsheets, CAD, word processing, etc...

2.) I'm an avid flight simulation freak. Almost all of my flight simulations actually run faster in 32-bit color than in 16-bit due to coding/driver compatibility issues... So, generally, I'll run a flight sim in 32-bit mode. What will happen if I do? Do the additional 8 bits of pixel data above 24 get guessed at? Will I see any "real world" differences in graphical quality compared to the 32-bit that I currently run on a CRT? I've always been under the impression that anything above 16-bit is hard for the eye to distinguish anyway...

I'm sure I'll have no problems with #1... What about #2?
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
32-bit and 24-bit color both have the same number of colors, about 16.7 million, because 32-bit color is really 24-bit color with an 8-bit alpha channel.
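Not from the thread, but as a quick sketch of why the color count is identical: in a typical 32-bit framebuffer a pixel is stored as 0xAARRGGBB, so the visible color lives entirely in the low 24 bits (the function name here is made up for illustration):

```python
def pack_xrgb8888(r, g, b, a=0xFF):
    """Pack 8-bit channels into one 32-bit word laid out as 0xAARRGGBB."""
    return (a << 24) | (r << 16) | (g << 8) | b

pixel32 = pack_xrgb8888(0x12, 0x34, 0x56)

# Masking off the top 8 bits (the alpha/padding byte) leaves the
# actual color, which is the same 24 bits either way.
color24 = pixel32 & 0x00FFFFFF
assert color24 == 0x123456

# Both 24-bit and 32-bit modes can express exactly 2**24 colors.
print(2 ** 24)  # 16777216
```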
 

Caveman

Platinum Member
Nov 18, 1999
2,537
34
91
Thanks for the help... So, is this one of those things that makes like a .1% difference in the "perception" of the onscreen color and is not something to care about?
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Originally posted by: Caveman
Thanks for the help... So, is this one of those things that makes like a .1% difference in the "perception" of the onscreen color and is not something to care about?
There are 0% more colors, so it's not a perception issue. Although the extra channel can be used for alpha, in practice it never does anything and is primarily there to provide better data alignment.

And modern GPUs aren't designed with much consideration for 16-bit color; that's why 32-bit color is faster, not because of driver or coding issues.