I noticed a post earlier saying that a human being cannot tell the difference between 96-bit color and 128-bit color. I'm afraid I'll have to disagree with that one, and I have evidence.
I was working on getting my computer to play DVD movies on my HDTV now that I have it connected, and I noticed that my PS2 looked better than the DVD software that came with my video card. At first I didn't understand why, but then the DVD that was playing changed to a much darker scene. I realized that there were telltale divisions in the shadows where one color abruptly changed to another. I thought, "Aha! 16-bit rendering!" So I figured I'd see what free or open source vob filters I could find online and whether they looked any better. You would not believe the improvement.
But wait! There's more!
After a while, I began to realize that, aside from the usual progressive-to-interlace losses, it still didn't look quite right. Outdoor or brightly lit scenes looked beautiful, but in dimmer light or indoors something was still off. That was when I noticed the same occurrence: those same telltale divisions were there. They were just much better blended, such that rather than three distinct divisions there were fifteen difficult-to-distinguish ones. Comparison to the PS2 (or any other hardware 32-bit DVD player) revealed that the PS2 still looked better.
So here is the breakdown:
16-bit = poor blending
24-bit = good blending
32-bit = superb blending, if not as good as it gets
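If you want to see the banding effect in the breakdown above for yourself, here's a rough little Python sketch (my own back-of-the-envelope, not anything from the DVD software) showing why a dim gradient turns into visible steps at 16-bit color: a 16-bit pixel only keeps about 5 bits per channel, so nearby dark shades collapse onto the same displayable value.

    def quantize(value, bits):
        """Round an 8-bit channel value (0-255) down to 'bits' of precision."""
        levels = (1 << bits) - 1
        return round(value / 255 * levels) * 255 // levels

    # A dim gradient: red channel values 0..31, as they might appear in a shadow.
    shadow = list(range(32))

    print("16-bit (5-bit red):", sorted(set(quantize(v, 5) for v in shadow)))
    print("24-bit (8-bit red):", sorted(set(quantize(v, 8) for v in shadow)))
    # The 5-bit version leaves only a handful of distinct shades across the
    # whole shadow, which shows up on screen as visible bands.

The 16-bit run spits out only about five distinct values for the whole shadow, while the 24-bit run keeps all thirty-two, which is exactly the difference between "poor blending" and "good blending" above.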
On a side note:
This may explain the haze that NVidia users complain about when seeing the same game played on an ATI card. Because ATI cards use only 96-bit color, shader calculation (rounding) errors and the like, while infrequent at worst, occur more often than on NVidia cards, which use 128-bit color. These errors have a tendency to "disappear" at certain gamma and brightness settings, which are not the defaults on NVidia hardware.
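To give a feel for the rounding part (this is just my own toy illustration, not a measurement of either card): ATI's 96-bit mode works out to roughly a 24-bit float per channel with about a 16-bit mantissa, versus a 23-bit mantissa for NVidia's 32-bit floats, and the coarser format picks up error a bit faster when a shader does the same operation over and over.

    import math

    def round_mantissa(x, mantissa_bits):
        """Round x to a float with the given number of mantissa bits."""
        if x == 0.0:
            return 0.0
        exp = math.floor(math.log2(abs(x)))
        scale = 2.0 ** (mantissa_bits - exp)
        return round(x * scale) / scale

    def repeated_blend(color, passes, mantissa_bits):
        """Multiply a channel by 0.9 'passes' times, rounding after every step."""
        for _ in range(passes):
            color = round_mantissa(color * 0.9, mantissa_bits)
        return color

    exact = 0.5 * (0.9 ** 40)
    fp24 = repeated_blend(0.5, 40, 16)   # roughly ATI's 24-bit float
    fp32 = repeated_blend(0.5, 40, 23)   # roughly NVidia's 32-bit float
    print("exact:", exact)
    print("fp24 error:", abs(fp24 - exact))
    print("fp32 error:", abs(fp32 - exact))

The errors here are tiny either way; the point is only that the lower-precision path drifts roughly a hundred times further from the exact answer over the same number of passes.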
Disclaimer:
No statement made in this post is intended to suggest that either NVidia or ATI hardware is better than the other. The only intent was to introduce both the reasoning behind 128bit and a possible reason behind the "ATI haze" issue.
---Edit---
I have used 128-bit interchangeably with 32-bit, and 24-bit interchangeably with 96-bit, because each component of the larger format is the size of the smaller one: 32-bit red, green, blue, and alpha values put together form 128 bits. However, that is a bit misleading, since it is all displayed to the screen in only as many shades as the screen can actually render.
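In other words, the accounting looks something like this little sketch (again, my own illustration of the arithmetic, not anything from a driver):

    internal_channels = 4           # red, green, blue, alpha
    internal_bits_per_channel = 32  # the "128-bit" case
    display_bits_per_channel = 8    # what a typical screen actually accepts

    print("internal color:", internal_channels * internal_bits_per_channel, "bits")  # 128
    print("displayed color:", 3 * display_bits_per_channel, "bits")                  # 24 (alpha never reaches the screen)

    def to_display(channel_float):
        """Clamp a 0.0-1.0 float channel and squeeze it down to 8 bits."""
        clamped = min(max(channel_float, 0.0), 1.0)
        return round(clamped * 255)

    print(to_display(0.73125))  # many nearby internal values land on the same 186

So the extra precision only lives inside the card's math; by the time the color hits the screen, lots of slightly different internal values have been squeezed onto the same displayable shade.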
Disclaimer 2:
Regarding my statement above that these errors tend to "disappear" at certain gamma and brightness settings that are not the defaults on NVidia hardware:
It's a digital signals thing that, to be quite honest, I don't completely understand. I made a D- in "Digital Signals with Multivariable Differential Equations" and I don't honestly intend to have horrific flashbacks to that class.