DisposableHero, quality is always subjective, so while the visual improvement from 16bpp to 32bpp may not be apparent to you, it is to other people. Anyway, as I've mentioned, the absolute number of colors in the final output is not the point of 48/64-bit color. The point is the blending operations in the intermediate rendering stages, which introduce inaccuracies. As an analogy, imagine living in a world with only integers. If you want to average several numbers, the decimal part of each result gets truncated. Feed enough of those truncated results into further averages and the final answer becomes grossly inaccurate, or even downright wrong. With enough decimal places, you'd be able to reduce the error to the point where it is no longer significant. The same applies to rendering. Today's games may not do blending in many passes, but tomorrow's games will, and with enough bits the accumulated error can be kept below the point where it's visible.
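To make the truncation argument concrete, here's a toy Python sketch (not how any real GPU blends, just an illustration of the rounding behavior): one color channel is repeatedly darkened by a factor of 0.9 in twenty passes, and after every pass the intermediate value is snapped back to an integer code at a given bit depth, truncating like integer hardware would. The `darken_chain` function and its parameters are made up for this example.

```python
def darken_chain(code, steps, bits):
    """Repeatedly multiply an 8-bit channel value by 0.9, truncating the
    intermediate result to an integer code at `bits` of precision each pass.
    Returns the final value rescaled to the 0-255 range for comparison."""
    max_in, max_work = 255, (1 << bits) - 1
    acc = code * max_work // max_in      # widen input to the working depth
    for _ in range(steps):
        acc = acc * 9 // 10              # multiply by 0.9, truncating the remainder
    return acc * 255 / max_work          # back to the 0-255 scale

exact = 200 * 0.9 ** 20                  # no intermediate truncation: ~24.32
print(f"full precision : {exact:.2f}")
for bits in (8, 16):
    print(f"{bits:2d}-bit passes  : {darken_chain(200, 20, bits):.2f}")
```

With 8-bit intermediates the value drifts to 21, visibly off from the true ~24.32, while 16-bit intermediates land within a few hundredths: the same sequence of operations, only the per-pass precision differs.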