
Slower cards of the TNT2 & 3dfx generations, especially those with only 16MB of RAM, really benefit from 16bit colour. For modern gfx cards like the GF3, GF4TI & Radeon8500 you only want to consider 32bit colour, and probably AA/AF as well!

Of course the actual performance differences do depend upon the card and CPU used. With 'slower' CPUs and higher-end gfx cards you really want to max out the gfx settings with AA, aniso and of course 32bit colour, in order to use the GPU potential that the CPU alone can't tap.
3DMark2001 using a mid-range Athlon XP at the default resolution of 1024x768 (overall score & Car Chase High Detail FPS):
Voodoo4 32bit = 1600 & 9.5
Voodoo4 16bit = 2250 & 14.5
GF2 GTS/Pro/TI 32bit = 6000 & 41.5
GF2 GTS/Pro/TI 16bit = 6100 & 40
GF3 32bit = 8800 & 51
GF3 16bit = 7500 & 44 (yes, slower: I double checked)
GF4TI4200 32bit = 10500 & 55
GF4TI4200 16bit = 9500 & 51 (again slower!)
You would expect the most benefit to come at higher resolutions, where memory bandwidth is the limiting factor (eg 1600x1200; see the rough numbers after these results):
GF3 32bit = 5000 & 39
GF3 16bit = 6400 & 47.5
Or with AA enabled for the same reason (1024x768):
GF3 32bit = 5200 & 38
GF3 16bit = 5500 & 40
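
As a rough back-of-envelope sketch (my own numbers, not taken from the benchmark runs above, and counting the colour buffer only), you can see why halving the colour depth buys more at 1600x1200 than at 1024x768:

```python
# Colour buffer traffic per frame, ignoring the z-buffer, texturing and overdraw.
# Just to show how quickly the data per frame grows with resolution and colour depth.
for width, height in [(1024, 768), (1600, 1200)]:
    for bits in (16, 32):
        megabytes = width * height * (bits // 8) / (1024 * 1024)
        print(f"{width}x{height} @ {bits}bit = {megabytes:.1f} MB per frame")
```

Real traffic is a lot higher once the z-buffer, textures and overdraw are added, but the ratio between 16bit and 32bit stays much the same, which is why the gap only opens up when bandwidth is the bottleneck.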

Obviously you get fewer comparable results when shifting away from the default 1024x768x32, as fewer people run or submit them, but the figures should still be quite accurate.

If it's performance you're after, then lowering the resolution or detail settings a little may be more pleasant than dropping to 16bit colour, and do bear in mind that on GF3 & GF4TI cards you may actually be slowing things down by switching to 16bit! Of course beauty is in the eye of the beholder and it all really comes down to personal preference.

16bit colour, or more accurately 16bit 'shades of colour', gives 2 to the power of 16, ie 65536 colours. 32bit actually uses 24bit colour (the other 8 bits are reserved), which gives 2 to the 24, ie 16777216 shades (16.8 million). Other than the obvious colour banding it can be difficult for some people to distinguish between 16bit and 32bit at all.

With the way modern gfx cards have evolved, 32bit is now very nearly as fast as 16bit, and as seen above performance can actually be worse at 16bit. This is most probably because any relatively modern game and gfx card is designed with 32bit samples in mind: switching to 16bit frees up bandwidth (fewer colours means less data to process), but those 32bit samples then have to be converted down to 16bit. This 'scaling' of colour often leads to distortion and 'banding' (where shades should blend smoothly, but visibly different bands of colour appear instead). IIRC the extra 8 bits in 32bit greatly enhance special effects like smoke and transparency, which most modern games are designed to use.
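
To put rough numbers on the above (my own sketch, not part of the original explanation, and it assumes the common 5-6-5 bit split for 16bit RGB), here's a short Python snippet that computes the shade counts and shows how a smooth 24bit gradient collapses into far fewer distinct values at 16bit, which is exactly what appears on screen as banding:

```python
def shades(colour_bits):
    """Number of distinct colours representable with the given number of colour bits."""
    return 2 ** colour_bits

print(shades(16))   # 65536 colours at 16bit
print(shades(24))   # 16777216 colours from the 24 colour bits of a 32bit pixel

def quantise_888_to_565(r, g, b):
    """Drop the low bits of each 8bit channel, as a 16bit (5-6-5) frame buffer does."""
    return (r >> 3, g >> 2, b >> 3)

# A smooth run of eight neighbouring grey shades in 24bit colour...
ramp = [(v, v, v) for v in range(100, 108)]
# ...collapses to far fewer distinct values at 16bit, which shows up as banding.
print(sorted(set(quantise_888_to_565(*c) for c in ramp)))
```

Eight neighbouring grey shades end up as only two distinct 16bit values, so gradients that should look smooth turn into visible steps.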

IMHO, if you have a modern gfx card (GF3 or Radeon8500 or higher) then you really should treat 32bit as the default and alter detail levels or resolution to make the game smoother. With so many detail options and gfx card capabilities it does take some experimenting to find the balance between high performance and high quality, but it really comes down to what is personally acceptable and preferable to you.