Silly question: Any advantage running 32 bit vs. 16 bit color quality?

ArvinC

Member
Feb 12, 2002
Hello everyone.

Just wondering if there was any advantage at all in running a rig in 32 bit color quality vs. 16 bit. Is there a "real-world" difference in quality in games and such? Can you save system resources by running in medium resolution? Just wondering... :)

ArvinC
 

Punikin

Member
Sep 21, 2000
Just in case you didn't know, color depth and resolution are two different things. Now that that's out of the way: running in 32-bit color makes the image look more colorful and vivid. Can you really tell the difference? Yes, especially if you are used to playing a certain game in 32-bit and then drop down to 16-bit. Is 32-bit necessary? Not really. Running in 16-bit will usually increase frame rates, but if you have a decent video card, why wouldn't you want to take advantage of its full potential?
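
To make the difference concrete, here's a rough C sketch of what 16-bit (RGB565) storage does to a color that 32-bit mode keeps exactly. RGB565 is the usual 16-bit layout, but the sample color is just an illustration:

Code:
/* Why 16-bit color loses precision: RGB565 packs red/green/blue
   into 5+6+5 bits, so each channel keeps far fewer distinct levels
   than the 8 bits per channel of 32-bit (RGBA8888) color. */
#include <stdio.h>
#include <stdint.h>

/* Pack 8-bit-per-channel RGB into a 16-bit RGB565 value. */
uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b) {
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* Expand RGB565 back to 8 bits per channel. */
void unpack_rgb565(uint16_t c, uint8_t *r, uint8_t *g, uint8_t *b) {
    *r = (uint8_t)(((c >> 11) & 0x1F) << 3);
    *g = (uint8_t)(((c >> 5) & 0x3F) << 2);
    *b = (uint8_t)((c & 0x1F) << 3);
}

int main(void) {
    uint8_t r, g, b;
    /* A mid-gray that 32-bit color stores exactly... */
    unpack_rgb565(pack_rgb565(100, 100, 100), &r, &g, &b);
    /* ...comes back from the 16-bit round trip as (96, 100, 96). */
    printf("(100,100,100) -> (%d,%d,%d)\n", r, g, b);
    return 0;
}

Every on-screen color gets snapped to that coarser grid, which is why smooth gradients band in 16-bit.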
 

richleader

Golden Member
Jan 1, 2001
IIRC, the crossbar memory architecture doesn't enable on GeForce3+ cards if you're running in 16-bit color, so there's little advantage to choosing it. A GeForce2 GTS or GTS-V, on the other hand, is really bandwidth-limited, and 16-bit color can let it fly.

The most noticeable disadvantage of 16-bit color is often in particle effects such as smoke, which appear dithered (splotchy dots) rather than smooth as in 32-bit color. In older games such as Half-Life, where walls are often shades of gray, there often won't be enough colors to shade them smoothly (see the sketch below). In something like Max Payne, with very high-res textures, this will be less noticeable because the dithering will blend into the woodwork, if you will.
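
A rough way to see why those gray walls band, assuming the framebuffer quantizes grays the way 16-bit RGB565 stores its 5-bit red/blue channels (a sketch, not a claim about any particular card):

Code:
/* Quantize a smooth 256-step gray ramp to 5 bits per channel,
   as a 16-bit RGB565 framebuffer does for red and blue, and
   count the distinct shades that survive. */
#include <stdio.h>

int main(void) {
    int bands = 0;
    int prev = -1;
    for (int v = 0; v < 256; v++) {
        int q = (v >> 3) << 3;  /* keep only the top 5 bits */
        if (q != prev) {        /* a new visible band starts here */
            bands++;
            prev = q;
        }
    }
    printf("256 gray levels -> %d bands in 16-bit color\n", bands);
    return 0;
}

The 256 input levels collapse into 32 bands; the dithering mentioned above is the hardware's attempt to hide exactly those steps.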

Different game engines also handle this differently. Quake III-powered games (Elite Force, Alice, Heavy Metal: F.A.K.K.2) can look quite decent in 16-bit color, while Unreal-powered games (Deus Ex, Undying, Wheel of Time) can look quite craptastic in anything less than 32-bit color.

I used 16-bit color most of the time on my GeForce2 GTS (which could run FSAA essentially for free and still have frames to spare, in contrast to using 32-bit color at the same resolution), but on my GeForce3 I've come to love, appreciate, and require 32-bit color whenever possible.
 

BFG10K

Lifer
Aug 14, 2000
Just wondering if there was any advantage at all in running a rig in 32 bit color quality vs. 16 bit. Is there a "real-world" difference in quality in games and such?
Absolutely: 32-bit colour always looks better than 16-bit colour, even in games that were never designed for it. Always use 32-bit colour whenever possible.
 

Mingon

Diamond Member
Apr 2, 2000
IIRC, the crossbar memory architecture doesn't enable on GeForce3+

Not quite; it's the LMA II that only works in 32-bit. Snippet below from Anand's GeForce4 review:

'...the 2nd generation Lightning Memory Architecture are the improvements in the Visibility Subsystem. The original GeForce3 introduced a feature that is known as Z-occlusion culling, which is essentially a technology that allows the GPU to look at z-buffer values in order to see if a pixel will be viewable when rendered.'
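
For anyone wondering what Z-occlusion culling actually does, here's a toy C sketch of the idea: test a fragment's depth against the z-buffer and skip shading it entirely if something nearer has already been drawn. The buffer size and depth convention are made up for illustration, not how the GeForce3 hardware implements it:

Code:
#include <stdio.h>

#define W 4
#define H 4

float zbuffer[H][W];

/* Returns 1 if the fragment at (x, y) with depth z is visible and
   should be shaded, 0 if the z-test proves it is occluded.
   Smaller z means nearer to the camera in this convention. */
int z_test_and_update(int x, int y, float z) {
    if (z >= zbuffer[y][x])
        return 0;           /* occluded: culled before any shading work */
    zbuffer[y][x] = z;      /* visible: record the new nearest depth */
    return 1;
}

int main(void) {
    /* Clear the z-buffer to "infinitely far". */
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            zbuffer[y][x] = 1.0f;

    /* A near fragment passes; a farther one at the same pixel is culled. */
    printf("near fragment shaded? %d\n", z_test_and_update(1, 1, 0.3f));
    printf("far fragment shaded?  %d\n", z_test_and_update(1, 1, 0.7f));
    return 0;
}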