16-bit vs. 32-bit color.

Metalloid

Diamond Member
Jan 18, 2002
3,064
0
0
Is there a big quality difference when you use 32-bit rather than 16-bit? It seems to me that running in 16-bit color but bumping the resolution up one notch would look better. Any opinions?
 

Necrolezbeast

Senior member
Apr 11, 2002
838
0
0
I use 32-bit on everything; I never really thought of lowering it for higher resolution, but I figure if I'm gonna look at pretty colors, they'd better be pretty. I run all games at 1024x768. Right now I'm using a crappy monitor though, a Packard Bell 15in from my old P100, which maxes out at 1024x768 if I remember right, lol! I was using a 15in HP, but the color got jacked up and now everything is very dark (anyone know how to fix this, or what is wrong?). When I change the brightness in the driver props it makes everything look funny, so I went back to the PB one.
 

spanky

Lifer
Jun 19, 2001
25,716
3
81
I see differences between 16- and 32-bit color. Everything looks smoother (color-wise) under 32-bit.
 

Metalloid

Diamond Member
Jan 18, 2002
3,064
0
0
Alright guys, thanks for the input. I guess it just comes down to what you prefer. I am going to try both ways and then pick whichever one I personally think looks best.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:) Slower cards of the TNT2 and 3dfx generations, especially those with 16MB of RAM, really benefit from 16-bit colour. For modern gfx cards like the GF3, GF4 Ti and Radeon 8500 you only want to consider 32-bit colour, and probably AA/AF as well!

:( Of course, actual performance differences do depend upon the card and CPU used. With 'slower' CPUs and higher-end gfx cards you really want to max out the gfx settings with AA, aniso and of course 32-bit colour, in order to use the full GPU potential that the CPU may not tap.

3DMark2001 results using a mid-range Athlon XP at the default resolution of 1024x768 (overall score & Car Chase High Detail FPS):

Voodoo4 32-bit = 1600 & 9.5 FPS
Voodoo4 16-bit = 2250 & 14.5 FPS

GF2 GTS/Pro/Ti 32-bit = 6000 & 41.5 FPS
GF2 GTS/Pro/Ti 16-bit = 6100 & 40 FPS

GF3 32-bit = 8800 & 51 FPS
GF3 16-bit = 7500 & 44 FPS (yes, slower; I double-checked)

GF4 Ti4200 32-bit = 10500 & 55 FPS
GF4 Ti4200 16-bit = 9500 & 51 FPS (again slower!)

You would expect the most benefit to come at higher resolutions, where bandwidth is more limited (e.g. 1600x1200):

GF3 32-bit = 5000 & 39 FPS
GF3 16-bit = 6400 & 47.5 FPS

Or with AA enabled for the same reason (1024x768):

GF3 32-bit = 5200 & 38 FPS
GF3 16-bit = 5500 & 40 FPS

;) Obviously you get fewer comparable results when shifting away from the default 1024x768x32, as fewer people run or submit them, but this should still be quite accurate.

:D If it's performance you're after, then lowering the resolution or detail settings a little may be more pleasant than dropping to 16-bit colour; do bear in mind that on GF3 and GF4 Ti cards you may actually be slowing things down by switching to 16-bit! Of course, beauty is in the eye of the beholder, and it all really comes down to personal preference ;).

:) 16-bit colour, or more accurately 16-bit 'shades of colour', gives 2 to the power of 16, i.e. 65,536 shades. 32-bit actually uses 24-bit colour (the other 8 bits are reserved), which gives 2 to the power of 24, i.e. 16,777,216 shades (16.8 million).

It can be difficult for some people to distinguish between 16-bit and 32-bit colour other than by the obvious colour banding. With the way modern gfx cards have evolved, 32-bit is now very nearly as fast as 16-bit, and as seen above, performance can actually be worse at 16-bit. This is most probably because any relatively modern game and gfx card is designed with 32-bit samples in mind: by switching to 16-bit you free up bandwidth (fewer colours means less data to be processed), but the 32-bit samples then have to be converted down to 16-bit. This 'scaling' of colour often leads to distortion and 'banding' (where shades of colour should blend smoothly but visibly distinct colour bands can be seen). IIRC, the extra 8 bits in 32-bit greatly enhance special effects like smoke and transparency, which most modern games are designed to use.
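To make the numbers above concrete, here is a minimal sketch (Python, purely illustrative and not from the thread; it assumes the common 5-6-5 channel layout for 16-bit colour) of the shade counts and of the kind of quantisation that produces the banding described:

```python
# Illustrative only: shade counts for 16-bit vs. 32-bit colour, and how
# squeezing 8-bit channels down to 5/6 bits collapses smooth gradients into bands.

def shade_count(bits):
    """Number of distinct values representable with the given number of bits."""
    return 2 ** bits

print(shade_count(16))   # 65536 shades in 16-bit colour
print(shade_count(24))   # 16777216 shades in 32-bit colour (the other 8 bits are reserved)

def rgb888_to_rgb565(r, g, b):
    """Quantise 8-bit-per-channel colour to a 16-bit 5-6-5 layout (assumed here)."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def rgb565_to_rgb888(c):
    """Expand 5-6-5 back to 8-bit channels (approximate reconstruction)."""
    r = (c >> 11) & 0x1F
    g = (c >> 5) & 0x3F
    b = c & 0x1F
    # Replicate high bits into the low bits so full white maps back to 255.
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

# A smooth 8-bit grey ramp collapses into visible bands after the round trip:
ramp = [rgb565_to_rgb888(rgb888_to_rgb565(v, v, v))[0] for v in range(32)]
print(ramp)   # groups of neighbouring input greys all map to the same output value
```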

;) IMHO, if you have a modern gfx card (GF3 or Radeon 8500 or higher) then you really should consider 32-bit as the default and alter detail levels or resolution in order to make the game smoother. With so many detail options and gfx card capabilities it does take some experimenting to find the balance between high performance and high quality, but it really comes down to what is personally acceptable and preferable to you.
 

Metalloid

Diamond Member
Jan 18, 2002
3,064
0
0
Well, I have a GF4 Ti4400 (soon to be a Radeon 9700), and IQ is my main priority, so I might go with 32-bit. Thank you, AnAndAustin, that was very helpful. If I did stay with 16-bit, I would be bumping resolution, AA, and AF as high as I could. But I think I will go with 32-bit and then toy with AF and resolution to my liking. Thanks.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
;) Without a doubt, a Ti4400 should be run at 32-bit with 2xAA as a minimum. Then max out the detail sliders and see how high the resolution can go whilst still remaining playable; an average of 60FPS+ is as low as you really want to go, and 1280x960x32 with 2xAA should easily be achievable in all current games providing you have a decent CPU. Try toying around with AF as well. It takes a decent hit on nVidia cards but the quality is very good; without getting technical, you'll notice the difference on the textures, especially in the medium to long range. 1024x768x32 with 2xAA and 4xAF (eff. 8-tap) should still play nicely. Do try 2xAF and QxAA, but definitely avoid 4xAA as this kills all nVidia cards currently. If you go Rad9700 then expect 1600x1200x32 with 4xAA and 16-tap (eff. 8xAF) to be very playable. I'd suggest you hold off for a few months, as the 4400 can easily handle everything, the next few months will really show a lot more 'bang for buck', and a lot of the guesswork surrounding NV30, 'value' NV30, Rad9500 and Rad9700-DDRII should finally be over.
 

Metalloid

Diamond Member
Jan 18, 2002
3,064
0
0
I personally prefer AF to AA, especially in first-person shooters. It is not as easy to notice a partially jagged line when you are moving fast and focusing on the game as it is in a slower-paced game.

As far as the Radeon 9700 situation goes, my friend is going to buy my Ti4400, so I will probably just sell it to him when I feel that the Radeon 9700 prices are suitable, or when something else like NV30 comes out and blows my mind to the point that I just have to have it.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
There is a clear and visible difference between 16-bit colour and 32-bit colour. 32-bit colour makes all alpha blending look much smoother and more realistic, and it especially helps with multipass rendering. Also, a 24/32-bit Z-buffer makes scenes look much cleaner and more accurate, especially at long distances.

Always run all games in 32-bit colour, even those that were never designed for it. If you set your Windows desktop to 32-bit colour, the drivers will force full 32-bit accuracy in the rendering pipelines and you'll automatically reap the benefits of this.
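To illustrate the multipass point above, here is a minimal Python sketch (purely illustrative, with made-up colour and alpha values, not anything taken from an actual driver or pipeline) of how rounding every blend result to a 16-bit frame buffer accumulates error over repeated passes, while 8-bit channels stay close to the full-precision result:

```python
# Illustrative only: why multipass alpha blending degrades faster at 16-bit.
# After each pass the blended result is rounded to the frame buffer's channel
# precision, so with ~5-bit channels (16-bit colour) the rounding error piles
# up much faster than with 8-bit channels (32-bit colour).

def quantise(value, bits):
    """Round a colour value in [0.0, 1.0] to the nearest level of a 'bits'-deep channel."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

def multipass_blend(src, dst, alpha, passes, bits=None):
    """Blend 'src' over 'dst' repeatedly; quantise after each pass if 'bits' is given."""
    for _ in range(passes):
        dst = alpha * src + (1.0 - alpha) * dst
        if bits is not None:
            dst = quantise(dst, bits)
    return dst

print(multipass_blend(0.8, 0.2, 0.1, 8))           # full-precision reference
print(multipass_blend(0.8, 0.2, 0.1, 8, bits=8))   # 32-bit colour: stays close to the reference
print(multipass_blend(0.8, 0.2, 0.1, 8, bits=5))   # 16-bit colour: visibly drifts away
```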