16bit and 32bit color, what's the performance difference?

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76
I'm curious about the performance difference between 16bit color and 32bit color. How big is the difference in terms of raw fps?

I'm especially curious about the performance hit on a GF3...

Thanks!
Actaeon
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:) Slower cards of the TNT2 & 3dfx generations, esp. those with 16MB RAM, really benefit from 16bit colour. For modern gfx cards like the GF3, GF4TI & Radeon8500 you only want to consider 32bit colour, and prob AA as well!

:( Of course actual perf diffs do depend upon the card and CPU used. With 'slower' CPUs and higher end gfx cards you really want to max out gfx settings with AA, Aniso and of course 32bit colour, in order to use the full GPU potential that the CPU alone may not tap.

3Dmark2001 using a mid-range AthlonXP and default res of 1024x768:

Voodoo4 32 = 1600 & 9.5FPS (Car Chase High Detail)
Voodoo4 16 = 2250 & 14.5FPS

GF2 GTS/Pro/TI 32 = 6000 & 41.5
GF2 GTS/Pro/TI 16 = 6100 & 40

GF3 32 = 8800 & 51
GF3 16 = 7500 & 44 (Yes slower I double checked)

GF4TI4200 32 = 10500 & 55
GF4TI4200 16 = 9500 & 51 (Again slower!)

You would expect most of the benefit to come at higher resolutions where bandwidth is more limited (eg 1600x1200):

GF3 32 = 5000 & 39
GF3 16 = 6400 & 47.5

Or with AA enabled for the same reason (1024x768):

GF3 32 = 5200 & 38
GF3 16 = 5500 & 40

;) Obviously you get fewer comparable results when shifting from the default 1024x768x32, as fewer people run or submit them, but this should still be quite accurate.
 

SeanH

Member
Apr 9, 2002
114
0
0
In Comanche 4 on my Radeon 8500, it went pretty choppy at 32bit. When I changed it, it was very smooth at 16bit. (Probably not smooth enough for others) :D
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:) Obviously lowering the resolution or detail settings a little may be more pleasurable than 16bit colour; do bear in mind that on GF3 & GF4TI cards you may actually be slowing things down by switching to 16bit! But of course beauty is in the eye of the beholder ;).
 

SeanH

Member
Apr 9, 2002
114
0
0
32bit and 16bit are just different color settings, right?

I lowered it to 16bit because I want to run the highest detail on everything, and wanted 1024x768. :D I also put detail ahead of everything else when I choose display settings for a game; I love detail.

And, if those are color settings, what big difference does 32bit make anyway?
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:) 16bit colour, or more accurately 16bit shades of colour, gives 2 to the power of 16, ie 65,536 colours. 32bit actually uses 24bit colour (the other 8 bits are reserved), which gives 2 to the 24, ie 16,777,216 (16.8 million). It is difficult for most people to distinguish between 16bit and 32bit colour, but with the way modern gfx cards have evolved, 32bit is now very nearly as fast as 16bit, and as seen above performance can actually be worse at 16bit. This is most probably because relatively modern games use 32bit samples for textures and game detail, which means they can be used without translation at 32bit. Switching to 16bit frees up bandwidth (fewer colours means less data to process), but the card then has to convert 32bit down to 16bit before processing. This 'scaling' of colour often leads to distortion and 'banding' (where shades that should blend smoothly show visible colour bands instead).
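
;) If it helps, here is a tiny C sketch (my own illustration, not from any game code) of what that conversion does: packing 8-8-8 colour into the common 16bit 5-6-5 layout throws away the low bits of each channel, which is exactly what shows up as banding.

```c
#include <stdio.h>
#include <stdint.h>

/* Illustration only: pack 24bit RGB (8-8-8) into 16bit RGB565 and expand it
   back, showing the precision that 16bit colour throws away. */
static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

static uint8_t red_from_rgb565(uint16_t c)
{
    return (uint8_t)(((c >> 11) & 0x1F) << 3);  /* expand 5 bits back to 8 */
}

int main(void)
{
    /* A smooth ramp of dark greys: at 32bit every step is distinct, but at
       16bit whole runs of neighbouring shades collapse into one band. */
    for (int v = 0; v < 16; v++) {
        uint16_t packed = pack_rgb565((uint8_t)v, (uint8_t)v, (uint8_t)v);
        printf("32bit red %2d -> 16bit red %2d\n", v, red_from_rgb565(packed));
    }
    return 0;
}
```

Run it and you'll see shades 0-7 all come out as 0 and 8-15 all come out as 8, which is the same effect you notice as banding in smoke, fog and sky gradients.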

;) IMHO, if you have a modern gfx card (GF3 or Radeon8500 or higher) then you really should consider 32bit as the default and alter detail levels or resolutions in order to make the game smoother. With so many detail options and gfx card capabilities it does take some experimenting to find the balance between high performance and high quality, but it really comes down to what is personally acceptable and preferable to you.
 

SeanH

Member
Apr 9, 2002
114
0
0
Wow, I never knew there was such a difference. So, maybe the textures in Comanche 4 are 16bit? Because I can run 1024x768x16 pretty nicely, but since you explained it, I may look into 32bit a little more.

Thanks.
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0
The GeForce 3 and 4 GPUs can run slower in 16bit mode under some circumstances, as the HSR technique they use is only enabled when set to 32bit colour.
 

JeremiahTheGreat

Senior member
Oct 19, 2001
552
0
0
There are some cards that do all the rendering in 32bit colour regardless (like the KyroII), and the difference is simply not there.
 

LegionX

Senior member
Jul 10, 2000
274
0
0
OK, what about older games? I play Ultima Online and it has become an FPS game when playing PvP, so is it better to play 16 or 32? Currently I have a Kyro II, but in the next week I will get either a Radeon 8500 128MB or MSI G4 4600. Which would be optimal for me, 16 or 32, @ 640x420?
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
640? Is that the res you play in? Even with a KyroII (a good card in its time) I'd stick 32bit on, but I would imagine online gaming may be more restricted by connection speed than gfx card, depending on how much info is stored on your HD. With any game you play, the proof is in the pudding though: try out both settings and see what feels better. If you can benchmark then you can measure the perf difference; generally 50 FPS is about as low as you want to go, and 100+ is a bit wasteful and unnecessary.

For your next card I would weigh it up as so:

$60 GF2TI or Radeon7500: Equal perf but Radeon has better image quality and VIVO.
$100 GF3TI200 or Radeon8500LE: Equal perf, perhaps Radeon slightly faster and again has better image quality and VIVO.
$140 GF4TI4200: Faster than the full-blown Radeon8500 and GF3TI500, plus it has excellent AA and image quality too.
$220 GF4TI4400: Slightly faster than 4200 and shares the same excellent AA & image quality.

4200s are great value, o/c to 4400 levels, and are excellent cards for the money.
 

deadcell

Member
Dec 28, 2001
51
0
0
I run Medal of Honor at 1024x768 and I notice a big difference between 32 and 16 bit. I play 16 bit with just about everything else maxed and it runs very smoothly, as opposed to 32bit where everything is choppy. I have a 1700XP with a GF3TI200 and 256MB DDR (which I suspect is the cause of a lot of my problems; I need to upgrade to 512MB soon).
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
From a purely theoretical standpoint 32 bit colour is half the speed of 16 bit colour and in the golden olden days of videocards (TNT2, G400 etc) that was almost always the case. Nowadays with other factors such as HSR, crossbar memory controllers and equal bottlenecks among the core and memory bandwidth the situation is not so clear-cut.
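
As a rough back-of-the-envelope sketch in C (my own made-up per-pixel assumptions, not measured numbers), you can see where the theoretical 2x comes from just by counting colour buffer bytes per frame:

```c
#include <stdio.h>

/* Sketch: framebuffer traffic per frame at 1024x768, assuming one colour
   write per pixel and ignoring textures, overdraw, Z and any compression. */
int main(void)
{
    const double pixels = 1024.0 * 768.0;
    double mb16 = pixels * 2.0 / (1024.0 * 1024.0);  /* 2 bytes per pixel */
    double mb32 = pixels * 4.0 / (1024.0 * 1024.0);  /* 4 bytes per pixel */

    printf("16bit colour: %.1f MB written per frame\n", mb16);  /* 1.5 MB */
    printf("32bit colour: %.1f MB written per frame\n", mb32);  /* 3.0 MB */
    printf("ratio: %.1fx\n", mb32 / mb16);                      /* 2.0x   */
    return 0;
}
```

In practice the colour buffer is only one consumer of bandwidth, which is why the real-world gap is far smaller than 2x on modern cards.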

32 bit colour has higher image quality especially for alpha blending (fog, smoke, lighting, transparency etc) and for heavy multipass rendering (bump mapping, decals etc). In addition, most cards automatically assign a 32 bit/24 bit Z buffer when a 32 bit colour depth is selected and this improves depth calculations over a 16 bit Z, especially at long distances.
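
To put a number on the Z buffer point, here is a small C sketch (the near/far planes and test distance are made-up values, purely for illustration) comparing how coarse the depth steps get at long range with a 16 bit versus a 24 bit Z buffer:

```c
#include <stdio.h>

/* Sketch: size of one Z-buffer quantisation step in world units at a given
   distance, using the usual perspective mapping zbuf = f/(f-n) * (1 - n/z). */
static double world_step(double z, double n, double f, int bits)
{
    double steps = (double)(1u << bits);           /* distinct Z values      */
    double zbuf  = (f / (f - n)) * (1.0 - n / z);  /* normalised depth       */
    double next  = zbuf + 1.0 / steps;             /* one quantum further on */
    double z2    = n / (1.0 - next * (f - n) / f); /* invert the mapping     */
    return z2 - z;
}

int main(void)
{
    double n = 1.0, f = 1000.0, z = 900.0;  /* near plane, far plane, test distance */
    printf("16bit Z: one step at z=%.0f covers ~%.1f world units\n",
           z, world_step(z, n, f, 16));
    printf("24bit Z: one step at z=%.0f covers ~%.3f world units\n",
           z, world_step(z, n, f, 24));
    return 0;
}
```

With 16 bit Z, distant objects several units apart can land in the same depth value and flicker (Z-fighting); a 24 bit buffer keeps them cleanly separated.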

Always run games at 32 bit colour instead of 16 bit colour because they'll always look better, even games never designed to take advantage of it.

Wow, I never knew there was such a difference. So, maybe the textures in Comanche 4 are 16bit? Because I can run 1024x768x16 pretty nicely, but since you explained it, I may look into 32bit a little more.
Yes, you definitely should, as there'll be a large difference in image quality between 16 bit and 32 bit. Look at the smoke and lighting from explosions to see some obvious differences. In 32 bit mode it'll look smooth while 16 bit mode will exhibit banding.

there are some cards that do all the rendering in 32bit colour regardless (like the KyroII), and the difference is simply not there.
The Kyro/Kyro2 will still load 16 bit textures instead of 32 bit textures when the game requests 16 bit colour, causing a small performance difference in the benchmarks. The rest of the rendering will still be 32 bit though.
 

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76
So, let's say I'm running a GF3, or any card since the GF3 (R8500, GF4, etc).

It'll run faster in 32bit color than 16?

Thanks.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:( Unless the memory and bandwidth situation becomes a problem, such as with very high detail or resolutions, 16bit actually seems slower on the top cards, though I would imagine it mostly depends upon the game and the game engine.

:) When the memory bandwidth becomes the limiting factor then you should see an improvement in speed by switching to 16bit, but most people don't think the trade off is worth it due to reduced effects and image quality. You may find it better, if you want the higher resolution, to lower the detail settings a little rather than drop to 16bit, but it is down to individual taste.

:D For the top cards, GF3, GF4TI, Radeon8500 (& LE) and Matrox P512, you really want to enable effects like AA. Because GF3 & GF4 cards use the fast but 'guessy' MS-AA, it is well worth enabling Aniso; 2xAA or QxAA with Aniso are good and fast, or you could try 4xS-AA, depending on whether you are using DirX or OpenGL. Radeon8500 cards use SS-AA, and as such you don't really need Aniso to resharpen textures; SS-AA has a big enough perf hit as it is. The Matrox P512 is still very new but offers selective sampling AA (forgot the name), where only certain parts of the image get SS-AA, leading to greater perf but more erratic frame rates. You may find these measures lower perf and prob resolution, but the games should look a lot better for it.

;) I think it's just a matter of trying out the settings and seeing what you prefer.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
;) The Matrox Parhelia-512 uses Fragment Super-Sampling: SS-AA is applied only to selective parts of the image, which has the drawback that some jaggies are missed. However, the P512 can use alternative AA methods, so you aren't tied to Fragment AA.

:D What I was trying to make clear regarding GF3 & GF4 AA was that the best settings in terms of quality and perf are either 2xAA & Aniso, QxAA & Aniso, or 4xS AA without Aniso. That way you get the benefits of AA but also get sharp detailed textures, and still get better perf than simply using SS-AA. IIRC officially 4xS is only for DirX and Aniso is only for OpenGL, but I believe this can be overcome via Powerstrip.
 

LegionX

Senior member
Jul 10, 2000
274
0
0
Well, Ultima Online has an 800x600 option but it isn't the same as true 800x600 - although you see more, you can still only attack whatever is within the 640 screen. Plus I have bad eyes, so I don't play in a window, and the 800x600 view is smaller. Other games I try to play at 1000+x700+ (forget the actual number).

I am concerned about getting a G4 because of image quality. I was hoping so much for a good Parhelia from Matrox, but the benchmarks just don't justify the price. I have read that the Radeon has the second best IQ, but it is getting slow.

I have to buy a card this month or I will lose the money I have towards it, so I'm not sure what to do.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:( It's quite surprising just how many people hold on to old rumours/'truths'.

:) First I hear people saying how RIMM is ALWAYS far better than DDR, and now almost everybody seems to think Matrox & ATI still hold all the image quality cards!

:( The GF2 had comparatively poor image quality, and the GF3 was better but still far behind both Matrox and ATI. GF4 cards have excellent image quality! That's coming from consumers as well as reviewers. Even independent research commissioned by Matrox (in prep for their P512) shows that GF4 cards are now better than even ATI, and probably better than all but the P512 Matrox cards!

;) Again, the majority of users, even with something better than the 15" monitors most people still use, probably couldn't tell the difference between Matrox P512 and a GF2 for image quality, but the differences are still there.

Matrox PDF file showing nVidia have definitely caught up

:D The GF4TI4200 is a tremendous performer and has excellent image quality. The new enhanced Radeon8500 will probably equal it, but it will be a good while before it comes out in great quantities and the review sites catch up with it. As for the new gen of cards due out: AGP8x is more about marketing than perf, and the vast majority of DX9 capable games will be a year away, by which time better cards will be out anyway. The GF4TI4200 is unlikely to come down in price, but more so than any other card it is a great deal and a great time to buy (just like the Radeon8500LE was when it was released).

:( If you start waiting you may never finish!