16-bit color vs 32-bit color

JEDI

Lifer
Sep 25, 2001
29,391
2,738
126
64K colors vs 16.7 million (?)

anyone see any difference in non-gaming applications?
 

imported_michaelpatrick33

Platinum Member
Jun 19, 2004
2,364
0
0
Have you looked at JPEGs at 16-bit and 32-bit (really 24-bit color with 8 bits of alpha, or something like that, I'm too sleepy to remember)? Heck, have you looked at your screensaver at 64K colors versus 16.7 million?
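For anyone curious, here's a rough sketch of how those bits are usually laid out, assuming the common ARGB8888 and RGB565 desktop formats (the exact layout can vary by card/driver):

```python
# Typical pixel formats (assumed layouts: ARGB8888 for 32-bit, RGB565 for 16-bit).

def pack_argb8888(a, r, g, b):
    # 8 bits per channel: 24 bits of color + 8 bits of alpha = 32 bits.
    return (a << 24) | (r << 16) | (g << 8) | b

def pack_rgb565(r, g, b):
    # 5 bits red, 6 bits green, 5 bits blue = 16 bits, no alpha.
    # The low bits of each 8-bit channel are simply thrown away.
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

print(hex(pack_argb8888(0xFF, 0x12, 0x34, 0x56)))  # 0xff123456
print(hex(pack_rgb565(0x12, 0x34, 0x56)))          # 0x11aa
```

That discarded low-bit precision in RGB565 is exactly where the dull/banded look comes from.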
 

1Dark1Sharigan1

Golden Member
Oct 5, 2005
1,466
0
0
16.7 million = 16,777,216 (2^24)
64K = 65,536 (2^16)
16,777,216 / 65,536 = 256

Hence you're getting 256x the possible colors with 32-bit (which carries 24 bits of actual color) than you can get with 16-bit. I think that in itself explains it all. ;)
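The arithmetic above, sketched in Python for anyone who wants to check it:

```python
# 16-bit vs 32-bit color counts ("64K" vs "16.7 million").
colors_16bit = 2 ** 16   # 65,536
colors_24bit = 2 ** 24   # 16,777,216 (32-bit mode carries 24 bits of color)

print(colors_16bit)                   # 65536
print(colors_24bit)                   # 16777216
print(colors_24bit // colors_16bit)   # 256
```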
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Originally posted by: wahoyaho
just looking at the icons in the desktop i can see jaggies

The icons are where I always notice the difference, but not in terms of jaggies so much as a really ugly, dull color to the icons.
 

wahoyaho

Senior member
Nov 27, 2003
856
0
0
Well, mostly jaggies on desktops and dull colors in tray icons. For example, the Winamp icon looks horrible at 16-bit.
 

route66

Senior member
Sep 8, 2005
295
0
0
No one needs more than 16-bit color.

Somewhere there must be a techies book of famous last words, and that must be on it.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Originally posted by: wahoyaho
Well, mostly jaggies on desktops and dull colors in tray icons. For example, the Winamp icon looks horrible at 16-bit.

I don't get it, what do jaggies have to do with color depth?
 

1Dark1Sharigan1

Golden Member
Oct 5, 2005
1,466
0
0
Originally posted by: sxr7171

I don't get it, what do jaggies have to do with color depth?

Nothing. You get "jaggies" with desktop icons no matter what color depth you use. "Jaggies" are a result of the low resolution of the icons, not color depth (no AA for 2D, lol). 16-bit color will make icons lack brilliance compared to 32-bit, and you'll get "dotting" as well . . .
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Originally posted by: 1Dark1Sharigan1
Originally posted by: sxr7171

I don't get it, what do jaggies have to do with color depth?

Nothing. You get "jaggies" with desktop icons no matter what color depth you use. "Jaggies" are a result of the low resolution of the icons, not color depth (no AA for 2D, lol). 16-bit color will make icons lack brilliance compared to 32-bit, and you'll get "dotting" as well . . .

You will get dithering in 16-bit but not jaggies.
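A quick sketch of why 16-bit shows that banding/dotting, assuming the common RGB565 format where each red or blue channel gets only 5 bits: a smooth 256-shade gradient collapses into 32 visible steps, and drivers dither to hide the bands.

```python
# Hypothetical helper: quantize an 8-bit channel value to 5 bits (RGB565
# red/blue) and expand it back to 8 bits, as a 16-bit display path would.
def to5bit_and_back(v):
    q = v >> 3                   # keep the top 5 bits (0..31)
    return (q << 3) | (q >> 2)   # expand back to the 8-bit range

gradient = list(range(256))                      # smooth 8-bit ramp
banded = [to5bit_and_back(v) for v in gradient]  # what 16-bit can show

print(len(set(gradient)), len(set(banded)))      # 256 distinct shades vs 32
```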
 

wahoyaho

Senior member
Nov 27, 2003
856
0
0
I don't know what it's called, but I can see squares where there should be a curve at 16-bit, and less so at 32-bit.
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Originally posted by: xtknight
Originally posted by: 1Dark1Sharigan1
Originally posted by: sxr7171

I don't get it, what do jaggies have to do with color depth?

Nothing. You get "jaggies" with desktop icons no matter what color depth you use. "Jaggies" are a result of the low resolution of the icons, not color depth (no AA for 2D, lol). 16-bit color will make icons lack brilliance compared to 32-bit, and you'll get "dotting" as well . . .

You will get dithering in 16-bit but not jaggies.

Exactly. In other words, bad blending between colors (banding).

Seriously OP, did you work for 3DFX or something? Are you coming here on, like, the anniversary of its downfall (not sure if it is or not)?

-Kevin