"Been playing games like Q3, UT, & UT2K3 - anyways, what's the difference between 16/32 bit color?"
Better image quality, in the form of smoother colours and blending effects.
"16 bit color is actually 15 bit color."
No it isn't.
"Windows calls it 16 bit color because '16 bit' is more of an acceptable number than 15 bit."
Utter rubbish. Windows uses 16 bit colour in the form of 5/6/5 (studies have shown that eyes are more sensitive to shades of green).
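To make that 5/6/5 layout concrete, here is a minimal C sketch (my own illustration - the function names are made up, but the shifts are the standard RGB565 layout) that packs and unpacks a 16 bit pixel:

#include <stdint.h>

/* Pack 8-bit-per-channel RGB into a 16 bit 5/6/5 pixel. */
uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) |   /* 5 bits of red   */
                      ((g >> 2) << 5)  |   /* 6 bits of green */
                       (b >> 3));          /* 5 bits of blue  */
}

/* Unpack back to 8 bits per channel; the discarded low bits are simply gone. */
void unpack_rgb565(uint16_t p, uint8_t *r, uint8_t *g, uint8_t *b)
{
    *r = (uint8_t)(((p >> 11) & 0x1F) << 3);
    *g = (uint8_t)(((p >> 5)  & 0x3F) << 2);
    *b = (uint8_t)( (p        & 0x1F) << 3);
}

Green keeps six bits precisely because the eye is most sensitive in that range.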
"However, that's really not very helpful unless you deal with a whole lot of green in your images."
Irrelevant. Also that's quite a change from "there's no 16 bit" to "yeah but it's only useful for greens".
"The 16 bit format is really odd, and I have no idea why they decided to create it."
Uh, because of the fundamental principle that computer arithmetic operates on binary and hence powers of two. 16 is 2^4; how the heck do you arrive at something as dumb as 15 bit colour?
"This is not windows, but if you look at this link you can see that the '16 bit' format that OS X uses is really only 15 bits of color data, with one empty bit."
You're right, it's not Windows so stop projecting Apple's idiocy to the entire computing industry to make it sound like common practice. Apple is and always has been the exception to the norm, not the standard.
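For comparison, the "15 bit" layout being described stores 5 bits per channel in a 16 bit word and simply leaves the top bit empty. A rough sketch, with an illustrative name only:

#include <stdint.h>

/* "15 bit" colour as stored in a 16 bit word (X1R5G5B5):
   5 bits per channel, with bit 15 left unused. */
uint16_t pack_xrgb1555(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 10) |
                      ((g >> 3) << 5)  |
                       (b >> 3));
}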
"This page dealing with Microsoft's DirectDraw from DirectX indicates that DirectDraw supports both 15 bit and 16 bit image data under the label of '16 bit color depth', depending on software and drivers."
And? The point is that the 16 bit format is fully supported under Windows, unlike on Apple's platforms. Hell, you could limit your colour palette to just black and white and use 24 bits to represent it if you like, but that in no way implies that you don't support 24 bit colour and that you're only limited to 1 bit.
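Since both layouts can hide behind the same "16 bit color depth" label, a program can tell them apart by counting the bits in the channel masks reported for the surface format. This is a hypothetical helper written for illustration, not actual DirectDraw code:

#include <stdint.h>

/* Count the set bits in a channel mask (e.g. 0x07E0 for 6-bit green). */
static int mask_bits(uint32_t mask)
{
    int n = 0;
    while (mask) {
        n += mask & 1;
        mask >>= 1;
    }
    return n;
}

/* Decide whether a "16 bit" surface is really 5/6/5, given the
   red/green/blue masks reported by the driver. */
int is_565(uint32_t rmask, uint32_t gmask, uint32_t bmask)
{
    return mask_bits(rmask) == 5 &&
           mask_bits(gmask) == 6 &&
           mask_bits(bmask) == 5;
}

Masks of 0xF800/0x07E0/0x001F give 1 (true 5/6/5); masks of 0x7C00/0x03E0/0x001F give 0, i.e. the 15 bit layout.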
"I'm pretty sure quake engine games actually use 15 bit, rather than the 16 bit that they indicate."
Are you making this stuff up as you go along or something?
Originally posted by: BFG10K
You're right, it's not Windows so stop projecting Apple's idiocy to the entire computing industry to make it sound like common practice. Apple is and always has been the exception to the norm, not the standard.
Very true. I'm experiencing IE6's very faulty support of CSS1 (despite MS's claims to the contrary) with a site I'm working on right now. Unfortunately, I fear that MS will never fully support standards, and will likely attempt to move farther and farther away from said standards to help solidify their monopoly.
Originally posted by: BingBongWongFooey
IE6 is languishing with half-ass CSS support, and IE7 will not show up until the next windows release, and only *with* said windows release.
Windows may be the norm for most people, but it is hardly the standard.
"Does windows ONLY use 5/6/5 16 bit color?"
No, like you yourself said it can take a wide variety of formats under the 16 bit label. However, the standard is 5/6/5.
"The reason that 16 bits is so common is because it's the length of a two-byte word, not because 2^16 is 'more binary' than 2^15."
Exactly, and when it's byte-aligned it's faster to deal with. Also, 15 bit colour still allocates 16 bits anyway, so why would you not want to use all of them?
"What color space is used for that? 5/6/5 16-bit RGB?"
It'll be the same as standard 2D, so yes, it'll be 5/6/5. AFAIK you simply don't need an alpha channel once all of the blending has been completed.
"I thought 16-bit was generally 5551 RGBA?"
In 2D or 3D? Remember there's currently a big difference between the two, and 2D is generally more flexible because it depends on the program you're using.
"Also, the 8 extra bits in 32-bit color are for the alpha channel. (8888)"
For 3D yes, for 2D not always.
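To make the "8 extra bits" concrete, here is a sketch of a 32 bit pixel with the alpha channel in the high byte. ARGB order is assumed purely for illustration - real APIs differ on channel order:

#include <stdint.h>

/* 32 bit colour: 8 bits each of red, green and blue, plus the
   8 "extra" bits holding the alpha (opacity) channel. */
uint32_t pack_argb8888(uint8_t a, uint8_t r, uint8_t g, uint8_t b)
{
    return ((uint32_t)a << 24) |
           ((uint32_t)r << 16) |
           ((uint32_t)g << 8)  |
            (uint32_t)b;
}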
"I would argue that windows is the exception to the norm."
I wouldn't. Explain to me how allocating 16 bits for a value and then constantly leaving one bit empty is a logical thing to do. The bit is there so why not use it? Why waste space? Wasting resources is a cardinal sin in computer science.
"Anyway, to add to your last post, 32 bit color doesn't only exist in 3D rendering, it's used in many 2D image formats as well. The 8 bits of alpha are used for blending more than one image together."
Yes, I know, and I never said otherwise.
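As a rough illustration of what that 8 bit alpha is used for, this sketch (my own example, using an integer approximation of the usual "source over destination" weighting) blends one pixel channel over another:

#include <stdint.h>

/* Blend an 8-bit source channel over a destination channel using
   an 8-bit alpha value, where 255 means the source is fully opaque. */
uint8_t blend_channel(uint8_t src, uint8_t dst, uint8_t alpha)
{
    return (uint8_t)((src * alpha + dst * (255 - alpha)) / 255);
}

For example, blend_channel(255, 0, 128) gives 128 - an even mix of the two.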
Originally posted by: BFG10K
"I would argue that windows is the exception to the norm."
I wouldn't. Explain to me how allocating 16 bits for a value and then constantly leaving one bit empty is a logical thing to do. The bit is there so why not use it? Why waste space? Wasting resources is a cardinal sin in computer science.
Well, I was definitely speaking in a more general sense, not really with regard to color modes. As for whether leaving the bit empty is logical - I dunno, it probably isn't. As you mentioned, the eye is more sensitive to green, so it makes sense to throw an extra bit at green. I think it was Sony that developed a new sensor that uses four colors - red, green, blue, and emerald - for use in digital cameras. Emerald is closest to green, and the increased sensitivity in the green range of colors apparently makes a difference in photo quality.
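To put the "more sensitive to green" point in numbers: with the standard Rec. 601 luma weights, green contributes well over half of perceived brightness, which is the usual justification for handing it the sixth bit. A trivial sketch, illustrative only:

/* Approximate perceived brightness using the Rec. 601 luma weights;
   green carries by far the biggest share. */
double luma(double r, double g, double b)
{
    return 0.299 * r + 0.587 * g + 0.114 * b;
}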
Even the TCP header leaves bits unused - look at the Reserved field:

    0                   1                   2                   3
    0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1 2 3 4 5 6 7 8 9 0 1
   +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
   |          Source Port          |       Destination Port        |
   +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
   |                        Sequence Number                        |
   +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
   |                    Acknowledgment Number                      |
   +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
   |  Data |           |U|A|P|R|S|F|                               |
   | Offset| Reserved  |R|C|S|S|Y|I|            Window             |
   |       |           |G|K|H|T|N|N|                               |
   +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
   |           Checksum            |         Urgent Pointer        |
   +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
   |                    Options                    |    Padding    |
   +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
   |                             data                              |
   +-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+

                            TCP Header Format

          Note that one tick mark represents one bit position.
Originally posted by: BFG10K
I wouldn't. Explain to me how allocating 16 bits for a value and then constantly leaving one bit empty is a logical thing to do. The bit is there so why not use it? Why waste space? Wasting resources is a cardinal sin in computer science.
Then why did 32 bit color "waste" 8 bits (at least until recently, when OSes became aware of alpha-blending)?
Originally posted by: jliechty
Originally posted by: BFG10K
I wouldn't. Explain to me how allocating 16 bits for a value and then constantly leaving one bit empty is a logical thing to do. The bit is there so why not use it? Why waste space? Wasting resources is a cardinal sin in computer science.
Then why did 32 bit color "waste" 8 bits (at least until recently, when OSes became aware of alpha-blending)?
"Maybe when the decision was first made to use 5/5/5, it wasn't common knowledge that the eyes were more sensitive to green, or maybe that knowledge just wasn't common within the computing industry."
Quite possibly. Of course, now that I think about it, it doesn't really matter anyway, as 16 bit colour blows chunks and everyone uses 32 bit colour instead.
"Then why did 32 bit color 'waste' 8 bits (at least until recently, when OSes became aware of alpha-blending)?"
It didn't - 2D programs have been using the extra 8 bits for alpha for many years. Like I said before, for 2D operations it depends a lot on how the program chooses to handle the colour precision.
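To see why 16 bit colour "blows chunks" next to 32 bit, consider how much precision each channel loses. This sketch (purely illustrative) squeezes an 8 bit channel down to 5 bits and back; only 32 distinct levels survive, which is what shows up as banding in smooth gradients:

#include <stdint.h>
#include <stdio.h>

/* Round an 8-bit channel value down to 5 bits and expand it back. */
uint8_t through_5_bits(uint8_t v)
{
    uint8_t five = v >> 3;                        /* 256 levels -> 32 levels  */
    return (uint8_t)((five << 3) | (five >> 2));  /* replicate bits back to 8 */
}

int main(void)
{
    /* Print how a smooth ramp collapses onto 32 coarse steps. */
    for (int v = 0; v < 256; v += 16)
        printf("%3d -> %3d\n", v, through_5_bits(v));
    return 0;
}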