Blow me up, MOD!

Scrapster

Diamond Member
Nov 27, 2000
3,746
0
0
Someone once told me that your computer couldn't recognize numbers outside the range -32767 <= x <= 32767.

Can someone explain to me why this is? Are all computers restricted to this limit?

EDIT: QUESTION ANSWERED. THANKS EVERYONE!
 

perry

Diamond Member
Apr 7, 2000
4,018
1
0
That's false. On a 32-bit machine, an unsigned long int goes up to 2^32 - 1, and a signed one goes from -2^31 to 2^31 - 1.

The -32768 to 32767 range is a signed short int, which is 16 bits. That's the native word size of 16-bit processors; 32-bit processors work with 32 bits natively...
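If you want to see the actual limits on your own machine, here's a minimal C sketch using the standard <limits.h> constants (exact values depend on the compiler and platform; the comments show what a typical 32-bit system reports):

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* Ranges of the standard integer types on this machine. */
    printf("short        : %d to %d\n", SHRT_MIN, SHRT_MAX);   /* typically -32768 to 32767   */
    printf("int          : %d to %d\n", INT_MIN, INT_MAX);     /* typically -2^31 to 2^31 - 1 */
    printf("long         : %ld to %ld\n", LONG_MIN, LONG_MAX);
    printf("unsigned long: 0 to %lu\n", ULONG_MAX);            /* 2^32 - 1 when long is 32 bits */
    return 0;
}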
 

Sparky Anderson

Senior member
Mar 1, 2000
307
0
0
A while ago, in some programming languages, this was the case because the computer uses binary numbers (1's and 0's, on and off). 2^16 is 65,536, which is how many distinct values fit in the range you listed. That was 16 bits; most computers now are at least 32 bits, which gives you around 4 billion different values. There are also different ways of representing numbers (signed vs. unsigned, for example), so it gets complex, but basically the answer to your question is no.
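To make the old 16-bit limit concrete, here's a small C sketch of what happens when a 16-bit value passes 32767 (assuming C99's <stdint.h>; the wrap-around shown in the comment is what you get on the usual two's-complement hardware, though strictly the narrowing conversion is implementation-defined):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* A 16-bit signed value can't hold anything past 32767. */
    int16_t x = 32767;
    x = x + 1;              /* wraps around to -32768 on typical hardware */
    printf("16-bit: %d\n", x);

    /* A 32-bit value holds the same sum with room to spare. */
    int32_t y = 32767;
    y = y + 1;
    printf("32-bit: %d\n", y);   /* prints 32768 */
    return 0;
}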