CS question - Why byte, short, int, long???

ckkoba

Member
Dec 12, 2000
183
0
0
Why do languages like Java make you specify the size of a number, like byte, short, int, and long? Is this a holdback from when memory was precious? And if that's the case, with all the memory we have now, will future programming languages get rid of this nomenclature?

Thanks people,

Joe
 

AndyHui

Administrator Emeritus, Elite Member, AT FAQ M
Oct 9, 1999
13,141
17
81
Yes, it's a holdover from when memory was scarce.

But not specifying how much memory to reserve is just plain sloppy. Why should you use more than you need? If every variable you declared took up the maximum size, the cumulative waste would quickly grow to a huge amount of space.
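For concreteness, here is a minimal sketch of what each of those Java keywords actually reserves (the sizes are fixed by the language spec; the underscore digit separators are just modern Java syntax for readability):

```java
// Each integer type reserves a fixed amount of storage,
// and with it a fixed range of representable values.
public class TypeSizes {
    public static void main(String[] args) {
        byte  b = 100;             // 8 bits:  -128 to 127
        short s = 30_000;          // 16 bits: -32,768 to 32,767
        int   i = 2_000_000_000;   // 32 bits: roughly +/- 2.1 billion
        long  l = 9_000_000_000L;  // 64 bits: roughly +/- 9.2 quintillion

        System.out.println(b + " " + s + " " + i + " " + l);
    }
}
```

So picking byte over long for a value that fits in one byte saves 7 bytes per variable, and that difference compounds across every instance you allocate.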
 

Snapster

Diamond Member
Oct 14, 2001
3,916
0
0
No, they won't get rid of it; a simple structure/array or something like a database eats memory pretty easily. :)
 

ckkoba

Member
Dec 12, 2000
183
0
0
Thanks Andy, that makes sense. I'm just griping because I hate having to think over whether to write byte, short, int, or long every time I declare a number.

Or at the very least, make the process of deciding which one it is automatic. :p


(Am I from a generation that expects to be spoonfed? Yah!)
 

rutchtkim

Golden Member
Aug 2, 2001
1,880
0
0
You want to make the program efficient, using only the resources that are needed; otherwise you'll wind up with a clunky program that could slow down your system performance, etc.
 

Noriaki

Lifer
Jun 3, 2000
13,640
1
71
Don't waste your time trying to decide for single variables... just type int. lol, it takes the least time to type ;-) And it should be big enough for almost anything.

It becomes important when you have 1,000,000,000 of these variables and you only need a range of 0-50 for each of them. Using an int over a byte suddenly ups the storage requirement from 1 GB to 4 GB.

If you are doing a little program with a couple of variables, don't worry about it. In general, just use int for integers of any kind and double for floating-point numbers. Those are the *standard* types, if you can have such a thing.
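The billion-element math above can be checked directly. A minimal sketch (the `BYTES` constants are just the fixed per-element sizes of each type; the array itself is never allocated here, only the arithmetic):

```java
// Storage cost of a billion small values, depending on element type.
public class SizeDemo {
    public static void main(String[] args) {
        long n = 1_000_000_000L; // a billion values, each in the range 0-50

        long bytesIfByte = n * Byte.BYTES;    // 1 byte per element
        long bytesIfInt  = n * Integer.BYTES; // 4 bytes per element

        System.out.println("byte[] : " + bytesIfByte / 1_000_000_000.0 + " GB");
        System.out.println("int[]  : " + bytesIfInt  / 1_000_000_000.0 + " GB");
    }
}
```

Same data, four times the footprint, purely from the choice of keyword.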