Originally posted by: William Gaatjes
I guess using a 64-bit unsigned integer for the number of seconds would have solved the Y2K problem, if only one variable was used. But then again, that's 8 bytes.
For storage's sake:
days as an unsigned integer, up to 65535 days: 2 bytes.
hours in BCD format, up to 99 hours: 1 byte.
minutes in BCD format, up to 99 minutes: 1 byte.
seconds in BCD format, up to 99 seconds: 1 byte.
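Roughly, in C, that layout could look something like this (a sketch only; the struct and field names are mine, and a compiler may pad it to 6 bytes unless it is packed):

#include <stdint.h>

/* Hypothetical 5-byte timestamp: a plain 16-bit day counter plus
   one two-digit BCD byte each for hours, minutes and seconds. */
struct bcd_time {
    uint16_t days;    /* 0..65535, plain binary     */
    uint8_t  hours;   /* 0x00..0x23, two BCD digits */
    uint8_t  minutes; /* 0x00..0x59, two BCD digits */
    uint8_t  seconds; /* 0x00..0x59, two BCD digits */
};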
Some more or less history:
BCD stands for binary coded decimal. 4 bits give you 16 different bit combinations, and only the combinations that code for 0 to 9 are used. This way you can pack two BCD digits into 1 byte. Easy for storage's sake, and you don't have to do a lot of math: just send the 4 bits to the 7-segment display driver and you have the number on the display. In the old days this saved a lot of calculations and a lot of very expensive storage (read: RAM).
Used in discrete logic chips, microcontrollers and old CPUs.
/end history.
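For illustration, packing and unpacking two BCD digits in one byte comes down to something like this (my own sketch, not part of the original post):

#include <stdint.h>

/* Pack a value 0..99 into one byte as two BCD digits. */
uint8_t to_bcd(uint8_t value)
{
    return (uint8_t)(((value / 10) << 4) | (value % 10));
}

/* Unpack a two-digit BCD byte back into a binary value 0..99. */
uint8_t from_bcd(uint8_t bcd)
{
    return (uint8_t)((bcd >> 4) * 10 + (bcd & 0x0F));
}

Each 4-bit nibble can go straight to a 7-segment display driver, which is exactly why old hardware liked the format.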
This would give you 179 years and some change (65535 days is roughly 179.4 years).
3 bytes saved compared to the 8-byte counter.
But if the days were 4 bytes, you would have more than 11 million years.
That's 7 bytes, so still 1 byte saved.
And it's easy to implement in logic: just BCD counters carrying into each other, and of course keeping track of the numbers 60 and 24, as in the sketch below.
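In software, that carry logic would look roughly like this, reusing the bcd_time struct and the BCD helpers sketched above (again just an illustration, not how any real clock chip is built):

/* Advance the hypothetical bcd_time by one second, carrying into
   minutes, hours and days at 60, 60 and 24 respectively. */
void tick(struct bcd_time *t)
{
    uint8_t s = from_bcd(t->seconds) + 1;
    if (s < 60) { t->seconds = to_bcd(s); return; }
    t->seconds = 0;

    uint8_t m = from_bcd(t->minutes) + 1;
    if (m < 60) { t->minutes = to_bcd(m); return; }
    t->minutes = 0;

    uint8_t h = from_bcd(t->hours) + 1;
    if (h < 24) { t->hours = to_bcd(h); return; }
    t->hours = 0;

    t->days++;  /* plain 16-bit counter, wraps after 65535 days */
}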
Long ago memory was precious and costly, and a dumb tradeoff was made in the PC world.
But I have to agree that for math and comparison's sake, 1 big unsigned integer is easier.
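The comparison point is easy to see: with a single 64-bit seconds counter, ordering and elapsed time are one-liners, whereas the split format has to be compared field by field, days first, then hours, minutes and seconds (illustration only):

#include <stdint.h>

/* With one seconds counter, before/after and elapsed time are trivial. */
int earlier(uint64_t a, uint64_t b)              { return a < b; }
uint64_t seconds_between(uint64_t a, uint64_t b) { return b - a; }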
Sure, it's a very good implementation, and for homework it might be acceptable. However, there are limitations, and it was thinking like this which caused the Y2K problem to begin with.
But how were the date and time stored in PCs originally?
I have no clue. I'll google some time later.
I do seem to remember Apple never had this Y2K problem.
I am personally spoiled with real-time clock chips that have enough RAM to store tenths of seconds, seconds, minutes, hours, day, month and year.