Preparing for the year 2038 problem

Red Squirrel

No Lifer
May 24, 2003
70,010
13,489
126
www.anyf.ca
When coding anything to do with dates, is there a de facto standard that people are starting to use now to better prepare for the year 2038 problem? It's far away, yet not that far. That's when the Unix epoch time reaches the limit of a signed 32-bit integer, and a lot of date-related code uses Unix timestamps.

I'm thinking, is it best to start storing dates as a string instead of Unix time? Could do yyyy-mm-dd-hh-mm-ss or something, with leading zeroes, so it still sorts properly in SQL queries. You would still use the Unix timestamp within date/time functions for now (such as getting the current time), but at least when the programming language comes up with a solution, your own program's stored dates are unaffected and it's just a matter of implementing the new code in your own date/time class.
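
For illustration, a minimal C++ sketch of the kind of zero-padded, sortable string described above (the exact separators are placeholders; any fixed year-down-to-second field order sorts correctly as plain text, which is why it also orders properly in an SQL ORDER BY on a string column):

Code:
#include <ctime>
#include <cstdio>

int main() {
    // Current time as a Unix timestamp (seconds since 1970-01-01 UTC).
    std::time_t now = std::time(nullptr);

    // Format as a zero-padded "YYYY-MM-DD HH:MM:SS" string in UTC.
    std::tm tm_utc;
    gmtime_r(&now, &tm_utc);   // POSIX thread-safe variant of gmtime()
    char buf[32];
    std::strftime(buf, sizeof(buf), "%Y-%m-%d %H:%M:%S", &tm_utc);

    std::printf("%s\n", buf);
    return 0;
}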

I was coding a PHP program using Unix timestamps and it occurred to me that doing this is no longer future-proof. Is everyone just riding this train to destruction and waiting until the last minute, or are there actual efforts in place already to prepare for it?
 

Azuma Hazuki

Golden Member
Jun 18, 2012
1,532
866
131
By 2038, assuming we're all still alive in 21 years, I would be very surprised if there were any 32-bit machines left. At that point if anyone isn't on 64-bit, and isn't using 64-bit time_t datatypes or equivalent, they deserve everything they get.
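
As a rough illustration of "64-bit time_t or equivalent", a small C++ check (assuming a POSIX-style gmtime_r; the 2100-01-01 constant is just an arbitrary post-2038 moment) of whether the platform's time_t can actually represent dates past 2038:

Code:
#include <ctime>
#include <cstdio>
#include <cstdint>

// Fails to compile on platforms where time_t is still 32 bits.
static_assert(sizeof(std::time_t) >= 8, "time_t is not 64-bit on this platform");

int main() {
    // A moment safely past January 19, 2038 03:14:07 UTC (the signed 32-bit limit).
    std::time_t after_2038 = INT64_C(4102444800);   // 2100-01-01 00:00:00 UTC

    std::tm tm_utc;
    if (gmtime_r(&after_2038, &tm_utc)) {
        char buf[40];
        std::strftime(buf, sizeof(buf), "%Y-%m-%d %H:%M:%S UTC", &tm_utc);
        std::printf("Representable: %s\n", buf);
    }
    return 0;
}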
 

Merad

Platinum Member
May 31, 2010
2,586
19
81
I was coding a PHP program using Unix timestamps and it occurred to me that doing this is no longer future-proof. Is everyone just riding this train to destruction and waiting until the last minute, or are there actual efforts in place already to prepare for it?

In new code it shouldn't be a problem, except when people make lazy choices that turn it into a problem. IIRC all of the current mainstream 64-bit operating systems have supported 64-bit time values since their inception (i.e. for a decade or more). All the major databases support them. Most (all?) mainstream languages support them, and all I can think of off the top of my head either default to 64-bit or only support it.
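
One hedged sketch (in C++, as an example only) of avoiding that kind of lazy choice in application code: keep the epoch count in an explicitly 64-bit integer rather than a plain int, so its width never depends on the platform:

Code:
#include <chrono>
#include <cstdint>
#include <cstdio>

int main() {
    using namespace std::chrono;

    // Seconds since the Unix epoch, kept in an explicit 64-bit integer.
    // The point is simply never to funnel a timestamp through a 32-bit int,
    // even on platforms where that still happens to "work" today.
    std::int64_t epoch_seconds =
        duration_cast<seconds>(system_clock::now().time_since_epoch()).count();

    std::printf("%lld\n", static_cast<long long>(epoch_seconds));
    return 0;
}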

Offhand I really only know of a couple of areas of real concern. 32-bit Linux (true 32-bit, not the x32 ABI) still doesn't support 64-bit times to my knowledge, because they aren't willing to break the ABI. It's being worked on, however. Embedded systems are always a concern, since vendors often don't give a damn about updating their firmware and customers often don't give a damn about applying updates even when they're available. And of course, there's all the truly ancient software out there written in COBOL, FORTRAN, etc. that no one dreamed would still be running today, but that will most likely still be in use 20 years from now.
 

Red Squirrel

No Lifer
May 24, 2003
70,010
13,489
126
www.anyf.ca
Oh wow, is it really as simple as being on a 64-bit system? I would have figured that time_t or equivalent was still hard-coded as 32-bit in the core libraries of the OS. I guess one would also want to change the column type in MySQL from INT to BIGINT and make other similar changes on the PHP side, but that is a rather minor change. If it's really as simple as just being on a 64-bit system, then it's probably not that big a deal in the long term.
 

Pick2

Golden Member
Feb 14, 2017
1,058
1,507
91
I don't think 32-bit vs 64-bit CPUs have ANYTHING to do with the Unix epoch's use of 32-bit integer math. Just like Microsoft's use of '69 instead of 1969, it was done to save expensive storage space way back when. My first HDD was a whopping 20 megabytes and cost $500. Ah yes, the good old days :)
 

Schmide

Diamond Member
Mar 7, 2002
5,696
941
126
BTW, 2038 is when a signed 32-bit int overflows past 2^31-1 into the sign bit of the encoding. If you treat the 32 bits as unsigned, it overflows on Sunday, February 7, 2106 at 6:28:16 AM. Let me be on the record saying all programmers will be dead from the code wars by then.

Edit: (wrong) I just looked it up, and the next issue will happen on Thursday, June 15, 2028 at 9:33:27.371 AM when the 64-bit high-frequency microsecond timers overflow. Get your trades in early that morning.

Edit2: Ironically, I overflowed the 64-bit number. The date when a high-frequency (nanosecond-resolution) timestamp overflows from signed 63-bit to unsigned 64-bit is Friday, April 11, 2262 at 11:47:16.855 PM. The unsigned 64-bit value overflows in 2554. Even Buck Rogers will be dead.
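
A small C++ sketch that reproduces the arithmetic behind those dates (it assumes a 64-bit time_t and a POSIX gmtime_r so the values past 2038 can be converted):

Code:
#include <cstdint>
#include <cstdio>
#include <ctime>

// Print a seconds-since-epoch value as a UTC calendar date.
static void show(const char *label, std::int64_t secs) {
    std::time_t t = static_cast<std::time_t>(secs);   // needs 64-bit time_t
    std::tm tm_utc;
    gmtime_r(&t, &tm_utc);
    char buf[64];
    std::strftime(buf, sizeof(buf), "%A, %B %d %Y %H:%M:%S UTC", &tm_utc);
    std::printf("%-20s %s\n", label, buf);
}

int main() {
    show("2^31 seconds:", INT64_C(1) << 31);        // signed 32-bit limit -> Jan 2038
    show("2^32 seconds:", INT64_C(1) << 32);        // unsigned 32-bit limit -> Feb 2106
    // 2^63 nanoseconds is about 9,223,372,036 whole seconds: the limit of a
    // signed 64-bit nanosecond timestamp, which lands in April 2262.
    show("2^63 nanoseconds:", INT64_C(9223372036));
    return 0;
}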
 
Last edited:
  • Like
Reactions: Pick2 and Ken g6

KB

Diamond Member
Nov 8, 1999
5,406
389
126
When coding anything to do with dates, is there a de facto standard that people are starting to use now to better prepare for the year 2038 problem? It's far away, yet not that far. That's when the Unix epoch time reaches the limit of a signed 32-bit integer, and a lot of date-related code uses Unix timestamps.

I'm thinking, is it best to start storing dates as a string instead of Unix time? Could do yyyy-mm-dd-hh-mm-ss or something, with leading zeroes, so it still sorts properly in SQL queries.


I wouldn't worry about it. By the time it's an issue we will have 128-bit computers.

Storing dates as strings is a bad idea. In databases, string comparisons are much slower than integer comparisons.
 

Gryz

Golden Member
Aug 28, 2010
1,551
204
106
I write software on a specific hardware platform, with our own OS. We use a read-only counter ("ticks") that is available to all software and that increments by one every 10 milliseconds. It's a 32-bit unsigned value, and it's an easy and cheap way to do timestamps. This tick counter rolls over after about 500 days, and I'm sure there are a lot of our boxes that stay up for over a year. So how do we prevent bugs with timestamps that roll over?

Pretty simple. When one of our boxes boots, it initializes the tick counter to 2^32-1 minus 10 minutes' worth of ticks. So every time a box boots, there will be a rollover after 10 minutes. If there are bugs, our test scripts will probably catch them pretty easily. I don't think I would ever have thought of this myself. :)
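
A minimal C++ sketch of the two ideas in that post, with made-up names (the real firmware's counter isn't shown above, only described): unsigned wraparound-safe elapsed-time math, plus starting the counter just short of its maximum so the rollover is exercised soon after boot:

Code:
#include <cstdint>
#include <cstdio>

// One tick every 10 ms, so 10 minutes is 60,000 ticks.
constexpr std::uint32_t TICKS_PER_10_MIN = 10u * 60u * 100u;

// Start the counter 10 minutes short of wraparound, as described in the post,
// so every boot exercises the rollover within the first 10 minutes.
std::uint32_t g_ticks = UINT32_MAX - TICKS_PER_10_MIN;

// Elapsed ticks between two timestamps. Because the subtraction is done in
// unsigned 32-bit arithmetic, the result is correct even if the counter has
// wrapped past zero in between (as long as fewer than 2^32 ticks have passed).
std::uint32_t ticks_elapsed(std::uint32_t earlier, std::uint32_t later) {
    return later - earlier;
}

int main() {
    std::uint32_t start = g_ticks;   // timestamp taken just before rollover
    g_ticks += 120000;               // simulate 20 minutes passing (wraps past zero)
    std::printf("elapsed ticks: %u\n", ticks_elapsed(start, g_ticks));  // 120000
    return 0;
}

The same unsigned-subtraction trick is what makes the 10-minutes-before-rollover boot value meaningful: any code that compares raw tick values instead of subtracting them will misbehave within those first 10 minutes, where the test scripts can catch it.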
 
  • Like
Reactions: Ken g6

Red Squirrel

No Lifer
May 24, 2003
70,010
13,489
126
www.anyf.ca
Actually, that got me thinking: I wonder how the ATmega / Arduino handles that at the low level, since the MCU itself is 8-bit but some things are wider, like the ADCs, which are 10-bit. I think the ints are 16-bit.
 

Merad

Platinum Member
May 31, 2010
2,586
19
81
Actually, that got me thinking: I wonder how the ATmega / Arduino handles that at the low level, since the MCU itself is 8-bit but some things are wider, like the ADCs, which are 10-bit. I think the ints are 16-bit.

Not sure what specifically you're wondering about, but the Atmel chips support many 16-bit operations that work on pairs of registers. For example, they use 16-bit memory addresses, and there are a few register pairs set up specifically for handling (16-bit) pointers. IIRC the ADC result is read as a 16-bit value from a low/high register pair.
 

Red Squirrel

No Lifer
May 24, 2003
70,010
13,489
126
www.anyf.ca
I was thinking more about the clock. But I guess they don't use a Unix timestamp; I think it's just a counter from the point the board is powered up, and it probably doesn't really matter if it rolls over. You just need to account for it in the code, e.g. if you're using it for a timer/delay.
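
For the timer/delay case, a hedged Arduino-style sketch using the standard millis() call (an unsigned 32-bit millisecond counter on AVR that wraps after roughly 49.7 days); the interval and pin are just placeholders:

Code:
// millis() returns an unsigned long (32-bit on AVR) that rolls over roughly
// every 49.7 days. The pattern below stays correct across the rollover
// because the subtraction is done in unsigned arithmetic.

const unsigned long BLINK_INTERVAL_MS = 1000;  // illustrative interval
unsigned long lastToggle = 0;

void setup() {
    pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
    // Correct across rollover: (millis() - lastToggle) wraps the same way
    // millis() does. Writing "millis() >= lastToggle + BLINK_INTERVAL_MS"
    // instead would misfire when the counter wraps.
    if (millis() - lastToggle >= BLINK_INTERVAL_MS) {
        lastToggle = millis();
        digitalWrite(LED_BUILTIN, !digitalRead(LED_BUILTIN));
    }
}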
 
Last edited: