When coding anything to do with dates, is there a de facto standard that people are starting to use now to better prepare for the year 2038? It's far away, but not that far. That's when Unix time reaches the limit of a signed 32-bit integer, and a lot of date-related stuff uses Unix timestamps.
I'm thinking: is it best to start storing dates as a string instead of Unix time? Something like yyyy-mm-dd hh:mm:ss, with leading zeroes, so it still sorts properly in SQL queries. You'd still use Unix timestamps within date/time functions for now (such as getting the current time), but when the language comes up with a solution, your program's stored dates are unaffected and it's just a matter of implementing the new code in your own date/time class. Something like the sketch below is what I have in mind.
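To make that concrete, here's a rough PHP sketch of the idea, not a real design: get "now" however you like, but persist it as a zero-padded string and parse it back when you need date math. (PHP's DateTime class isn't itself limited to the 32-bit timestamp range, which is part of why I'm asking whether strings are the right move.)

```php
<?php
// Sketch: store times as zero-padded, sortable strings rather than
// raw Unix timestamps.

// Get "now" the usual way for the moment...
$now = new DateTime('now', new DateTimeZone('UTC'));

// ...but persist it as a string. 'Y-m-d H:i:s' zero-pads every field,
// so plain string comparison (and ORDER BY in SQL) sorts chronologically.
$stored = $now->format('Y-m-d H:i:s');   // e.g. "2038-01-19 03:14:08"

// When date math is needed later, parse the string back instead of
// touching a 32-bit timestamp.
$parsed = DateTime::createFromFormat(
    'Y-m-d H:i:s',
    $stored,
    new DateTimeZone('UTC')
);

echo $stored, "\n";
```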
I was coding a PHP program using Unix timestamps when it occurred to me that doing this is no longer future-proof. Is everyone just riding this train to destruction and waiting until the last minute, or are there actual efforts already in place to prepare for it?