Picture this: it’s January 19th, 2038, at exactly 03:14:07 UTC. Somewhere in a data center, a Unix system quietly ticks over its internal clock counter one more time. But instead of moving forward to the next second, the counter overflows, and as far as the machine is concerned the date snaps back to December 13th, 1901.
It’s a problem with the internal representation of a C/C++ type alias called time_t, mostly. That’s the thing that holds the number of seconds elapsed since midnight on Jan. 1, 1970, which is the most common low-level representation of date and time on computers. When time_t is a signed 32-bit integer, it runs out of room after 2,147,483,647 seconds, which lands exactly on that moment in January 2038. In theory, time_t could point to a 32-bit type even on a 64-bit system, but I don’t think anyone’s actually dumb enough to do that. It affects more than C/C++ code, because most programming languages end up calling C libraries once you go down enough levels.
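To make that concrete, here’s a minimal sketch (my own, not lifted from any particular system) of what the rollover looks like when the counter is a signed 32-bit integer. It assumes the machine you compile it on has a 64-bit time_t, so gmtime() can decode both sides of the wrap:

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

/* Decode a 32-bit "seconds since the 1970 epoch" counter as a UTC date. */
static void show(const char *label, int32_t seconds_since_epoch)
{
    time_t t = (time_t)seconds_since_epoch;  /* widen to the host's (64-bit) time_t */
    struct tm *utc = gmtime(&t);             /* may be NULL for negative values on some platforms */
    if (utc)
        printf("%s%s", label, asctime(utc));
}

int main(void)
{
    /* 2^31 - 1 seconds after the epoch: Tue Jan 19 03:14:07 2038 (UTC). */
    show("last representable second: ", INT32_MAX);

    /* One more tick overflows the signed 32-bit counter to -2^31,
     * which decodes as Fri Dec 13 20:45:52 1901 (UTC). */
    show("after the wrap:            ", INT32_MIN);
    return 0;
}
```

On a typical 64-bit Linux box this prints the 2038 date followed by the 1901 one, which is exactly the jump an affected 32-bit system would make in place.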
In other words, you can’t tell whether a given application is affected without looking at the code details, regardless of the bitness of the program or of the processor it’s running on.
Never underestimate human stupidity.
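For what it’s worth, checking your own build is cheap. This is just a sketch, and it says nothing about third-party binaries you didn’t compile, but it will tell you whether the code you’re building right now has a 64-bit time_t:

```c
#include <stdio.h>
#include <time.h>

int main(void)
{
    /* 8 bytes means a 64-bit time_t, so no wrap in 2038;
     * 4 bytes means this particular build is exposed. */
    printf("sizeof(time_t) = %zu bytes\n", sizeof(time_t));
    return 0;
}
```

If you’d rather fail the build than print a number, the same check works as a C11 static_assert on sizeof(time_t).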