Picture this: it’s January 19th, 2038, at exactly 03:14:07 UTC. Somewhere in a data center, a Unix system quietly ticks over its internal clock counter one more time. But instead of moving forward to 03:14:08, the signed 32-bit seconds counter overflows, and the system suddenly believes it’s December 13, 1901.
It’s a problem with the internal representation of a C/C++ type alias called time_t, mostly. That’s the thing that holds the number of elapsed seconds since midnight UTC on Jan. 1, 1970, which is the most common low-level representation of date and time on computers. In theory, time_t could still be a 32-bit type even on a 64-bit system, but I don’t think anyone’s actually dumb enough to do that. It affects more than C/C++ code because most programming languages end up calling C libraries once you go down enough levels.
In other words, you can’t tell whether a given application is affected without knowing the code details, regardless of the bitness of the program or of the processor it’s running on.
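If you’re curious what your own toolchain does, the quickest check is just to print sizeof(time_t) and see where a signed 32-bit count of seconds tops out. A rough sketch in plain standard C, nothing assumed beyond a working gmtime:

    /* Check how wide time_t is on this build, and show the last second
       a signed 32-bit counter can represent. */
    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void)
    {
        printf("sizeof(time_t) = %zu bytes (%zu bits)\n",
               sizeof(time_t), sizeof(time_t) * 8);

        time_t last = (time_t)INT32_MAX;   /* 2,147,483,647 seconds after the epoch */
        struct tm *utc = gmtime(&last);
        if (utc) {
            char buf[64];
            strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", utc);
            printf("last 32-bit second: %s\n", buf);   /* 2038-01-19 03:14:07 UTC */
        }
        return 0;
    }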
It depends only on whether the app and its OS/kernel interface use a 32-bit value to store the time information.
A 32-bit architecture or OS has nothing to do with this bug. For example, 16-bit architectures must have used 32-bit time too; otherwise they’d only have been able to count up to roughly 32–65 thousand seconds.
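To put rough numbers on the counter widths: 16 bits of seconds buys you somewhere between nine and eighteen hours, 31 bits buys you about 68 years (which is how you get from 1970 to 2038), and the moment a signed 32-bit counter overflows, the value reinterprets as late 1901. A throwaway sketch of that last part, assuming a 64-bit time_t and a gmtime that accepts pre-1970 values:

    /* Where the signed 32-bit range ends, and where the wrap lands. */
    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    static void show(const char *label, int64_t secs)
    {
        time_t t = (time_t)secs;          /* assumes 64-bit time_t on this build */
        struct tm *utc = gmtime(&t);
        char buf[64];
        if (utc && strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", utc))
            printf("%-24s %s\n", label, buf);
    }

    int main(void)
    {
        show("last valid second:", INT32_MAX);    /* 2038-01-19 03:14:07 UTC */
        show("after the wraparound:", INT32_MIN); /* 1901-12-13 20:45:52 UTC */
        return 0;
    }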
So this is only a problem for 32-bit apps on 32-bit processors?
No, those can use a 64-bit value as well; this is only a problem for applications that haven’t switched to using them.
Never underestimate human stupidity.
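For what it’s worth, “switching” on a 32-bit Linux target is mostly a build-time decision these days: glibc 2.34 and later let you opt into a 64-bit time_t with a pair of feature macros (musl 1.2+ does it unconditionally), though you still have to audit anything that baked 4-byte timestamps into file formats or wire protocols. A hedged sketch of the glibc opt-in plus a guard:

    /* Opting a 32-bit glibc (2.34+) build into 64-bit time_t:
     *
     *     cc -D_FILE_OFFSET_BITS=64 -D_TIME_BITS=64 app.c -o app
     *
     * and a compile-time guard so a narrow time_t can't sneak back in: */
    #include <time.h>

    _Static_assert(sizeof(time_t) >= 8,
                   "time_t narrower than 64 bits: this build is not Y2038-safe");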