The Linux kernel has had support for 64-bit time for years. On Debian, packages for the upcoming release were updated to 64-bit time earlier this year. I’m fairly sure the other distributions have done or are doing the same. So basically you now have 2 years to upgrade your OS and to pester the vendors of commercial software to do the same.
Like someone else said, it will be 2 very busy years, but we can survive this.
2038, not 2028. We have twelve years to fully migrate to 64-bit time.
404 brain not found. Sorry:)
500 Internal Brain Error
The real problem is not avg Joe devices but things like banking mainframes tbh
Nah, most of the banks solved this years ago already
So the Steam client will have to be updated to 64-bit before 2038.
Not really. 32-bit apps can use 64-bit values.
So this is only a problem for 32-bit apps on 32-bit processors?
It’s mostly a problem with the internal representation of a C/C++ type alias called time_t. That’s the thing that holds the number of elapsed seconds since midnight on Jan. 1, 1970, which is the most common low-level representation of date and time on computers. In theory, time_t could point to a 32-bit type even on a 64-bit system, but I don’t think anyone’s actually dumb enough to do that. It affects more than C/C++ code because most programming languages end up calling C libraries once you go down enough levels.
In other words, there’s no way to tell whether a given application is affected unless you know the code details, regardless of the bitness of the program and the processor it’s running on.
I don’t think anyone’s actually dumb enough to do that
Never underestimate human stupidity.
No, those can have 64-bit values as well; this is only a problem for applications that haven’t switched to using them.
It only depends whether the app and its OS/kernel interface use a 32-bit value to store the time information.
32-bit architecture or OS has nothing to do with this bug; for example, 16-bit architectures must’ve used 32-bit time too (otherwise they’d only be able to count up to 32,767 to 65,535 seconds, i.e. about 9 to 18 hours).
The problem doesn’t concern me as much as how bad we’ve become at maintaining shit that already works.
There is also the fact that during Y2K, we didn’t have as much reliance on computers.
There was also a worldwide effort to fix any potential problems before they happened.
COBOL mavens burned the candle at both ends and made bank, while making banks work.
Many were old enough to retire after that.
The 2038 issue will be easier to fix because many systems are already 64-bit, since 32-bit systems could only handle 4 GB of RAM and programs need more RAM than that.
The only issue would be critical systems that run on 32-bit and must be fixed before that date.
So, many banks and government agencies which still run on mainframes…
What do a 64-bit system and 4 GB of RAM have to do with using 64-bit timestamps?
32-bit systems can use 64-bit values without issue. In fact, even an 8-bit system can handle 256-bit values, or even longer ones, without issue.
The bitness of a CPU and its address space have nothing to do with the width of the data it can work on, unless the data ends up larger than available RAM (and even then there’s swap).
32-bit systems could only handle 4 GB of RAM
I don’t understand why people always say that. The Pentium Pro could handle 64 GB even though it was a 32-bit CPU; it had a 36-bit address bus. Later models are the same.
People say it because it was a Windows limitation, not a computing limitation. Windows Server had support for more, but for consumers it wasn’t easily doable. I believe there are modern workarounds, though. The real limit is how much memory a single application can address at any given time.
There is also the fact that during Y2K, we didn’t have as much reliance on computers.
And we still shouldn’t.
Lumping together the reliance on long-range electric connectivity (radio, the PSTN, though that now depends on computers too), the reliance on computers like mainframes, on microcontrollers, on personal computers like the Amiga 500, on fast encryption helped by computers, on computers mining cryptocoins or running some beefy LLMs, on computers capable of running Elite Dangerous, and on computers running devops clusters with hundreds of containers is wrong: these are all different things.
An analog PSTN switching station shouldn’t care about dates. A transceiver generally shouldn’t either. A microcontroller generally doesn’t care which year it is.
With an Amiga 500 one can find solutions, and it’s not too bad if you don’t.
The rest is honestly too architecturally fragile anyway and shouldn’t be relied upon as some perpetual service.
MACs will have their Y2K in 2040
HFS has this limitation, but it hasn’t been the default file system for several years now.
Did you mean Media Access Controllers, or macOS?
A 64-bit signed integer can represent timestamps far into the future (roughly 292 billion years, in fact), which should cover us until well after the heat death of the universe. Until we discover a solution for that fundamental physical limit, we should be fine.
Come on now, heat death will take far more than 10¹⁰⁰ years, not just 3×10¹¹. It’s not the point of the article, but get your facts right.
I’m not too worried about the year 2038 problem. I suspect it will be similar to Y2K, with a bit of overhyped panic, but with most stuff being patched beforehand to avoid issues.
Y2K wasn’t overhyped. It was just successfully planned for. This reeks of the paradox of IT. “Everything is broken, what do you even do” vs “nothing is broken, what do you even do?”
Yeah it only felt like it wasn’t a big deal because it became a big deal early enough for there to be plans made. And because good people doing hard work to prevent a problem wasn’t newsworthy after the fact.
That’s the thing though: It was well-prepared and due to that there was no big issue.
2038 is the same: very well prepared and thus it will not be a big issue.
Of course, if ignored, both would be very problematic, but that’s not the point.
I was at Pepsi for Y2K. In ’98, we started with MSMail, W95, and Netware2. We also had to replace all 40k desktops. We worked like dogs for those 2 years and only barely had everything ready in time. Without that work, we would not have been able to continue any business operations. Nothing about it was overhyped.