• ik5pvx@lemmy.world
    16 hours ago

    The Linux kernel has had support for 64-bit time for years. On Debian, packages for the upcoming release were updated to 64-bit time earlier this year. I’m fairly sure the other distributions have done or are doing the same. So basically you now have 2 years to upgrade your OS and to pester the vendors of commercial software to do the same.

    Like someone else said, it will be 2 very busy years, but we can survive this.

        • nyan@lemmy.cafe
          9 hours ago

          It’s a problem with the internal representation of a C/C++ type alias called time_t, mostly. That’s the value that holds the number of seconds elapsed since midnight UTC on Jan. 1, 1970, which is the most common low-level representation of date and time on computers. In theory, time_t could resolve to a 32-bit type even on a 64-bit system, but I don’t think anyone’s actually dumb enough to do that. It affects more than C/C++ code, because most programming languages end up calling C libraries once you go down enough levels.

          In other words, there’s no way to tell whether a given application is affected without knowing its code details, regardless of the bitness of the program or of the processor it’s running on.
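The cutoff itself is just epoch arithmetic: a signed 32-bit counter of seconds since 1970 runs out in January 2038. A minimal sketch (standard library only, not code from any particular application):

```python
from datetime import datetime, timezone

# The largest value a signed 32-bit time_t can hold.
MAX_INT32 = 2**31 - 1  # 2147483647

# One second later, a 32-bit counter wraps to a negative number,
# which naive code interprets as a date back in 1901.
last_moment = datetime.fromtimestamp(MAX_INT32, tz=timezone.utc)
print(last_moment)  # 2038-01-19 03:14:07+00:00
```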

          • youmaynotknow@lemmy.ml
            3 hours ago

            I don’t think anyone’s actually dumb enough to do that

            Never underestimate human stupidity.

        • InnerScientist@lemmy.world
          16 hours ago

          No, those can hold 64-bit values as well; this is only a problem for applications which haven’t switched to using them.

        • sgh@lemmy.ml
          16 hours ago

          It only depends on whether the app and its OS/kernel interface use a 32-bit value to store the time information.

          A 32-bit architecture or OS has nothing to do with this bug; for example, 16-bit architectures must’ve used 32-bit time too (otherwise they’d only have been able to count up to 32767–65535 seconds, i.e. roughly 9–18 hours).
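The point generalizes: the range of a time counter follows from the counter’s width alone, never from the CPU word size. A quick back-of-the-envelope check (the helper name is made up for illustration, assuming a signed counter):

```python
def last_second(bits: int) -> int:
    """Largest second count a signed integer of the given width can hold."""
    return 2**(bits - 1) - 1

print(last_second(16))  # 32767 seconds: a bit over 9 hours
print(last_second(32))  # 2147483647 seconds: runs out on 19 Jan 2038
print(last_second(64))  # about 9.2e18 seconds: hundreds of billions of years
```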

  • Grandwolf319@sh.itjust.works
    21 hours ago

    The problem doesn’t concern me as much as how bad we’ve become at maintaining shit that already works.

    There is also the fact that during Y2K, we didn’t have as much reliance on computers.

    • Zorque@lemmy.world
      19 hours ago

      There was also a worldwide effort to fix any potential problems before they happened.

      • SreudianFlip@sh.itjust.works
        18 hours ago

        Cobol mavens burned both ends of the candle and made bank, while making banks work.

        Many were old enough to retire after that.

      • MCasq_qsaCJ_234@lemmy.zip
        19 hours ago

        The 2038 issue will be easier to fix because many systems are already 64-bit: 32-bit systems could only handle 4 GB of RAM, and programs need more RAM than that.

        The only problem would be critical systems that still run 32-bit software and must be fixed before that date.

        • Bronzebeard@lemmy.zip
          8 hours ago

          The only problem would be critical systems that still run 32-bit software and must be fixed before that date.

          So, many banks and government agencies which still run on mainframes…

        • squaresinger@lemmy.world
          8 hours ago

          What do a 64-bit system and 4 GB of RAM have to do with using 64-bit timestamps?

          32-bit systems can use 64-bit values without issue. In fact, even an 8-bit system can handle 256-bit values, or even longer ones.

          The bitness of a CPU and its address space have nothing to do with the length of usable data, unless you end up with data larger than available RAM (and even then there’s swap).
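To make this concrete, here is roughly what a compiler does when it lowers 64-bit addition onto a 32-bit machine: split each value into 32-bit limbs and propagate the carry by hand. A sketch in Python for readability (real compilers emit add/add-with-carry instructions):

```python
MASK32 = 0xFFFFFFFF  # one 32-bit machine word

def add64_with_32bit_ops(a: int, b: int) -> int:
    """Add two 64-bit values using only 32-bit limb arithmetic,
    the way a compiler lowers 64-bit addition on a 32-bit CPU."""
    lo = (a & MASK32) + (b & MASK32)                  # low words; bit 32 is the carry-out
    hi = ((a >> 32) + (b >> 32) + (lo >> 32)) & MASK32  # high words plus carry-in
    return (hi << 32) | (lo & MASK32)

# A 32-bit time counter would wrap here; the 64-bit result does not.
print(add64_with_32bit_ops(2**31 - 1, 1))  # 2147483648
```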

        • setsubyou@lemmy.world
          17 hours ago

          32-bit systems could only handle 4 GB of RAM

          I don’t understand why people always say that. The Pentium Pro could handle 64 GB even though it was a 32-bit CPU: it had a 36-bit address bus, and later models are the same.

          • Flatfire@lemmy.ca
            16 hours ago

            People say it because it was a Windows limitation, not a computing limitation. Windows Server supported more, but for consumers it wasn’t easily doable. I believe there are modern workarounds, though. The real limit is how much memory a single application can address at any given time.
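The numbers behind this exchange, assuming byte addressing: a flat 32-bit address reaches 4 GiB, while the Pentium Pro’s 36-bit physical address bus (what later became PAE) reaches 64 GiB.

```python
GiB = 2**30  # bytes in one gibibyte

print(2**32 // GiB)  # 4 GiB reachable with 32-bit addresses
print(2**36 // GiB)  # 64 GiB reachable with 36-bit physical addresses (PAE)
```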

    • rottingleaf@lemmy.world
      12 hours ago

      There is also the fact that during Y2K, we didn’t have as much reliance on computers.

      And we still shouldn’t.

      Lumping together the reliance on long-range electric connectivity (radio, PSTN, though that now depends on computers too), the reliance on big computers like mainframes, on microcontrollers, on personal computers like the Amiga 500, on computer-assisted fast encryption, on computers mining cryptocoins or running beefy LLMs, on computers capable of running Elite Dangerous, and on computers capable of running devops clusters with hundreds of containers, is wrong: these are all different things.

      An analog PSTN switching station shouldn’t care about dates. A transceiver generally shouldn’t either. A microcontroller generally doesn’t care which year it is.

      With an Amiga 500 one can find solutions, and it’s not too bad if you don’t.

      The rest is honestly too architecturally fragile anyway and shouldn’t be relied upon as some perpetual service.

  • LostXOR@fedia.io
    19 hours ago

    A 64-bit signed integer can represent timestamps far into the future—roughly 292 billion years in fact, which should cover us until well after the heat death of the universe. Until we discover a solution for that fundamental physical limit, we should be fine.

    Come on now: heat death will take on the order of 10¹⁰⁰ years, far more than 3×10¹¹. It’s not the point of the article, but get your facts right.
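The 292-billion-year figure itself does check out, though. With a signed 64-bit counter of seconds:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year, ~3.156e7 seconds

max_time64 = 2**63 - 1                 # largest signed 64-bit second count
years = max_time64 / SECONDS_PER_YEAR
print(f"{years:.2e} years")            # ~2.92e+11, i.e. about 292 billion years
```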

    I’m not too worried about the year 2038 problem. I suspect it will be similar to Y2K, with a bit of overhyped panic, but with most stuff being patched beforehand to avoid issues.

    • KairuByte@lemmy.dbzer0.com
      18 hours ago

      Y2K wasn’t overhyped. It was just successfully planned for. This reeks of the paradox of IT: “everything is broken, what do you even do?” vs. “nothing is broken, what do you even do?”

      • captainastronaut@seattlelunarsociety.org
        9 hours ago

        Yeah it only felt like it wasn’t a big deal because it became a big deal early enough for there to be plans made. And because good people doing hard work to prevent a problem wasn’t newsworthy after the fact.

      • squaresinger@lemmy.world
        9 hours ago

        That’s the thing though: It was well-prepared and due to that there was no big issue.

        2038 is the same: very well prepared and thus it will not be a big issue.

        Of course, if ignored, both would be very problematic, but that’s not the point.

    • Xaphanos@lemmy.world
      19 hours ago

      I was at Pepsi for Y2K. In 98, we started with MSMail, W95, and Netware2. We had to also replace all 40k desktops. We worked like dogs for those 2 years and only barely had everything ready in time. Without that work, we would not have been able to continue any business operations. Nothing about it was overhyped.