Some thoughts on how useful Anubis really is. Combined with comments I've read elsewhere about scrapers starting to solve the challenges, I'm afraid Anubis will be outdated soon and we'll need something else.

  • rtxn@lemmy.world · 18 days ago

    The current version of Anubis was made as a quick “good enough” solution to an emergency. The article is very enthusiastic about explaining why it shouldn’t work, but completely glosses over the fact that it has worked, at least to an extent where deploying it and maybe inconveniencing some users is preferable to having the entire web server choked out by a flood of indiscriminate scraper requests.

    The purpose is to reduce the flood to a manageable level, not to block every single scraper request.
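    For context, the scheme is a browser-side proof-of-work: the server hands out a challenge, and the client must find a nonce whose hash clears a difficulty threshold before it gets a pass. A minimal sketch of the idea in Python (the real Anubis challenge format differs; the function names here are illustrative):

```python
import hashlib
import secrets

def solve(challenge: str, difficulty_bits: int) -> int:
    """Brute-force a nonce until sha256(challenge:nonce) falls below the
    target -- roughly the work an Anubis-style PoW asks of the browser."""
    target = 1 << (256 - difficulty_bits)  # hashes below this value pass
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

def verify(challenge: str, nonce: int, difficulty_bits: int) -> bool:
    """Server-side check: a single hash, however hard solving was."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

challenge = secrets.token_hex(8)
nonce = solve(challenge, 12)  # ~2^12 hash attempts on average
assert verify(challenge, nonce, 12)
```

    Solving costs about 2^difficulty hashes on average while verifying costs one, which is the whole point: cheap for the server, and expensive for a scraper repeating it across thousands of pages.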

    • poVoq@slrpnk.net · 18 days ago

      And it was/is for sure the lesser evil compared to what most others did: put the site behind Cloudflare.

      I feel like people who complain about Anubis have never had their server overheat and shut down on an almost daily basis because of AI scrapers 🤦

      • tofu@lemmy.nocturnal.garden (OP) · 18 days ago

        Yeah, I’m just wondering what’s going to follow. I just hope everything isn’t going to need to go behind an authwall.

      • daniskarma@lemmy.dbzer0.com · 17 days ago

        I still think captchas are a better solution.

        To get past them, scrapers have to run AI inference, which also comes with compute costs. But for legitimate users, you aren't running unauthorized compute-intensive tasks on their hardware.

        • poVoq@slrpnk.net · 17 days ago

          They are much worse for accessibility, and they also take longer to solve and are more disruptive for the majority of users.

          • daniskarma@lemmy.dbzer0.com · 17 days ago

            Anubis is worse for privacy, since you have to have JavaScript enabled. And it's worse for the environment, since the cryptographic proof-of-work challenges are just wasted compute.

            Also, reCaptcha-style challenges aren't really that disruptive most of the time.

            As I said, the polite thing would be to give users the option. Anubis's PoW running automatically just for entering a website is one of the rudest pieces of software I've seen lately. It should be more polite: maybe the user could choose between solving a captcha and running the Anubis PoW, or at least the PoW could run only after the user clicks a button.

            I don't think it's good practice to run that type of software just for entering a website. If that tendency were to grow, browsers would need to adapt and outright block that behavior, for example by only allowing access to certain client resources after a user action.

            • poVoq@slrpnk.net · 17 days ago

              Are you seriously complaining about an (entirely false) negative privacy aspect of Anubis, and then suggesting that reCaptcha from Google is better?

              Look, no one thinks Anubis is great, but often the choice is that or the website becoming entirely inaccessible because it is DDoSed to death by the AI scrapers.

      • moseschrute@crust.piefed.social · 16 days ago

        Out of curiosity, what's the issue with Cloudflare? Aside from the constant worry that they may strong-arm you into their enterprise pricing if your site is too popular, lol. I understand supporting open source, but why not let companies handle the expensive bits as long as they're willing?

        I guess I can answer my own question. If the point of the Fediverse is to remove single points of failure, then Cloudflare could become a single point to take down the network. Still, we could always pivot away from those types of services later, right?

        • Limonene@lemmy.world · 16 days ago

          Cloudflare has IP banned me before for no reason (no proxy, no VPN, residential ISP with no bot traffic). They’ve switched their captcha system a few times, and some years it’s easy, some years it’s impossible.

          • interdimensionalmeme@lemmy.ml · 18 days ago

            What CPU made after 2004 do you have that doesn't have automatic temperature control?
            I don't think there is any, unless you somehow managed to disable it?
            Even a Raspberry Pi without a heatsink won't overheat to the point of shutdown.

            • poVoq@slrpnk.net · 18 days ago

              You are right, it's actually worse: it usually just overloads the CPU so badly that it starts to throttle, and then I can't even access the server via SSH anymore. But sometimes it also crashes the server so that it reboots, and yes, that can happen on modern CPUs as well.

      • mobotsar@sh.itjust.works · 18 days ago

        Is there a reason other than avoiding infrastructure centralization not to put a web server behind cloudflare?

        • poVoq@slrpnk.net · 18 days ago

          Yes, because Cloudflare routinely blocks entire IP ranges and puts people into endless captcha loops. It also snoops on all traffic and collects a lot of metadata about all your site visitors. And if you let them terminate TLS, they can even analyse the passwords that people use to log into the services you run. It's basically a huge surveillance dragnet and probably a front for the NSA.

        • Björn Tantau@swg-empire.de · 18 days ago

          Cloudflare would need your TLS keys, so they could read all the content you worked so hard to encrypt. If I wanted to do bad shit, I would apply at Cloudflare.

          • mobotsar@sh.itjust.works · 18 days ago

            Maybe I’m misunderstanding what “behind cloudflare” means in this context, but I have a couple of my sites proxied through cloudflare, and they definitely don’t have my keys.

            I wouldn’t think using a cloudflare captcha would require such a thing either.

            • StarkZarn@infosec.pub · 18 days ago

              That’s because they just terminate TLS at their end. Your DNS record is “poisoned” by the orange cloud and their infrastructure answers for you. They happen to have a trusted root CA so they just present one of their own certificates with a SAN that matches your domain and your browser trusts it. Bingo, TLS termination at CF servers. They have it in cleartext then and just re-encrypt it with your origin server if you enforce TLS, but at that point it’s meaningless.
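              The browser-side name check that makes this work is simple: any certificate chaining to a trusted root whose SAN matches the hostname is accepted, no matter who actually holds the private key. A simplified sketch of that matching rule (real validation per RFC 6125 has more cases):

```python
def san_matches(san: str, hostname: str) -> bool:
    """Simplified SAN/hostname match: an exact name matches itself, and a
    wildcard like *.example.com matches exactly one label on the left."""
    san_labels = san.lower().split(".")
    host_labels = hostname.lower().split(".")
    if len(san_labels) != len(host_labels):
        return False  # wildcard covers a single label, not whole subtrees
    head, *rest = san_labels
    if head != "*" and head != host_labels[0]:
        return False
    return rest == host_labels[1:]
```

              So when Cloudflare presents its own certificate with a SAN for your domain, the check above passes and the browser has no way to tell the key isn't yours.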

            • Björn Tantau@swg-empire.de · 18 days ago

              Hmm, I should look up how that works.

              Edit: https://developers.cloudflare.com/ssl/origin-configuration/ssl-modes/#custom-ssltls

              They don’t need your keys because they have their own CA. No way I’d use them.

              Edit 2: And with their own DNS they could easily route any address through their own servers if they wanted to, without anyone noticing. They are entirely too powerful. Is there some way to prevent this?

    • AnUnusualRelic@lemmy.world · 18 days ago

      The problem is that the purpose of Anubis was to make crawling more computationally expensive and that crawlers are apparently increasingly prepared to accept that additional cost. One option would be to pile some required cycles on top of what’s currently asked, but it’s a balancing act before it starts to really be an annoyance for the meat popsicle users.