I got into the self-hosting scene this year when I wanted to start up my own website on an old recycled ThinkPad. A lot of time was spent learning about ufw, reverse proxies, security header hardening, and fail2ban.

Despite all that, I still had a problem with bots knocking on my ports and spamming my logs. I tried some hackery getting fail2ban to read Caddy logs, but that didn’t work for me. I nearly gave up and went with Cloudflare like half the internet does, but my stubbornness about open-source self-hosting and the recent Cloudflare outages this year encouraged me to try alternatives.

Coinciding with that, I kept running into this thing in the places I frequent, like Codeberg: Anubis, a proxy-style firewall that forces the browser client to complete a proof-of-work check, plus some other clever tricks to stop bots from knocking. I got interested and started thinking about beefing up security.

I’m here to tell you to try it if you have a public-facing site and want to break away from Cloudflare. It was VERY easy to install and configure with a Caddyfile on a Debian system using systemctl. Within an hour it had filtered multiple bots, and so far the knocks seem to have slowed down.

https://anubis.techaro.lol/

My botspam woes have been seriously mitigated, if not completely eradicated. I’m very happy with tonight’s little security upgrade project, which took no more than an hour of my time to install and read through the documentation. The current chain is Caddy reverse proxy -> Anubis -> services.
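
For reference, here is a minimal sketch of what that chain can look like with the native install. The port and upstream are placeholders rather than guaranteed defaults, and the variable names are from memory of the native-install docs linked below, so verify them there:

    # Caddyfile (sketch): Caddy terminates TLS and hands every request to Anubis
    example.com {
        reverse_proxy localhost:8923
    }

    # Anubis environment file (sketch): Anubis listens on BIND and forwards
    # traffic that passes its checks to TARGET, the real service
    BIND=:8923
    TARGET=http://localhost:3000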

A good place to start for the install is here:

https://anubis.techaro.lol/docs/admin/native-install/

  • termaxima@slrpnk.net · 12 hours ago

    I am very annoyed that I have to enable cloudflare’s JavaScript on so many websites, I would much prefer if more of them used Anubis so I didn’t have third-party JavaScript running as often.

    ( coming from an annoying user who tries to enable the fewest things possible in NoScript )

  • drkt@lemmy.dbzer0.com · 1 day ago

    Stop playing whack-a-mole with these fucking people and build TARPITS!

    Make it HURT to crawl your site illegitimately.

  • sudoer777@lemmy.ml · 1 day ago

    I host my main server on my own hardware, and a VPN on Hetzner because my shitty ISP doesn’t let me port forward. For the past year, bots were hitting my Forgejo instance hard. I forgot to disable registration, and they generated hundreds of accounts with hundreds of repos with sketchy links, generating terrabytes of traffic through my VPS and costing me money. I disabled registration and deleted the spam, but bots still kept hitting my server for several months, which caused memory leaks over time, crashed it, consumed CPU, and still cost me money with terrabytes of traffic per month. A few weeks ago, I put Anubis on the VPS. Now zero bots hit my Forgejo instance and I don’t pay for their traffic anymore. Problem solved.
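
    In case it helps anyone else: the registration hole itself is a one-line setting in the forge config. A sketch of the relevant Forgejo/Gitea-style app.ini section (key name from memory, so double-check your version’s docs):

    # app.ini (sketch)
    [service]
    DISABLE_REGISTRATION = true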

    • LOLseas@lemmy.zip · 12 hours ago

      This is the first time I’ve ever seen it misspelled like that. It’s ‘terabyte/terabytes’. 1,024 GBs worth of data.

    • Jason2357@lemmy.ca · 1 day ago

      It’s always code forges and wikis that are affected by this, because the scrapers spider down into every commit or edit in your entire history, then come back the next day and check every “page” again to see if anything changed. Consider just blocking pages that are commit history at your reverse proxy.
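
      For example, something along these lines in a Caddyfile (a sketch only: the path patterns are guesses at a typical forge URL layout, and blocking them also hides history pages from legitimate visitors):

      # Inside your site block: refuse crawler-heavy history/diff URLs
      # before they ever reach the forge
      @history path_regexp /(commits?|blame|compare)/
      respond @history 403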

  • daniskarma@lemmy.dbzer0.com · 1 day ago

    I don’t think you have a use case for Anubis.

    Anubis is mainly aimed at bad AI scrapers, plus some DDoS mitigation if you run a heavy service.

    You are getting hit exactly the same; Anubis doesn’t put up a block list or anything, it just puts itself in front of the service. The load on your server and the risk you take are very similar with or without Anubis here. Most bots are not AI scrapers, they are just probing, so the hit on your server is the same.

    What you want is to properly set up fail2ban or, even better, CrowdSec. That would actually block and ban bots that try to probe your server.

    If you are just self-hosting, the only thing Anubis does for you is divert the log noise into Anubis’s logs and make your devices do a PoW every once in a while when you want to use your services.

    To be honest, I don’t know what you are self-hosting, but unless it’s something that’s going to get DDoSed or AI-scraped, there’s not much point in Anubis.

    Also, Anubis is not a substitute for fail2ban or CrowdSec. You need something to detect and ban brute-force attacks; otherwise an attacker would only need to solve the Anubis challenge once, get the token for the week, and then be free to attack your services as they like.
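
    As a reference point, a minimal fail2ban jail can look roughly like this (a sketch only: the jail name, filter, and log path are placeholders, and you still need a matching filter definition for whatever your reverse proxy’s access log looks like):

    # /etc/fail2ban/jail.local (sketch)
    [caddy-probes]
    enabled  = true
    port     = http,https
    filter   = caddy-probes
    logpath  = /var/log/caddy/access.log
    maxretry = 5
    findtime = 600
    bantime  = 3600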

  • smh@slrpnk.net · 1 day ago

    The creator is active on a professional slack I’m on and they’re lovely and receptive to user feedback. Their tool is very popular in the online archives/cultural heritage scene (we combine small budgets and juicy, juicy data).

    My site has enabled js-free screening when the site load is low, under the theory that if the site load is too high then no one’s getting in anyway.

  • quick_snail@feddit.nl · 1 day ago

    Kinda sucks how it makes websites inaccessible to folks who have to disable JavaScript for security.

    • poVoq@slrpnk.net · 1 day ago

      It kinda sucks how AI scrapers make websites inaccessible to everyone 🙄

          • quick_snail@feddit.nl · 1 day ago

            Lol, I’m the sysadmin for many sites that don’t have these issues, so obviously I do…

            If you’re the one who thinks you need this trash PoW fronting a static site, then clearly you’re the one who is ignorant.

            • poVoq@slrpnk.net · 1 day ago

              Obviously I don’t think you need Anubis for a static site. And if that is what your admin experience is limited to, then you have a strong case of Dunning-Kruger.

    • WhyJiffie@sh.itjust.works · 1 day ago

      There’s a fork that has non-JS checks. I don’t remember the name, but maybe that’s what should be made more known.

      • quick_snail@feddit.nl · 1 day ago

        Please share if you know.

        The only way I know how to do this is running a Tor onion service, since the Tor protocol has built-in PoW support (without JS).

        • WhyJiffie@sh.itjust.works · 1 day ago

          It’s this one: https://git.gammaspectra.live/git/go-away

          The project name is a bit unfortunate to show to users; maybe change that if you use it.

          Some known privacy services use it too, including the Invidious instance at nadeko.net, so you can check there how it works. It’s one of the most popular Invidious servers, so I guess it can’t be bad, and they use multiple kinds of checks for each visitor.

    • url@feddit.fr · 2 days ago

      Did I forget to mention it doesn’t work without JS, which I keep disabled?

  • non_burglar@lemmy.world · 2 days ago

    Anubis is an elegant solution to the AI bot scraper issue; I just wish the solution to everything wasn’t spending compute everywhere. In a world where we need to rethink our energy consumption and generation, even on clients, this is a stupid use of computing power.

    • Leon@pawb.social · 2 days ago

      It also doesn’t function without JavaScript. If you’re security- or privacy-conscious, chances are not zero that you have JS disabled, in which case this presents a roadblock.

      On the flip side of things, if you are a creator and you’d prefer to not make use of JS (there’s dozens of us) then forcing people to go through a JS “security check” feels kind of shit. The alternative is to just take the hammering, and that feels just as bad.

      No hate on Anubis. Quite the opposite, really. It just sucks that we need it.

      • SmokeyDope@piefed.social (OP) · 2 days ago

        There’s a challenge option that doesn’t require JavaScript. The responsibility lies on site owners to configure it properly, IMO, though you can make the argument that it’s not the default, I guess.

        https://anubis.techaro.lol/docs/admin/configuration/challenges/metarefresh

        From docs on Meta Refresh Method

        Meta Refresh (No JavaScript)

        The metarefresh challenge sends a browser a much simpler challenge that makes it refresh the page after a set period of time. This enables clients to pass challenges without executing JavaScript.

        To use it in your Anubis configuration:

        # Generic catchall rule
        - name: generic-browser
          user_agent_regex: >-
            Mozilla|Opera
          action: CHALLENGE
          challenge:
            difficulty: 1 # Number of seconds to wait before refreshing the page
            algorithm: metarefresh # Specify a non-JS challenge method
        

        This is not enabled by default while this method is tested and its false positive rate is ascertained. Many modern scrapers use headless Google Chrome, so this will have a much higher false positive rate.

        • z3rOR0ne@lemmy.ml · 2 days ago

          Yeah, I actually use the NoScript extension, and I refuse to whitelist sites unless I’m very certain I trust them.

          I run into Anubis checks all the time, and while I appreciate the software, having to temporarily whitelist these sites again and again does get cumbersome. I hope they make this no-JS implementation the default soon.

          • Prathas@lemmy.zip · 1 day ago

            Wait, you keep temporarily allowing them over and over again? Why temporary?

            Sincerely,
            Another NoScript fan

            • z3rOR0ne@lemmy.ml · 1 day ago

              Most of the Anubis encounters I have are on redlib instances that get shuffled around, go down all the time, and are generally more ephemeral than other sites. Because I use another extension called LibRedirect to shuffle which redlib instance I visit when clicking a Reddit link, I don’t bother whitelisting them permanently.

              I’ve already solved this on my desktop by self-hosting my own redlib instance on localhost and using LibRedirect to point there, but on my phone I still do the whole no-JS temporary-unblock dance with random redlib instances. Eventually I plan on using WireGuard to host a private redlib instance on a VPS so I don’t have to deal with this.

              This is a weird case, I know, but it’s honestly not that bad.

      • quick_snail@feddit.nl · 1 day ago

        This is why we need these sites to have .onions. Tor Browser has a PoW that doesn’t require JS.

      • cecilkorik@piefed.ca · 2 days ago

        if you are a creator and you’d prefer to not make use of JS (there’s dozens of us) then forcing people to go through a JS “security check” feels kind of shit. The alternative is to just take the hammering, and that feels just as bad.

        I’m with you here. I come from an older time on the Internet. I’m not much of a creator, but I do have websites, and unlike many self-hosters I think, in the spirit of the internet, they should be open to the public as a matter of principle, not cowering away for my own private use behind some encrypted VPN. I want it to be shared. Sometimes that means taking a hammering. It’s fine. It’s nothing that’s going to end the world if it goes down or goes away, and I try not to make a habit of being so irritating that anyone would have much legitimate reason to target me.

        I don’t like any of these sort of protections that put the burden onto legitimate users. I get that’s the reality we live in, but I reject that reality, and substitute my own. I understand that some people need to be able to block that sort of traffic to be able to limit and justify the very real costs of providing services for free on the Internet and Anubis does its job for that. But I’m not one of those people. It has yet to cost me a cent above what I have already decided to pay, and until it does, I have the freedom to adhere to my principles on this.

        To paraphrase another great movie: why should any legitimate user be inconvenienced when the bots are the ones who suck? I refuse to punish the wrong party.

      • Nate Cox@programming.dev · 2 days ago

        I feel comfortable hating on Anubis for this. The compute cost per validation is vanishingly small to someone with the existing budget to run a cloud scraping farm, it’s just another cost of doing business.

        The cost to actual users though, particularly to lower income segments who may not have compute power to spare, is annoyingly large. There are plenty of complaints out there about Anubis being painfully slow on old or underpowered devices.

        Some of us do actually prefer to use the internet minus JS, too.

        Plus the minor irritation of having anime catgirls suddenly be a part of my daily browsing.

    • cadekat@pawb.social · 2 days ago

      Scarcity is what powers this type of challenge: you have to prove you spent a certain amount of electricity in exchange for access to the site, and because electricity isn’t free, this imposes a dollar cost on bots.

      You could skip the detour through hashes/electricity and do something with a proof-of-stake cryptocurrency, and just pay for access. The site owner actually gets compensated instead of burning dead dinosaurs.

      Obviously there are practical roadblocks to this today that a JavaScript proof-of-work challenge doesn’t face, but longer term…

      • daniskarma@lemmy.dbzer0.com · 15 hours ago

        I think the issue is that many sites are too aggressive with it. Anubis can be configured to only ask for challenges when the site is under unusual load, for instance when a botnet is actually DDoSing the site. That’s when it shines.

        Making it constantly ask for challenges when the service is not under attack is just a massive waste of energy. And many sites enable it constantly because they can divert bot pings out of their logs that way. That’s, for instance, what OP is doing. It’s just a big misunderstanding of the tool.

      • artyom@piefed.social · 1 day ago

        You could skip the detour through hashes/electricity and do something with a proof-of-stake cryptocurrency, and just pay for access. The site owner actually gets compensated instead of burning dead dinosaurs.

        Maybe if the act of transferring crypto didn’t use a comparable or greater amount of energy…

        • cadekat@pawb.social · 1 day ago

          That’s why I specified a proof-of-stake cryptocurrency. They use so much less energy that it is practically negligible in comparison, and more on the order of traditional online transactions.

      • Nate Cox@programming.dev · 2 days ago

        The cost here only really impacts regular users, too. The type of users you actually want to block have budgets which easily allow for the compute needed anyways.

        • chicken@lemmy.dbzer0.com · 2 days ago

          I think maybe they wouldn’t if they are trying to scale their operations to scanning through millions of sites and your site is just one of them

          • cadekat@pawb.social · 2 days ago

            Yeah, exactly. A regular user isn’t going to notice an extra few cents on their electricity bill (boiling water costs more), but a data centre certainly will when you scale up.
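
            Back-of-envelope, with made-up but plausible numbers: if one challenge costs about a second of CPU at roughly 15 W, that’s around 0.004 Wh, a rounding error on a household bill. A scraper re-solving that challenge across 10 million pages a day would burn on the order of 40 kWh per day on challenges alone, plus the wall-clock time, which does start to look like a real line item.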

    • quick_snail@feddit.nl · 1 day ago

      Great article, but I disagree about WAFs.

      Try securing a nonprofit’s web infrastructure as the lone IT guy with no budget for devs or security.

      It would be nice if we could update servers constantly and patch unmaintained code, but sometimes you just need to front it with something that plugs those holes until you have the capacity to do updates.

      But 100% the WAF should be run locally, not as a MITM from an evil US corp in bed with DHS.

  • sudo@programming.dev · 2 days ago

    I’ve repeatedly stated this before: Proof of Work bot management is only Proof of JavaScript bot management. It is nothing for a headless browser to bypass. Proof of JavaScript does work and will stop the vast majority of bot traffic; that’s how Anubis actually works. You don’t need to punish actual users by abusing their CPU. POW is a far higher cost on your actual users than the bots.

    Last I checked, Anubis has a JavaScript-less strategy called “Meta Refresh”. It first serves you a blank HTML page with a <meta> tag instructing the browser to refresh and load the real page. I highly advise using the Meta Refresh strategy. It should be the default.
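
    Conceptually, the interstitial is just something like this (illustrative markup only, not Anubis’s actual page):

    <!DOCTYPE html>
    <html>
      <head>
        <!-- refresh after ~1 second, carrying whatever token the server expects -->
        <meta http-equiv="refresh" content="1; url=/?challenge-response=...">
      </head>
      <body>Checking your browser...</body>
    </html>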

    I’m glad someone is finally making an open-source and self-hostable bot-management solution. And I don’t give a shit about the cat-girls, nor should you. But Techaro admitted they had little idea what they were doing when they started and went for the “nuclear option”. Fuck Proof of Work. It was a dead-on-arrival idea decades ago. Techaro should strip it from Anubis.

    I haven’t caught up with what’s new with Anubis, but if they want to get stricter bot-management, they should check for actual graphics acceleration.

    • rtxn@lemmy.world · 2 days ago

      POW is a far higher cost on your actual users than the bots.

      That sentence tells me that you either don’t understand or consciously ignore the purpose of Anubis. It’s not to punish the scrapers, or to block access to the website’s content. It is to reduce the load on the web server when it is flooded by scraper requests. Bots running headless Chrome can easily solve the challenge, but every second a client is working on the challenge is a second that the web server doesn’t have to waste CPU cycles on serving clankers.

      POW is an inconvenience to users. The flood of scrapers is an existential threat to independent websites. And there is a simple fact that you conveniently ignored: it fucking works.

      • sudo@programming.dev · 1 day ago

        It’s like you didn’t understand anything I said. Anubis does work; I said it works. But it works because most AI crawlers don’t have a headless browser to solve the PoW. To operate efficiently at the high volume required, they use raw HTTP requests; the vast majority are probably using the basic Python requests module.

        You don’t need PoW to throttle general access to your site, and that’s not the fundamental assumption of PoW anyway. PoW assumes (incorrectly) that bots won’t pay the extra FLOPS to scrape the website. But bots are paid to scrape the website; users aren’t. They’ll just scale horizontally and open more parallel connections. They have the money.

        • poVoq@slrpnk.net · 1 day ago

          You are arguing a strawman. Anubis works because most AI scrapers (currently) don’t want to spend extra on running headless Chromium, and because it slightly incentivises AI scrapers to correctly identify themselves as such.

          Most of the AI scraping is frankly just shoddy code written by careless people who don’t intend to DDoS the independent web but can’t be bothered to actually fix that on their side.

          • sudo@programming.dev · 1 day ago

            You are arguing a strawman. Anubis works because most AI scrapers (currently) don’t want to spend extra on running headless Chromium

            WTF, that’s what I already said? That was my entire point from the start!? You don’t need PoW to force headless usage. Any JavaScript challenge will suffice. I even said the Meta Refresh challenge Anubis provides is sufficient, and explicitly recommended it.

            • poVoq@slrpnk.net · 1 day ago

              And how do you actually check for working JS in a way that can’t be easily spoofed? Hint: PoW is a good way to do that.

              Meta refresh is a downgrade in usability for everyone but a tiny minority that has disabled JS.

                • sudo@programming.dev · 1 day ago

                And how do you actually check for working JS in a way that can’t be easily spoofed? Hint: PoW is a good way to do that.

                Accessing the browser’s APIs in any way is far harder to spoof than some hashing. I already suggested checking whether the browser has graphics acceleration; that would filter out the vast majority of headless browsers too. PoW is just math and is easy to spoof without running any JavaScript. You can even do it faster than real JavaScript users with something like Rust or C.
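
                Something along these lines, as a sketch (illustrative only, not anything Anubis actually ships):

                // One way a challenge page could test for real graphics acceleration.
                function looksHardwareAccelerated(): boolean {
                  const canvas = document.createElement("canvas");
                  const gl = canvas.getContext("webgl");
                  if (!gl) return false; // no WebGL at all: very likely a minimal headless setup
                  const info = gl.getExtension("WEBGL_debug_renderer_info");
                  const renderer = info
                    ? String(gl.getParameter(info.UNMASKED_RENDERER_WEBGL))
                    : "";
                  // Software renderers (SwiftShader, llvmpipe) are typical of headless browsers
                  return !/swiftshader|llvmpipe|software/i.test(renderer);
                }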

                Meta refresh is a downgrade in usability for everyone but a tiny minority that has disabled JS.

                What are you talking about? It just refreshes the page without doing any of the extra computation that PoW does. What extra burden does it put on users?

                  • poVoq@slrpnk.net · 1 day ago

                  If you check for a GPU (not generally a bad idea), you will have the same people who currently complain about JS complaining about this breaking with their anti-fingerprinting browser addons.

                  But no, you can’t spoof PoW, obviously; that’s the entire point of it. Whether you do the calculation in JavaScript or not doesn’t really matter for it to work.

                  In its current shape Anubis has zero impact on usability for 99% of the site visitors, not so with meta refresh.

    • SmokeyDope@piefed.social (OP) · 2 days ago

      Something that hasn’t been mentioned much in discussions about Anubis is that it has a graded tier system for how sketchy a client is, changing the kind of challenge based on a weighted priority system.

      The default bot policies it comes with pass squeaky-clean regular clients straight through, give only slightly weighted clients/IPs the metarefresh challenge, and only kick in the JavaScript proof of work at the moderate-suspicion level. The bot policy and weight triggers for these levels, the challenge action, and the duration of a client’s validity are all configurable.

      It seems to me that the sites that heavy-hand the proof of work for every client, with validity that only lasts five minutes, are the ones giving Anubis a bad rap. The default bot policy settings Anubis comes with don’t trigger PoW on the regular Firefox Android clients I’ve tried, including hardened IronFox, while other sites show the finger wag on every connection no matter what.

      It’s understandable why some choose strict policies, but they give the impression that this is the only way it should be done, which is overkill. I’m glad there are config options to mitigate the impact on the normal user experience.

      • sudo@programming.dev · 1 day ago

        it has a graded tier system for how sketchy a client is, changing the kind of challenge based on a weighted priority system.

        Last I checked that was just User-Agent regexes and IP lists. But that’s where Anubis should continue development, and hopefully they’ve improved since. Discerning real users from bots is how you do proper bot management. Not imposing a flat tax on all connections.

  • 0_o7@lemmy.dbzer0.com · 2 days ago

    I don’t mind Anubis but the challenge page shouldn’t really load an image. It’s wasting extra bandwidth for nothing.

    Just parse the challenge and move on.

      • Voroxpete@sh.itjust.works · 2 days ago

        It’s actually a brilliant monetization model. If you want to use it as is, it’s free, even for large corporate clients.

        If you want to get rid of the puppygirls though, that’s when you have to pay.

        (The absolute Chads at the UN left the puppygirls in, and I have to respect that.)

        • frongt@lemmy.zip · 1 day ago

          It’s open source, so you could always just patch it without paying too. But you should support the maintainers if you think they deserve it.

    • Kilgore Trout@feddit.it · 2 days ago

      It’s a palette of 10 colours. I would guess it uses an indexed colorspace, reducing the size to a minimum.
      edit: 28 KB on disk

      • CameronDev@programming.dev · 2 days ago

        An HTTP GET request is a few hundred bytes; the response is 28 KB. That’s roughly 280x amplification. If a large botnet wanted to deny service to an Anubis-protected site, requesting that image could be enough.

        Ideally, Anubis should serve as little data as possible until the PoW is completed. Caching the PoW algorithm (and the image) on a CDN would also mitigate the issue.

        • teolan@lemmy.world · 2 days ago

          The whole point of Anubis is to not have to go through a CDN to withstand scraping botnets.

          • CameronDev@programming.dev · 2 days ago

            I dunno if that is true; nothing in the docs indicates that it is explicitly anti-CDN. And using a CDN for a static JavaScript resource and an image isn’t the same as running the entire site through a CDN proxy.

  • Appoxo@lemmy.dbzer0.com · 1 day ago

    Maybe you know the answer to my question:
    If I wanted to use an app that doesn’t run in a web browser (e.g. the native Jellyfin app), how would that work? Does it still work then?

    • chaospatterns@lemmy.world · 1 day ago

      If the app is just a WebView wrapper around the application, then the challenge page would load and try to be evaluated.

      If it’s a native Android/iOS app, then it probably wouldn’t work because the app would try to make HTTP API calls and get back something unexpected.

      • Appoxo@lemmy.dbzer0.com · 17 hours ago

        Authelia already broke the functionality for Jellyfin and Symfonium.
        So I guess the answer is no.

    • SmokeyDope@piefed.social (OP) · 1 day ago

      It explicitly checks for web-browser properties to apply challenges, and all of its challenges require basic web functionality like a page refresh. Unless the connection to your server involves a client that presents a user-agent string and behaves like a browser, it won’t work. I think that’s how it is, anyway. Hope this helped.

      • Appoxo@lemmy.dbzer0.com · 1 day ago

        Assuming what you said is correct, it wouldn’t help my use case.
        I’m not hosting any page meant for public consumption anyway, so it’s not really important.
        But thanks for answering :)

    • merc@sh.itjust.works · 2 days ago

      The front page of the website is excellent. It describes what it does and lays out its feature set in quick, simple terms.

      I can’t tell you how many times I’ve gone to a website for some open-source software and had no idea what it was or what it was trying to do. They often dive deep into the 300 different ways of installing it and tell you what the current version is and what features it has over the last one, but they just assume you know the basics.

    • Cyberflunk@lemmy.world · 2 days ago

      Thank you! This needed to be said.

      • This post is a bit critical of a small well-intentioned project, so I felt obliged to email the maintainer to discuss it before posting it online. I didn’t hear back.

      I used to watch the dev on Mastodon; they seemed pretty radicalized about killing AI, and anyone who uses it (kidding!!). I’m not even surprised you didn’t hear back.

      Great take on the software, and as far as I can tell, Playwright still works and completes the unit of work. At scale Anubis still seems to work if you have popular content, but it hasn’t stopped me from using Claude Code + virtual browsers.

      I’m not actively testing it, though. I’m probably wrong about a few things, but I know Anubis isn’t hindering my personal scraping; it does fuck up the Perplexity and ChatGPT bots, which is fun to see.

      Good luck, blue team!