• Lvxferre@mander.xyz · 127 points · 9 months ago (edited)

    Good old honeytrap. I’m not sure, but I think that it’s doable.

    Have a honeytrap page somewhere on your website. Make sure that legit users won’t access it, and disallow crawling it through robots.txt.
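
    Something like this, with /honeytrap/ as a made-up example path:

    ```
    # robots.txt: any crawler that respects the standard will skip this path
    User-agent: *
    Disallow: /honeytrap/
    ```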

    Then, if some crawler still accesses it, you could record+ban it as you said… or you could be even nastier and let it keep going. Fill the honeytrap page with poison: nonsensical text that looks like something a human would write.
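
    A minimal sketch of the record+ban+poison idea, assuming a Flask server; the path and filename are placeholders:

    ```python
    # Hypothetical honeytrap endpoint: anything ignoring robots.txt and
    # fetching /honeytrap/ gets its IP logged and a page of word salad back.
    import random
    from flask import Flask, request

    app = Flask(__name__)

    WORDS = ("lantern", "quantum", "parsley", "observe", "granite", "softly",
             "vigorous", "binder", "seven", "hence", "would", "the")

    def word_salad(sentences=50):
        # Cheap to generate, looks vaguely human, worthless as training data.
        lines = []
        for _ in range(sentences):
            s = " ".join(random.choices(WORDS, k=random.randint(6, 14)))
            lines.append(s.capitalize() + ".")
        return " ".join(lines)

    @app.route("/honeytrap/")
    def honeytrap():
        with open("banlist.txt", "a") as f:  # record the offender's IP
            f.write(request.remote_addr + "\n")
        return word_salad()                  # ...and poison it anyway
    ```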

    • CosmicTurtle@lemmy.world · 59 points · 9 months ago

      I think I used to do something similar with email spam traps. Not sure if it’s still around, but basically you could help build NaCL lists by posting an email address on your website that was visible in the source code but not to normal users, like in a div positioned way off the left side of the screen.
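
      Something like this, with the address and styling just for illustration:

      ```html
      <!-- Invisible to visitors, but plainly there for anything
           that regexes the page source for email addresses. -->
      <div style="position: absolute; left: -9999px;">
        Contact: spamtrap@example.com
      </div>
      ```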

      Anyway, spammers that do regular expression searches for email addresses would email it and get their IPs added to naughty lists.

      I’d love to see something similar with robots.

      • Lvxferre@mander.xyz · 32 points · 9 months ago (edited)

        Yup, it’s the same approach as email spam traps, except for the naughty list… and holy fuck, a shareable bot IP list would be an amazing addition. It would increase the damage to those web-crawling businesses.
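
        As a sketch of how the sharing could work (the URL and filenames are placeholders, not a real service):

        ```python
        # Merge your honeytrap's catches with a community-shared bot IP list.
        import urllib.request

        def local_ips(path="banlist.txt"):
            with open(path) as f:
                return {line.strip() for line in f if line.strip()}

        def shared_ips(url="https://example.com/shared-banlist.txt"):
            with urllib.request.urlopen(url, timeout=10) as resp:
                lines = resp.read().decode().splitlines()
                return {line.strip() for line in lines if line.strip()}

        def merged():
            # Block anything in either list; publish the union back.
            return sorted(local_ips() | shared_ips())
        ```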

        • Nighed@sffa.community · 12 points · 9 months ago

          But with all of the cloud resources now, you can rotate through IP addresses without any trouble. Hell, you could just browse over IPv6 and not even worry, given how cheap those addresses are!

          • Lvxferre@mander.xyz · 12 points · 9 months ago

            Yeah, that throws a monkey wrench into the idea. That’s a shame, because “either respect robots.txt or you’re denied access to a lot of websites!” is appealing.

    • KairuByte@lemmy.dbzer0.com · 9 points · 9 months ago

      I’m the idiot human that digs through robots.txt and the site map to see things that aren’t normally accessible by an end user.

      • Lvxferre@mander.xyz · 6 points · 9 months ago

        For banning: I’m not sure, but I don’t think so. It seems to me that prefetching behaviour is dictated by the page linking to another one, so to avoid any issue, all the site owner needs to do is not prefetch links to the honeytrap.

        For poisoning: I’m fairly certain that it doesn’t matter. At most you’d prefetch a page full of rubbish.
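
        For context, classic link prefetching is opt-in per page: the browser only prefetches what the linking page explicitly declares, so the trap is never fetched by accident. A sketch with made-up paths:

        ```html
        <!-- The browser only prefetches what the page declares: -->
        <link rel="prefetch" href="/articles/next-page/">

        <!-- A plain link carries no prefetch hint, so a visitor's
             browser won't fetch the target on its own. -->
        <a href="/honeytrap/">never declare a prefetch for this</a>
        ```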