I just started using this myself, seems pretty great so far!

Clearly it doesn’t stop all AI crawlers, but it does stop a significant chunk of them.

  • Daniel Quinn · 14 hours ago

    It’s a rather brilliant idea, really, but when you consider the environmental implications of forcing every web request to complete a proof of work before it gets served, this effectively burns more coal for every site that implements it.
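    To be clear about what that forced work actually looks like: the general shape is a hashcash-style puzzle, where the server hands the browser a random string and the browser grinds through nonces until a hash comes out small enough. A minimal Python sketch of the client side (illustrative only, with a made-up difficulty, not the exact scheme any particular tool uses):

    ```python
    import hashlib
    import itertools

    def solve_challenge(challenge: str, difficulty_bits: int) -> int:
        """Find a nonce so SHA-256(challenge + nonce) has `difficulty_bits`
        leading zero bits. This is the CPU time every visitor burns before
        the page is served."""
        target = 1 << (256 - difficulty_bits)  # hash value must fall below this
        for nonce in itertools.count():
            digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
            if int.from_bytes(digest, "big") < target:
                return nonce

    # A 20-bit challenge takes roughly 2**20 (about a million) hashes on average.
    print(solve_challenge("example-challenge", 20))
    ```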

    • zutto@lemmy.fedi.zutto.fi (OP) · 13 hours ago

      You have a point here.

      But when you consider the current world’s web traffic, this isn’t actually the case today. For example, the GNOME project, which was forced to start using this on their GitLab, found that 97% of their traffic could not complete this PoW calculation.

      I.e. they now need only a fraction of the computational cost to serve their GitLab, which saves a lot of resources, coal, and, most importantly, the time of hundreds of real humans (see the sketch at the end of this comment).

      (Source for numbers)

      Hopefully in the future we can move back to proper netiquette and just a plain old robots.txt file!
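      To make the asymmetry concrete: checking a solution is a single hash for the server, while producing one costs the client on the order of 2^difficulty hashes, and a crawler that never solves the challenge never reaches the expensive backend at all. A minimal sketch of the verification side, matching the client sketch above (the difficulty value is made up for illustration):

      ```python
      import hashlib

      DIFFICULTY_BITS = 20  # illustrative value, not from any real deployment

      def verify_solution(challenge: str, nonce: int,
                          difficulty_bits: int = DIFFICULTY_BITS) -> bool:
          """One SHA-256 call: cheap for the server regardless of how much
          work the client spent finding the nonce."""
          digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
          return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))

      # Client cost: ~2**DIFFICULTY_BITS hashes per page load on average;
      # server cost: one hash, and no backend work at all for bots that give up.
      ```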

    • marauding_gibberish142@lemmy.dbzer0.com · 10 hours ago

      I don’t think AI companies care, and I wholeheartedly support any and all FOSS projects using PoW when serving their websites. I’d rather have that than have them go down.