I just started using this myself, seems pretty great so far!
Clearly doesn’t stop all AI crawlers, but a significantly large chunk of them.
It’s a rather brilliant idea really, but when you consider the environmental implications of forcing web requests to perform proof of work before being served, this effectively burns more coal for every site that implements it.
You have a point here.
But when you consider the current state of the world’s web traffic, that isn’t actually the case today. For example, the GNOME project, which was forced to start using this on their GitLab, found that 97% of their traffic could not complete the PoW calculation.
I.e. they now need only a fraction of the computational cost to serve their GitLab, which saves a lot of resources, coal, and most importantly, the time of hundreds of real humans.
(Source for numbers)
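For anyone curious, this is roughly how a hashcash-style proof-of-work challenge works (a minimal sketch of the general idea, not the exact scheme any particular tool uses). The asymmetry is the whole point: the client burns CPU searching for a nonce, while the server verifies with a single hash.

```python
import hashlib
import itertools

def meets_difficulty(digest: bytes, difficulty_bits: int) -> bool:
    """Check whether the hash has at least `difficulty_bits` leading zero bits."""
    value = int.from_bytes(digest, "big")
    return value >> (len(digest) * 8 - difficulty_bits) == 0

def solve_challenge(challenge: str, difficulty_bits: int = 16) -> int:
    """Client side: brute-force a nonce so sha256(challenge + nonce) meets the difficulty."""
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
        if meets_difficulty(digest, difficulty_bits):
            return nonce

def verify(challenge: str, nonce: int, difficulty_bits: int = 16) -> bool:
    """Server side: one cheap hash to check the client's work."""
    digest = hashlib.sha256(f"{challenge}{nonce}".encode()).digest()
    return meets_difficulty(digest, difficulty_bits)

# The server hands out a per-request challenge token, the client spends CPU
# finding a nonce, and the server verifies it in constant time.
nonce = solve_challenge("example-challenge-token")
assert verify("example-challenge-token", nonce)
```

Crawlers that won’t (or can’t) run the client-side solver simply never get served, which is where the 97% figure above comes from.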
Hopefully in the future we can move back to proper netiquette and just a plain old robots.txt file!
I don’t think AI companies care, and I wholeheartedly support any and all FOSS projects using PoW when serving their websites. I’d rather have that than have them go down.