

3 days ago
@pfr @nikodindon That assumes it won’t get worse, which I hope is the case. AI companies have forced me to take down web stuff that I had been running for almost two decades, because their scrapers are so aggressive.
he/him. from the birdsite (@Andres4NY and before that @NEGreenways).
#Dad #NYC #Bikes #FreeTransit #SafeStreets #BanCars #Debian #FreeSoftware #ACAB #Vegetarian #WearAMask
My wife’s an #epidemiologist, so you’ll get some #COVID talk too.
Trans rights are human rights.


@meldrik They’re impossible to block based on IP ranges alone. That’s why all the FOSS git forges and bug trackers have started using tools like Anubis. But yes, I initially tried to block them (this was before Anubis existed).
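For context, the core idea behind tools like Anubis is a proof-of-work gate: the server issues a challenge that the client must burn CPU to solve before content is served, which filters out scrapers that don’t run JavaScript and taxes the ones that do. Here’s a minimal sketch of that idea; the function names and difficulty value are illustrative assumptions, not Anubis’s actual protocol:

```python
import hashlib
import os

# Illustrative difficulty: number of leading zero hex digits the hash
# must have (an assumed value, not what any real deployment uses).
DIFFICULTY = 4

def make_challenge() -> str:
    """Server side: issue a random challenge string."""
    return os.urandom(16).hex()

def verify(challenge: str, nonce: int) -> bool:
    """Server side: one cheap hash proves the client did the work."""
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).hexdigest()
    return digest.startswith("0" * DIFFICULTY)

def solve(challenge: str) -> int:
    """Client side: brute-force a nonce that satisfies the target.
    Expected cost is ~16**DIFFICULTY hash attempts per request."""
    nonce = 0
    while not verify(challenge, nonce):
        nonce += 1
    return nonce

if __name__ == "__main__":
    challenge = make_challenge()
    nonce = solve(challenge)
    print(verify(challenge, nonce))  # True
```

The asymmetry is the point: verification is a single hash for the server, while each request costs the client tens of thousands of attempts on average, which is negligible for a human’s browser but expensive for a scraper hammering every URL.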
It was a few things that I had to take down: a gitweb instance with some of my own repos, for example, and a personal photo gallery. The scrapers would do pathological things like repeatedly running search queries for random email addresses or strings.