• baltakatei@sopuli.xyz · 1 day ago

    Explains why my personal blog, wiki, and git repo keep getting hammered by hordes of AI company scrapers. If AI were intelligent, they’d download a single snapshot every month or so and share it. But no, eight different scrapers using thousands of different IP addresses (to evade my fail2ban measures) each have to follow every single blame and diff link, when a simple git clone would get them the hundreds of megabytes of content in one go.
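    For scale, one mirror clone fetches every ref and the complete object history, i.e. everything those blame and diff pages are rendered from. A minimal sketch, assuming git is on PATH and using a placeholder URL rather than my actual repo:

        import subprocess

        # A single clone pulls the full history in one request stream,
        # instead of one HTTP hit per blame/diff page.
        # The URL below is hypothetical, for illustration only.
        subprocess.run(
            ["git", "clone", "--mirror", "https://example.com/repo.git"],
            check=True,
        )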

    They are getting better, though: more of the hits now go to RecentChanges on my wiki, so some optimization seems to be happening. But I refuse to raise my operating costs beyond a few USD/month to serve AI bots when I know barely any humans visit.