I protect my forgejo instance from AI Web Crawlers

3 points, posted 9 hours ago
by todsacerdoti

1 comment

immibis

8 hours ago

My issue with Gitea (which Forgejo is a fork of) was that crawlers would hit the "download repository as zip" link over and over. Each access creates a new zip file on disk which is never cleaned up. I disabled that (by setting the temporary zip directory to read-only, so the feature won't work) and haven't had a problem since then.
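The workaround described above can be sketched as a few shell commands. This is a minimal sketch under assumptions: the archive directory path varies by Gitea/Forgejo version and configuration (check `APP_DATA_PATH` and the `[repository]` section of `app.ini`), so a throwaway directory stands in for it here.

```shell
# Stand-in for the real archive/temp-zip directory, e.g. data/repo-archive
# under the Gitea data path (assumed name; verify against your app.ini).
ARCHIVE_DIR="$(mktemp -d)"

# Clean out any zip files crawlers already generated; Gitea never
# removes them on its own, so they accumulate on disk.
find "$ARCHIVE_DIR" -type f -delete

# Drop write permission so new archives cannot be created: the
# "download repository as zip" link then fails instead of filling the disk.
chmod a-w "$ARCHIVE_DIR"
```

Note that this disables the zip-download feature entirely for all users, not just crawlers, and a Gitea upgrade or config change may move the directory, so the permissions would need to be reapplied.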

It's easy to assume "I received a lot of requests, therefore the problem is too many requests," but a server can successfully handle many requests; the real problem is what each request leaves behind.

This is a clever way of doing a minimally invasive botwall though - I like it.