Ugh, apparently a bot visited my Forgejo instance yesterday and queried everything, which caused Forgejo to create repo archives for every repository. The Git data on the instance is only 2.1 GB, but the repo-archives directory grew to 120 GB and filled up everything. I really didn’t expect such a spike.
That meant that it filled up the whole hard drive and the server and all the services and websites on it went down while I was sleeping.
Luckily it seems that just deleting that directory fixes the problem temporarily. I also disabled downloading archives from the UI, but I’m not sure whether that will prevent bots from generating those archives again. I also can’t just make the directory read-only, because Forgejo uses it for other things like mirroring too.
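Edit: if I’m reading the config cheat sheet right, there’s an archive-cleanup cron task (inherited from Gitea) that should at least keep the directory from growing forever. Something like this in `app.ini` (untested, key names from memory, double-check against your version’s docs):

```ini
; Periodically delete old repo archives instead of keeping them forever.
; Section and key names may differ slightly between versions.
[cron.archive_cleanup]
ENABLED = true
SCHEDULE = @midnight
; Delete archives older than this; a short value keeps the cache small.
OLDER_THAN = 1h
```

It doesn’t stop bots from generating archives, but it should stop them from accumulating.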
For small instances like mine those archives are quite a headache.
Anubis is usually installed in such a case.
I need to look into it, thanks!
Yeah, I had put no protection in front of mine, after noticing bots were scanning code and did grab emails. Using Anubis for now, still looking at other alternatives.
Not saying this is an option for you, only that I kept my forgejo instance private to avoid dealing with this AI crawler bullshit. I hope you find a good solution.
I was just about to install Gitea. Any substantial differences between the two?
I don’t know the specifics, but Forgejo is a Gitea fork. There was/is some controversy around Gitea’s governance and a movement towards prioritizing closed-source paid/private versions of Gitea.
Again, I don’t know the details, just very broad strokes. I chose Forgejo because it’s under active FOSS development, and I didn’t want to go with Gitea and then have to abandon it later for whatever reason might develop.
Yeah I understand, but the whole point of me hosting my instance was to make my code public.
Have you looked at Codeberg?
Codeberg is an instance of Forgejo. I run my own instance because I don’t want to be dependent on others.
And I totally understand that. These AI crawlers really suck.
I appreciate that you make your stuff public. I can’t find the specific repos right now but I know I’ve referenced your code for various fediverse things I’ve dabbled in over the last year or so.
You should limit the amount of storage available to a single service.
Also, set up Anubis or restrict access
Yeah, I really need to figure out how to do quotas per service.
If you have a Linux server, you can try partitioning your drive using LVM. You can prevent services from consuming all disk space by giving each one their own logical volume.
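A rough sketch of what that looks like (the volume-group name `vg0` and the mount point are just example names, adjust to your setup):

```shell
# Create a dedicated 10 GB logical volume just for Forgejo's data.
lvcreate --size 10G --name forgejo-data vg0
mkfs.ext4 /dev/vg0/forgejo-data
mkdir -p /srv/forgejo
mount /dev/vg0/forgejo-data /srv/forgejo

# Add it to /etc/fstab so the mount survives reboots.
echo '/dev/vg0/forgejo-data /srv/forgejo ext4 defaults 0 2' >> /etc/fstab
```

If Forgejo then fills its volume, only /srv/forgejo runs out of space; the other services keep their own room.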
I already have LVM but I was using it to combine drives. But it’s not a bad idea, if I can’t do it with Docker, at least that would be a different solution.
Does it not require an account for that? If it doesn’t, I would open a feature request, because otherwise it creates a denial-of-service vector.
It does not, because that feature is usually used for scripts to download some specific release archive, etc. and other git hosting solutions do the same.
Are you using anything to defend against bots?
I have nothing against bots per se, they help to spread the word about my open source code which I want to share with others.
It’s just unfortunate that Forgejo fills up the hard drive to such an extent and doesn’t quite let you disable this archive feature.
Are you saying if someone (such as a scraper) tries to download a snapshot, forgejo makes a disk file containing the snapshot, sends it, and keeps it around forever? That sounds crazy to me and I’d open a bug or try to fix it.
It makes a zip file and a tarball, and keeps them cached for other people to download in the future.
Ok so it’s just a matter of limiting the cache size.
There is no setting like that, at least I can’t find it.
I think you should open a Forgejo issue requesting a cache size limit option. It seems like quite a big problem that bots can fill up your hard drive like this unless you limit all data used by Forgejo; for single-user instances, you probably only want to limit the archive size, or the size of any data the public can create, not the size of your own repos.
Ok, there was one issue already and I added my comment to it: https://codeberg.org/forgejo/forgejo/issues/7011#issuecomment-7022288
I used Cloudflare’s CAPTCHA equivalent and my bot traffic dropped to zero.
But then how do people who search for code like yours find your open-source code, if not through a search engine, which uses an indexing bot?
I’ve searched the docs a bit and found this setting: https://forgejo.org/docs/latest/admin/config-cheat-sheet/#quota-subjects-list
It seems to partially fit your case, though I don’t see artifacts in the list, but you could limit all of Forgejo to something like 5 GB and probably be good.
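Going from the cheat sheet, the minimal version would look something like this (I haven’t run it myself, so check the exact keys for your version):

```ini
[quota]
ENABLED = true

[quota.default]
; Default total quota applied to every user, e.g. 5 GiB.
; Finer-grained rules per subject (size:repos:all etc.) seem to be
; managed through the admin API rather than app.ini.
TOTAL = 5GiB
```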
Hm, I’m afraid none of them really seems to cover the repo-archives case, and therefore I’m afraid size:all doesn’t include the repo archives either.
But I’m running it in a container, perhaps I can limit the size the container gets assigned.
It kinda seems like it. Docker apparently does have this functionality as seen here: https://stackoverflow.com/questions/40494536/how-to-specify-the-size-of-a-shared-docker-volume/40499023#40499023
You could try limiting it to 5 GB using the Forgejo settings and 7 GB using Docker, and then just check how big it gets.
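One caveat with the Docker route: a plain named volume on the default `local` driver can’t be size-limited; the tricks in that Stack Overflow answer rely either on tmpfs (RAM-backed and not persistent, so unsuitable for git data) or on a storage driver backed by a filesystem with quotas, like XFS. A tmpfs example just to show the compose syntax (not something I’d store repos on):

```yaml
# docker-compose sketch: a size-capped volume via tmpfs.
# NOT persistent -- contents vanish on restart, so this only
# illustrates the syntax. Persistent size caps need e.g. XFS
# with project quotas, or a dedicated LVM volume bind-mounted in.
volumes:
  forgejo-archives:
    driver: local
    driver_opts:
      type: tmpfs
      device: tmpfs
      o: size=7g
```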
I “fixed” the problem of those fucking bots by blocking everyone except my country.
Sadly that’s not a solution for my problem. The whole point of open-sourcing, for me, is to make my code accessible to as many people as possible.
A few days late, but I have a pretty similar usecase to you on https://forgejo.ellis.link/. My solution is go-away, https://git.gammaspectra.live/git/go-away, which just sits as a reverse proxy in between traefik and Forgejo. I haven’t enabled fancy stuff like TLS fingerprinting. It’s been effective enough at killing the bots downloading archives and DDoSing the server from residential IPs. My config is based on the example Forgejo config, but with a few tweaks. Too long to post here, though, so message me if you need access
For now, disabling archives plus my simple list of bots to drop in Nginx seems to work very well: the archives aren’t being created anymore, and the load on the server went down too.
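In case it’s useful to anyone, the bot list is just an Nginx `map` on the User-Agent plus a 403 (the agent names below are examples, not my full list; the `map` block goes in the `http` context):

```nginx
# Match common AI-crawler user agents (example names, extend as needed).
map $http_user_agent $block_bot {
    default                                   0;
    ~*(GPTBot|ClaudeBot|Bytespider|Amazonbot) 1;
}

server {
    # ... existing listen/server_name/proxy config ...

    if ($block_bot) {
        return 403;
    }

    # Belt and braces: refuse archive downloads entirely.
    location ~ ^/.+/.+/archive/ {
        return 404;
    }
}
```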