mirror of
https://github.com/fosrl/pangolin.git
synced 2026-05-08 13:49:15 -05:00
[GH-ISSUE #436] Uploading Files to Home Server Causing Pangolin VPS to Explode in Size #1449
Originally created by @kylepyke on GitHub (Mar 30, 2025).
Original GitHub issue: https://github.com/fosrl/pangolin/issues/436
I'm having an issue where uploading files to my home server (e.g. Immich) through my Pangolin VPS causes the VPS hard drive to fill up. It looks like the /var/lib/docker/overlay2 folder balloons?
On the VPS, I'm only hosting services related to Pangolin: Traefik, Gerbil, and CrowdSec.
Thanks!
@oschwartz10612 commented on GitHub (Apr 1, 2025):
Hello!
It sounds like one of the containers running on your Pangolin VPS (likely Traefik, or potentially CrowdSec) might be logging excessive data or temporarily buffering the uploads within its Docker layer, causing the /var/lib/docker/overlay2 growth. You could try checking the logs for each container (docker logs <container_name>) during an upload to see if one is generating massive output. Also review the Traefik and CrowdSec configurations to ensure log levels aren't set to debug and that there isn't unexpected buffering enabled.
@kylepyke commented on GitHub (Apr 1, 2025):
Thanks! I will check the configs for log level, though 40GB+ of logs seems like a lot. I am intrigued by a buffering issue, though. Where would I check those settings in Traefik and CrowdSec?
@leodr99 commented on GitHub (Apr 6, 2025):
Had the same issue some time ago on an unrelated project.
Check the current log usage of the containers:
sudo du -h $(docker inspect --format='{{.LogPath}}' $(docker ps -qa))
If, like me, you see some gigantic, absurd sizes, it's best to limit the logging of each container, either per container or globally.
To keep the Docker log files in check globally, edit or create /etc/docker/daemon.json and limit them by adding:
{ "log-opts": { "max-size": "10m", "max-file": "3" } }
(Note: the Docker daemon needs to be restarted for this to take effect.)
If you go the per-container route, add the logging options on each service stanza instead, then recreate the container/stack.
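The per-container route mentioned above can be sketched in a Compose file like this (the logging key and json-file driver options are standard Docker Compose; the service name and image tag here are illustrative):

```yaml
services:
  traefik:                 # illustrative service name
    image: traefik:v3.3    # illustrative image tag
    logging:
      driver: json-file    # Docker's default logging driver
      options:
        max-size: "10m"    # rotate once a log file reaches 10 MB
        max-file: "3"      # keep at most 3 rotated files per container
```

After editing, recreate the stack (e.g. docker compose up -d --force-recreate) so the new logging options apply.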
my 2cents...
@kylepyke commented on GitHub (Apr 6, 2025):
Wow, that's super helpful, thanks!
@kylepyke commented on GitHub (Apr 6, 2025):
I have figured out the issue, but I don't have a solution! @miloschwartz, maybe you have an answer? It's not logs causing the issue; it's Traefik over-using RAM and then memory swapping.
When I upload files from Immich, Traefik fills up my 2GB of VPS RAM pretty quickly, particularly with big video files, then starts memory swapping and filling up my hard drive. Would limiting memory usage and swappiness solve this, or would it just crash Traefik?
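For reference, capping a container's memory in Compose looks roughly like this (a sketch; the service name and limits are illustrative, and whether Traefik degrades gracefully or just gets OOM-killed at the cap is exactly the open question here):

```yaml
services:
  traefik:
    image: traefik:v3.3    # illustrative image tag
    mem_limit: 512m        # hard cap on container RAM
    memswap_limit: 512m    # same value as mem_limit disables swap for the container
```

Setting memswap_limit equal to mem_limit prevents the container from spilling into swap, which trades the disk-filling behavior for a potential OOM kill of the process.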
@kylepyke commented on GitHub (Apr 6, 2025):
So this is an Immich issue. Since Immich doesn't support chunked uploads (yet), Traefik will eat all of your RAM and then swap, eating your HDD. More info here: https://github.com/immich-app/immich/pull/2101 and here: https://www.reddit.com/r/immich/comments/1fhvv1i/chunk_upload/
Leaving this here for the next person who finds out the hard way.
@miloschwartz commented on GitHub (Apr 6, 2025):
This is good to know. Thanks for the update / finding this.
@TuncTaylan commented on GitHub (Apr 10, 2025):
I think it is premature to close this issue, because other applications may also be affected by this.
For example, MinIO (web UI) causes similar issues when uploading bigger files.
There need to be options, or documentation, for the configs around buffering sizes and caches. It should be possible to write directly to disk instead of caching in memory.
Edit: With enough RAM I'm getting the following log:
crowdsec | time="2025-04-10T16:28:59Z" level=warning msg="Disrupting transaction with body size above the configured limit (Action Reject)" band=inband chain_rule_id=3231508716 name=myAppSecComponent runner_uuid=1111e24a-bc7f-4862-9709-b69e66382a5a tx_id=ed578e15-f17c-4166-9172-75816b4dff66 type=appsec
The question is: where is that configuration?
@miloschwartz commented on GitHub (Apr 10, 2025):
I agree it would be good to get some more information on this. You might want to check out Traefik's and CrowdSec's documentation, as it's going to be the best resource for these tools.
@kylepyke commented on GitHub (Apr 10, 2025):
FYI, I upgraded the RAM on my VPS to 4GB, and still cannot upload a 2.5G video file through Immich. That should be plenty of memory... I feel like something else is going on.
@TuncTaylan commented on GitHub (Apr 11, 2025):
I'm currently sampling more data around it. The warning message came from this line:
ad40dcbb05/internal/corazawaf/transaction.go (L841), a component of Coraza (the WAF implementation used by the crowdsec-bouncer-traefik-plugin), and the required setting is this:
ad40dcbb05/coraza.conf-recommended (L40)
I have to find out how to set this within the Pangolin + CrowdSec setup.
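For context, the Coraza recommended config referenced above controls this behavior with ModSecurity-style directives along these lines (a sketch; check coraza.conf-recommended itself for the actual default values):

```apacheconf
# Maximum request body size Coraza will buffer for inspection, in bytes
SecRequestBodyLimit 13107200

# What to do when the limit is exceeded: Reject disrupts the transaction
# (matching the "Disrupting transaction ... (Action Reject)" warning above),
# while ProcessPartial inspects only the first SecRequestBodyLimit bytes
SecRequestBodyLimitAction Reject
```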
So far I've found these:
I have to test it though.
Edit: The ideal way would be writing the buffers directly to disk in order to stay within the RAM budget of 1GB (the recommended VPS size!).
Edit 2: maxRequestBodyBytes and memRequestBodyBytes didn't do anything: https://doc.traefik.io/traefik/middlewares/http/buffering/
Edit 3: The culprit is definitely CrowdSec, or some AppSec plugin of it, because if I disable crowdsecAppsecEnabled in Traefik's dynamic_settings.yaml I can upload 1GB and 10GB files without any issues.
So I have to experiment with the crowdsecAppsecBodyLimit setting here: https://plugins.traefik.io/plugins/6335346ca4caa9ddeffda116/crowdsec-bouncer-traefik-plugin
Edit 4: Even with the adjustment to crowdsecAppsecBodyLimit, Traefik gets OOM'ed, so I give up and go into the weekend.
@TuncTaylan commented on GitHub (Apr 11, 2025):
I think I got this.
The configuration we need, crowdsecAppsecBodyLimit, was first implemented in version v1.4.0 of the crowdsec-bouncer-traefik-plugin. The newest version is v1.4.2, so I had to update the version first; after that I could upload 1GB and even 10GB files without an issue (well, read along).
Update the plugin to the currently newest version, v1.4.2, in the file config/traefik/traefik_config.yml.
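The version bump in traefik_config.yml would look roughly like this, using Traefik's standard experimental-plugin stanza (a sketch; the plugin key name may differ in Pangolin's generated config):

```yaml
experimental:
  plugins:
    crowdsec:    # key name is illustrative
      moduleName: "github.com/maxlerebourg/crowdsec-bouncer-traefik-plugin"
      version: "v1.4.2"    # >= v1.4.0 is needed for crowdsecAppsecBodyLimit
```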
After that, the default value of the needed parameter crowdsecAppsecBodyLimit is already 10MB, but it can be adjusted in the file config/traefik/dynamic_config.yml.
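The body limit is then set on the plugin middleware in the dynamic configuration, roughly like this (a sketch; the middleware name and surrounding options are illustrative, while the crowdsecAppsecEnabled and crowdsecAppsecBodyLimit option names come from the plugin page linked earlier in the thread):

```yaml
http:
  middlewares:
    crowdsec:    # middleware name is illustrative
      plugin:
        crowdsec:
          enabled: true
          crowdsecAppsecEnabled: true
          crowdsecAppsecBodyLimit: 10485760    # 10 MiB, the plugin default
```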
The current issue is that for every file greater than 10MB (the default value, or whatever is set above) there is an error message:
level=error msg="Failed to process request body"
which I'm guessing works as designed, since the body isn't transmitted to the CrowdSec AppSec server. And then there is this note:
/!\ AppSec maximum body limit is defaulted to 10MB. Be careful when you upgrade to >1.4.x.
I'll have the pull request soon.
Edit: Here is the PR https://github.com/fosrl/pangolin/pull/515#issue-2989648287
@kylepyke commented on GitHub (Apr 14, 2025):
Hmmm... This didn't work for me. I'm experiencing the same issues. Pangolin v1.2, Traefik v3.3.5, CrowdSec v1.4.2
@TuncTaylan commented on GitHub (Apr 14, 2025):
Can you share your configs (both of traefik configs)?
@kylepyke commented on GitHub (Apr 14, 2025):
Docker Compose:
dynamic_config.yml:
traefik_config.yml:
@kylepyke commented on GitHub (Apr 14, 2025):
Immich running on separate home server, docker-compose:
@TuncTaylan commented on GitHub (Apr 15, 2025):
Did you also have the same problem with the default value of crowdsecAppsecBodyLimit, 10MB?
You've set it to 5000000000 (approx. 4.7GB).
Just for this I've setup a resource to my immich at home with the default setting of crowdsecAppsecBodyLimit 10485760.
I could upload a 2GB test video without any issues.
At first I impulsively tried to set the crowdsecAppsecBodyLimit value to the "highest" file size I would upload; that failed as well.
So try the default setting (10MB); it works.
@kylepyke commented on GitHub (Apr 15, 2025):
Oh wow, this worked! Thanks! @TuncTaylan @oschwartz10612 @miloschwartz
@muffn commented on GitHub (Oct 21, 2025):
Is this issue truly resolved? I think I have the same problem: as soon as I upload a large file (no matter the service; OpenCloud or Immich break my VPS), the logs are flooded and my VPS becomes unresponsive until I restart it.
It is definitely related to CrowdSec, as commenting out the CrowdSec middleware "gets rid of the issue", but that isn't really a solution. It also starts right at the 10_000 bytes mark, just as referenced in this issue.
docker-compose.yml
dynamic_config.yml
traefik_config.yml
@kylepyke commented on GitHub (Jan 14, 2026):
Hmmm... I'm also still experiencing this issue with OpenCloud now. Seems my system bloats until it crashes.
@muffn commented on GitHub (Jan 14, 2026):
Probably this issue: https://github.com/fosrl/pangolin/issues/2120
I resolved the instant crash by setting the AppSec body limit.
But CrowdSec is not usable on a 1GB VPS right now due to the memory leak. I upgraded to 4GB of RAM and all issues went away.
@kylepyke commented on GitHub (Jan 14, 2026):
I've set the AppSec body limit, and have 4GB on my VPS. Still have that issue.
@github-actions[bot] commented on GitHub (Jan 29, 2026):
This issue has been automatically marked as stale due to 14 days of inactivity. It will be closed in 14 days if no further activity occurs.
@github-actions[bot] commented on GitHub (Feb 12, 2026):
This issue has been automatically closed due to inactivity. If you believe this is still relevant, please open a new issue with up-to-date information.