serv process are not killed (SSHd) #3001

Closed
opened 2025-11-02 04:56:48 -06:00 by GiteaMirror · 9 comments

Originally created by @bruestel on GitHub (Mar 4, 2019).

Description

Our CI server checks the git repositories every minute. After a few minutes (well under an hour), all the memory is used and the CPU load is very high.
It seems that each SSH connection from our CI creates a new process (`/usr/bin/gitea serv key-9 --config=/etc/gitea/app.ini`) which is never disposed of.

I've cleaned my logs (#4450), but it didn't help. I think this is a different problem.

Thanks for your help.

- Gitea version (or commit ref): 1.7.3 built with go1.11.5 : bindata, sqlite, pam
- Git version: 2.21.0
- Operating system: Arch Linux
- Database (use `[x]`):
  - [ ] PostgreSQL
  - [ ] MySQL
  - [ ] MSSQL
  - [x] SQLite
- Can you reproduce the bug at https://try.gitea.io:
  - [ ] Yes (provide example URL)
  - [ ] No
  - [x] Not relevant
- Log gist: https://gist.github.com/bruestel/6e94ac1229e7860230fc481f3e8f384c
GiteaMirror added the type/question and issue/stale labels 2025-11-02 04:56:48 -06:00

@lunny commented on GitHub (Mar 17, 2019):

@bruestel yes, every SSH request creates a new process, but it should end a few seconds after it finishes its work. So maybe you can find some logs to track why it did not end.


@stale[bot] commented on GitHub (May 16, 2019):

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs during the next 2 weeks. Thank you for your contributions.


@zeripath commented on GitHub (May 16, 2019):

Hmm... I suspect that these are due to deadlocked processes.

I think it's time to review serv.go again to check if there are deadlocks.


@lunny commented on GitHub (May 16, 2019):

`gitea serv` invokes a `git` command and blocks until that command returns. If the command is blocked, it will wait forever.


@zeripath commented on GitHub (May 16, 2019):

OK, so both serv and hook appear to use only private URLs, which is good, but each action they perform happens in a new session. We may be better off having serv and hook do all of their logic in a single private URL call and then return the results to serv/hook in a simplified fashion.


@lunny commented on GitHub (May 17, 2019):

@zeripath I like your idea, but that may not be the cause of this issue.


@zeripath commented on GitHub (May 17, 2019):

It depends on the cause, I guess, and although moving the logic won't immediately fix what I suspect is the problem, it will help formulate the complete solution. (It should also make both serv and hook faster, as they won't need to make multiple internal API requests, and we might be able to drop the gitlogger.)

I suspect that there's a deadlock in hook caused by the calls to send the webhooks.


@stale[bot] commented on GitHub (Jul 16, 2019):

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs during the next 2 weeks. Thank you for your contributions.


@stale[bot] commented on GitHub (Jul 30, 2019):

This issue has been automatically closed because of inactivity. You can re-open it if needed.

Reference: github-starred/gitea#3001