Gitea keeps taking 100% CPU usage and runs slowly on git push #5023
Closed · 53 comments
Originally created by @duchenpaul on GitHub (Mar 7, 2020).
Gitea version (or commit ref): 1.12.0+dev-440-gf5a20250a
Git version: git version 2.20.1
Operating system: Raspbian GNU/Linux 10 (buster) on Raspberry Pi zero w
Database (use `[x]`):
Can you reproduce the bug at https://try.gitea.io:
Log gist:
Architecture: linux-arm-6
Description
After upgrading Gitea to this version, I found that it runs very slowly.
Checking the CPU usage, it stays at 100%.
I also checked `ps` and found that the `gitea hook` processes (2395, 964) seem to be taking up a lot of resources. I would like to know what they are, how I can track them, and how to stop them. As far as I know, no git hook is configured in any repo on this server, so I think these could be Gitea's system hooks.
I also see PID 2366 saying `duchenpaul/homebridge-docker.git` is executing a hook, but there is no hook script set in that repo. As for PID 961, I don't know what is going on there.
I checked the log; it contains only the SQL queries, no errors.
[Update] Summary
- `gitea serv key-3 --config=/home/git/gitea/custom/conf/app.ini` processes are spawned by git clients' requests. For some reason, some git requests do not finish properly and instead take up all the CPU for hours, slowing the system down.
- Running v1.11.1 (ff24f81) is fine, but as soon as I switch to v1.11.2 (0768823) from the releases, the above starts happening within a short time.
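To track them down, something along these lines works (a sketch; the process names are taken from the listings above):

```bash
# Show accumulated CPU time (TIME) for any lingering `gitea serv` or
# `gitea hook` processes; hours of CPU time on one of these is abnormal.
ps -eo pid,etime,time,args | grep -E '[g]itea (serv|hook)'

# A stuck one can then be killed; the originating git client is long gone.
# kill <PID>
```

Screenshots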
@duchenpaul commented on GitHub (Mar 8, 2020):
Also, I observed that there are 3 processes called `gitea serv key-3 --config=/home/git/gitea/custom/conf/app.ini` which take up most of the CPU. What are they, and how can they be stopped?
@guillep2k commented on GitHub (Mar 8, 2020):
Those three processes (939, 998 and 1321) sound like the server side of git clients. The fact that all of them have `key-3` means that they all used the same credentials (public key). It's probably you trying to push and then cancelling, but for some reason the server side hasn't realized that yet. But the culprit seems to be Gitea's update hook on the repository: the `14:45` value is the accumulated CPU time used, which is not normal unless you're pushing the Linux kernel source code on an old Pentium computer. And yes, those hooks are Gitea's git plumbing that makes things work behind the scenes.
Some ideas:
- Run a `git fsck` and a `git gc` on the repo via the admin menu. It's possible that there's something wrong with it.
- A Raspberry Pi Zero W doesn't have much RAM in it. Try checking whether your system is paging too much: use the `top` program and hit the `m` key until the memory graph shows up, or maybe `vmstat 3 999` (you cancel that with `CTRL+C`). The `si` and `so` columns should be 0 most of the time; paging in and out is something that really slows down the system. Both checks are sketched below.

(Note: don't take the values in my pictures as expected for your system: mine is a CentOS 7 VM with 12GB of RAM assigned to it.)
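For reference (assuming the standard procps tools on Raspbian):

```bash
# Interactive view; press 'm' to cycle the memory/swap summary display.
top

# Print memory/swap activity every 3 seconds, 999 times (CTRL+C to stop).
# The si (swap-in) and so (swap-out) columns should stay at 0 almost always.
vmstat 3 999
```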
@duchenpaul commented on GitHub (Mar 8, 2020):
Thank you for your reply.
I did a reboot again; it looks OK for now.
I didn't cancel any transactions in flight; from what I can see on the client side, all transactions finished normally.
I believe the PID 964 I was referring to is blocked due to the high CPU usage of the SSH transactions.
My questions are:
Anyway, I will keep monitoring for 3 days; if it is OK, I will close this issue.
This Gitea server is only serving myself, so I think it is still able to handle the job well. 😄

Output for `vmstat 3 999`:
@duchenpaul commented on GitHub (Mar 8, 2020):
[Update] I found these 2 `gitea serv` processes popping up again, even though I didn't do anything.
Can anyone help me find out what caused this?
@duchenpaul commented on GitHub (Mar 8, 2020):
Closing for now: I found it could be my GitHub Desktop causing this. I changed to GitExtensions and it never hangs.
@lunny commented on GitHub (Mar 8, 2020):
Some git tools may check the repo status in the background.
@duchenpaul commented on GitHub (Mar 8, 2020):
Yes, you can tell from the GitHub Desktop log. But how come the process stalls on the server side and takes up all the CPU resources, even after the client has quit and the client PC is powered off?
@zeripath commented on GitHub (Mar 8, 2020):
Not sure - we would need to see some logs from gitea as to what it's doing.
For example, the `update` hook should be basically empty; I've suggested its removal entirely, as it can only slow things down.
@duchenpaul commented on GitHub (Mar 9, 2020):
Yeah, I will reopen this issue and set my Gitea log level to debug to collect more info.
I didn't use the `update` hook in any repo, only the `post-receive` one in a few repos, just to deploy the code on a remote server. Now I've found that even `gitea dump` can stall sometimes, just hanging there and putting the CPU at full throttle.
@duchenpaul commented on GitHub (Mar 9, 2020):
This is my log conf for debugging:
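(Roughly along these lines; a sketch with assumed paths, since the pasted block itself was not preserved:)

```ini
; debug logging section in custom/conf/app.ini
[log]
MODE             = console, file
LEVEL            = Debug
STACKTRACE_LEVEL = Debug
ROOT_PATH        = /home/git/gitea/log
```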
Is there anything to add to help locate the problem?
@duchenpaul commented on GitHub (Mar 9, 2020):
It stalled again.
PID 8480 takes 100% CPU, even after I stopped the Gitea service.
This process started at 09:46:14. I checked each log around that time, and all I found was `Started Session c187 of user git.`, which is my git client trying to update the repo in the background. The logs I could find are attached below.
syslog.log
access.log
auth.log
daemon.log
gitea.log
@guillep2k commented on GitHub (Mar 9, 2020):
Have you tried running the fsck and gc? Those are important, given that you're using an SD card to store the repos. If you are unable to do it via Gitea (e.g. the system locks up before you get the chance), you can shut down Gitea and run the commands yourself, along these lines:
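(A sketch reconstructing the idea; run as the `git` user, with Gitea stopped, and note that `/home/git/gitea-repositories` is only the default location:)

```bash
# Run git's integrity check and garbage collection on every bare repo.
for repo in /home/git/gitea-repositories/*/*.git; do
    echo "== $repo"
    git -C "$repo" fsck
    git -C "$repo" gc
done
```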
(Change the initial path accordingly if required)
@guillep2k commented on GitHub (Mar 9, 2020):
Note: `STACKTRACE_LEVEL = Debug` is excessive. You should change it to `Error` so important messages aren't buried too deep in the logs.
@duchenpaul commented on GitHub (Mar 9, 2020):
Yes, I did try fsck and gc. Thank you for your bash snippet, it works like a charm and finishes cleanly.
I guess some bug was introduced between 1.11.0+dev-481-g1df701fd1 (good) and 1.10.5+18-gd6657644a (bad) on arch linux-arm-6.
I just formatted the SD card to rule out a file system issue, and restored the image I backed up days ago, with Gitea version 1.11.0+dev-481-g1df701fd1. It has kept running for hours, with some push/pull actions performed. It is OK.
Then I upgraded Gitea to the master version, which is 1.10.5+18-gd6657644a, and the problem reproduces. Now I've rolled back to the previous version, to see if my assumption is correct.
@guillep2k commented on GitHub (Mar 9, 2020):
Update
It seems that we had some issues when tagging our latest `master` image; it's `1.12.0+dev` but the version string says `1.10.5`. 😳
Original message (just in case it's useful):
But... those are not actual releases, and you are in fact downgrading. 🤔 Releases have only three digits in their version number, with nothing else after that.
Our latest release is `1.11.2`. If you want the cutting edge (i.e. the version we're currently working on, which is not yet polished), you can go for `master`. If you want to build from sources, you need to check out exactly `v1.11.2` (or `master`).
The `master` version is `1.12.0+dev-something`, e.g. `1.12.0+dev-435-g6a57364dc`, which means "435 commits after the `1.12.0+dev` tag was established", i.e. 435 commits after we started working on 1.12. `1.10.5` is our latest patch release for the 1.10 branch, which we're still maintaining for severe bugs.
@duchenpaul commented on GitHub (Mar 9, 2020):
Yes, I noticed there is an issue with the version number. But I am sure there is an issue in the new version.
I downloaded Gitea from the master folder here: https://dl.gitea.io/gitea/master/gitea-master-linux-arm-6; this is the newest version, and it is where this thing occurs.
@duchenpaul commented on GitHub (Mar 9, 2020):
I upgraded to Gitea version 1.11.2 (0768823) from the GitHub releases; the problem persists.
I will downgrade version by version to narrow down the problem, checking the running build each time as sketched below.
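(Assuming the binary lives at the path seen in the process listings:)

```bash
# Confirm exactly which build is running before and after each swap.
/home/git/gitea/gitea --version
# e.g. "Gitea version 1.11.2 built with go1.13.x ..."
```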
@duchenpaul commented on GitHub (Mar 9, 2020):
Looks like the bug is between these 2 commits: ff24f81..0768823.
I am running v1.11.1 (ff24f81); so far so good.
@zeripath commented on GitHub (Mar 9, 2020):
Are you definitely running MySQL and not Sqlite?
@duchenpaul commented on GitHub (Mar 9, 2020):
MySQL: version: mysql Ver 15.1 Distrib 10.3.22-MariaDB, for debian-linux-gnueabihf (armv8l) using readline 5.2
I don't think this is related to the database.
The Gitea server cannot handle some requests from the client side; processes like the ones below just hang there for hours, taking up all the CPU.
I wonder if there is a log to track this behavior? It's interesting that this only happens on v1.11.2 among the GitHub v1.11.x releases. There must be an issue somewhere.
@zeripath commented on GitHub (Mar 9, 2020):
The stacktraces are not helpful - they'd be more useful for figuring out why something is causing trouble once we have identified an issue. That being said they're not too difficult to remove from the logs for quick review.
I've looked at your daemon.log and I can't see any request to Gitea that is opened without being closed, and these all take at most around a second.
I'm looking at the others right now.
@zeripath commented on GitHub (Mar 9, 2020):
Do you have any logs from when there is a hang?
@duchenpaul commented on GitHub (Mar 9, 2020):
These are the logs I caught when there was a hang.
@duchenpaul commented on GitHub (Mar 9, 2020):
I don't know if this is appropriate, but if you like, I could give you SSH access to my server, just to help investigate the problem.
my email: duchenpaul@gmail.com
@zeripath commented on GitHub (Mar 9, 2020):
Running `grep git auth.log | grep sshd:session` reveals only a few sessions longer than 2 minutes. sshd[5028] is:
@zeripath commented on GitHub (Mar 9, 2020):
But I can't see what was happening at that point because there are no logs.
@duchenpaul commented on GitHub (Mar 9, 2020):
To approach it another way: I pasted the hanging process details above. This process started at Mar 9 09:46:14, and below is what I extracted from the auth log. I believe this is the hanging session; it doesn't have a stop entry because I killed it manually or just rebooted the server.
@duchenpaul commented on GitHub (Mar 9, 2020):
If you want to check the log, look at the timestamp "Mar 9 09:46:14". This is China time (UTC+8), and the Gitea log is in UTC, so you need to convert: Mar 9 09:46:14 local corresponds to Mar 9 01:46:14 UTC.
@zeripath commented on GitHub (Mar 9, 2020):
But I can't find any POST or GET to Gitea in daemon.log that did not complete...
@duchenpaul commented on GitHub (Mar 10, 2020):
Of course you can't find it. Let me summarize the problem:
The hanging processes are introduced by git requests from the client side.
My git client is GitHub Desktop; it does a git fetch regularly, and some of its requests just hang there.
Maybe I should post the GitHub Desktop log.
@duchenpaul commented on GitHub (Mar 10, 2020):
I've added some info collected over the past few days to the issue report (the first comment).
@jedy commented on GitHub (Mar 10, 2020):
I have the same problem after updating Gitea to 1.11.2. My system is Linux amd64, and I use SQLite.
@lunny commented on GitHub (Mar 10, 2020):
One of the processes in the issue information calls `/home/git/gitea/gitea hook --config=/home/git/gitea/custom/conf/app.ini post-receive`.
@jedy If you exit all git clients and then upgrade to v1.11.2, does it still occur?
@guillep2k commented on GitHub (Mar 10, 2020):
I can only suggest creating a core dump from one of those processes. We may at least be able to see what it's doing. The problem is that there's no way of sanitizing a core dump, so you should not share its raw contents.
If you've got delve or gdb installed, maybe you could open those cores and tell us what they are doing.
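For instance (a sketch; the binary path and the PID are taken from the posts above, and `gcore` ships with gdb):

```bash
# Take a core of the hung process without killing it.
sudo gcore -o gitea-hang 8480        # 8480 = the stuck PID

# Open it with gdb and dump every thread's stack, not just the first one:
gdb /home/git/gitea/gitea gitea-hang.8480
# (gdb) thread apply all bt

# Or with delve, which understands goroutines:
# dlv core /home/git/gitea/gitea gitea-hang.8480
# (dlv) goroutines
```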
@duchenpaul commented on GitHub (Mar 10, 2020):
OK, I will get it back to you.
@jedy commented on GitHub (Mar 10, 2020):
I compiled 1.11.2 with go 1.13.8. It seems OK now.
@duchenpaul commented on GitHub (Mar 10, 2020):
@jedy Could you compile a linux-arm-6 version for me if possible?
@jedy commented on GitHub (Mar 10, 2020):
@duchenpaul Sorry, I don't have an ARM environment, and I got errors trying to cross-compile on Linux.
@jedy commented on GitHub (Mar 10, 2020):
gdb backtrace of the hanging process:
@jedy commented on GitHub (Mar 10, 2020):
I ran strace on the hanging process, and lots of SIGURG signals popped up. I thought it might be related to the new preemptible Go runtime, which uses SIGURG, so I tried Go 1.13.8. After an hour of it running well, I think that's the problem.
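For anyone who wants to reproduce the check without rebuilding (a sketch; the PID and paths are from the posts above):

```bash
# Watch which signals the stuck process receives; a flood of SIGURG
# points at Go 1.14's asynchronous goroutine preemption.
sudo strace -f -e trace=signal -p 8480

# Go 1.14 allows turning async preemption off at runtime, which should
# make a 1.14-built binary behave like one built with Go 1.13:
GODEBUG=asyncpreemptoff=1 /home/git/gitea/gitea web --config /home/git/gitea/custom/conf/app.ini
```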
@guillep2k commented on GitHub (Mar 10, 2020):
Nice catch! We need to look closely into this.
@guillep2k commented on GitHub (Mar 10, 2020):
The backtrace shows only the first thread, which is probably not the most useful. 🙁
@lunny commented on GitHub (Mar 10, 2020):
#10684 may fix this; it will be released in v1.11.3.
@duchenpaul commented on GitHub (Mar 10, 2020):
My Go version is `go1.13.8`; is that right?
@duchenpaul commented on GitHub (Mar 10, 2020):
Here is my gdb dump; any thoughts on this?
@joaodforce commented on GitHub (Mar 10, 2020):
I'm having this same issue after updating to 1.11.2.
I'm using SQLite and running under Fedora Server 30 x64.
If someone needs more info regarding this, I'm available.
@guillep2k commented on GitHub (Mar 10, 2020):
`1.11.3` is out in a few minutes. Please try with that and let us know your results. (Make sure no Gitea process is still active before upgrading.)
@duchenpaul commented on GitHub (Mar 11, 2020):
Looking good so far, but did we lose the code language bar?
There used to be an indicator showing the percentage of each language in this project.
@jolheiser commented on GitHub (Mar 11, 2020):
Language statistics are only in versions 1.12+ (1.12 is the currently unreleased master branch)
@duchenpaul commented on GitHub (Mar 11, 2020):
OK; the problem found in 1.11.2 looks fixed now! Kudos!!
@mrsdizzie commented on GitHub (Mar 11, 2020):
I'm not sure this is actually fixed -- if compiling with Go 1.13 fixes it, then master and new versions should still have the same problem with Go 1.14 going forward.
@zeripath commented on GitHub (Mar 11, 2020):
Yes @mrsdizzie, we're gonna need to figure out what is causing this. The Go release notes are not very clear as to how this is supposed to be solved.
@duchenpaul commented on GitHub (Mar 13, 2020):
@zeripath @mrsdizzie let me know if you need any assistance from me.