Endless query loop causing unusual CPU usage #6706

Closed
opened 2025-11-02 07:04:20 -06:00 by GiteaMirror · 5 comments

Originally created by @seadra on GitHub (Jan 17, 2021).

  • Gitea version (or commit ref): 1.13.1 (self compiled with sqlite3, using go 1.15.5)
  • Git version: 2.20.1
  • Operating system: Debian (armv7l)
  • Database (use [x]):
    • [ ] PostgreSQL
    • [ ] MySQL
    • [ ] MSSQL
    • [x] SQLite
  • Can you reproduce the bug at https://try.gitea.io:
    • [ ] Yes (provide example URL)
    • [ ] No
  • Log gist:

Description

gitea shows unusually high CPU usage. From the logs, it seems gitea is endlessly running this query in the background:

```
..m.io/xorm/core/db.go:286:afterProcess() [I] [SQL] SELECT user_id, count(*) AS count FROM notification WHERE user_id IN (SELECT user_id FROM notification WHERE updated_unix >= ? AND updated_unix < ?) AND status = ? GROUP BY user_id [1610911440 1610911450 1] - 1.048278ms
```

The two bound values (1610911440 and 1610911450) are incremented by 10 in each iteration. gitea has been running for 2 days, and the log file is continuously filling with this query.
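The pattern in those bound values can be sketched as a sliding time window that advances by 10 seconds per poll. This is a minimal illustration with a hypothetical helper, not Gitea's actual code:

```go
package main

import "fmt"

// nextWindow returns the [lo, hi) updated_unix bounds for one poll
// iteration; the caller then advances since to hi for the next poll.
// (Hypothetical helper to illustrate the log pattern above.)
func nextWindow(since, step int64) (lo, hi int64) {
	return since, since + step
}

func main() {
	since := int64(1610911440) // starting bound seen in the log
	const step = 10            // window advances by 10 seconds per iteration
	for i := 0; i < 3; i++ {
		lo, hi := nextWindow(since, step)
		fmt.Printf("updated_unix >= %d AND updated_unix < %d\n", lo, hi)
		since = hi
	}
}
```

Run once per tick, this produces exactly the incrementing bound pairs seen in the log (1610911440/1610911450, then 1610911450/1610911460, and so on).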

Screenshots

GiteaMirror added the performance/speed label 2025-11-02 07:04:20 -06:00

@noerw commented on GitHub (Jan 17, 2021):

Increased CPU usage is a known issue, see #7910.
This query is related to checking notifications for a signed-in user every 10 seconds, and is most likely not responsible for the base load.
Closing this as a duplicate, unless you can provide more context.


@zeripath commented on GitHub (Jan 17, 2021):

https://docs.gitea.io/en-us/config-cheat-sheet/#ui---notification-uinotification

You can also change the frequency, or turn it off entirely, by changing the EVENT_SOURCE_UPDATE_TIME value.
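The setting lives in app.ini; a hedged sketch of the relevant section (the value here is an example, check the linked cheat sheet for the options in your version):

```ini
[ui.notification]
; How often the unread-notification count is polled; the default is 10s,
; which matches the 10-second query interval seen in the logs above.
; Raising it reduces the query frequency, e.g.:
EVENT_SOURCE_UPDATE_TIME = 1m
```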


@seadra commented on GitHub (Jan 17, 2021):

How can I get more context? Is there an easy way to profile gitea to see what is causing the CPU load?

I don't think my issue is a duplicate of #7910 because it didn't happen in versions <=1.12.
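For reference, the pprof endpoint used in the next comment comes from Gitea's built-in profiler, which is off by default; a hedged app.ini sketch (verify the key against the config cheat sheet for your version):

```ini
[server]
; Expose Go's net/http/pprof handlers on 127.0.0.1:6060
ENABLE_PPROF = true
```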


@seadra commented on GitHub (Jan 17, 2021):

I tried running `go tool pprof http://localhost:6060/debug/pprof/profile?seconds=30` and `top10` shows the following:

```
Duration: 30s, Total samples = 1.72s ( 5.73%)
Entering interactive mode (type "help" for commands, "o" for options)
(pprof) top10
Showing nodes accounting for 1420ms, 82.56% of 1720ms total
Showing top 10 nodes out of 91
      flat  flat%   sum%        cum   cum%
     400ms 23.26% 23.26%      400ms 23.26%  runtime.futex
     320ms 18.60% 41.86%      320ms 18.60%  runtime.epollwait
     190ms 11.05% 52.91%      190ms 11.05%  runtime._LostSIGPROFDuringAtomic64
     150ms  8.72% 61.63%      150ms  8.72%  runtime._ExternalCode
     140ms  8.14% 69.77%      140ms  8.14%  runtime.usleep
     120ms  6.98% 76.74%      120ms  6.98%  kernelcas
      40ms  2.33% 79.07%      740ms 43.02%  runtime.findrunnable
      20ms  1.16% 80.23%      100ms  5.81%  code.gitea.io/gitea/modules/queue.(*WorkerPool).doWork
      20ms  1.16% 81.40%       20ms  1.16%  runtime.heapBitsSetType
      20ms  1.16% 82.56%       40ms  2.33%  runtime.mallocgc
(pprof)
```

During the 30 seconds, no requests were made to gitea (it was supposed to be completely idle), but it kept consuming 4-10% CPU.

Any ideas?


@zeripath commented on GitHub (Feb 7, 2021):

The doWork loop here implies that there is some background work going on.

To understand the runtime.futex load you'd need to find out what is calling that - it's too low level to talk about. It's probably just things waiting for work - maybe even sqlite.

Now there is a polling loop:
https://github.com/go-gitea/gitea/blob/c11db35aec40e8e47ee8c678e508a7cdd06a2891/modules/queue/queue_bytefifo.go#L115
Ideally this would be changed to something that blocks rather than running on a timer, but unfortunately I've not found a way to do that.

There is always some background polling going on, and 4% is hardly a lot of work for a Raspberry Pi.
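The timer-loop vs blocked-wait distinction above can be sketched as follows: a ticker-driven loop wakes its goroutine on every tick even when there is no work (which surfaces in profiles as scheduler noise like runtime.futex/epollwait), while a channel receive sleeps for free until an item actually arrives. This is a hypothetical illustration, not Gitea's actual queue code:

```go
package main

import (
	"fmt"
	"time"
)

// pollLoop wakes every interval whether or not work exists — the pattern
// that shows up as periodic idle CPU.
func pollLoop(fetch func() (string, bool), interval time.Duration, stop <-chan struct{}, out chan<- string) {
	t := time.NewTicker(interval)
	defer t.Stop()
	for {
		select {
		case <-stop:
			return
		case <-t.C:
			if item, ok := fetch(); ok {
				out <- item
			}
		}
	}
}

// blockedLoop only wakes when an item arrives on the channel; an idle
// queue costs nothing.
func blockedLoop(in <-chan string, out chan<- string) {
	for item := range in {
		out <- item
	}
}

func main() {
	in := make(chan string, 1)
	out := make(chan string, 1)
	go blockedLoop(in, out)
	in <- "job"
	fmt.Println(<-out) // prints "job"
}
```

The catch, as noted above, is that a blocking design needs the underlying FIFO to support a blocking pop, which a plain byte-buffer queue does not.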

Reference: github-starred/gitea#6706