Gitea startup writes to disk at 150-200mb/s nonstop. #9606

Closed
opened 2025-11-02 08:44:18 -06:00 by GiteaMirror · 6 comments
Owner

Originally created by @Qubitium on GitHub (Sep 23, 2022).

Description

A Gitea instance with multiple repos produces a massive amount of disk writes during restart, 150-200 MB/s nonstop for several hours.

Nonstop logs when LEVEL = trace:

        <nil>
        3 100644 go/src/github.com/json-iterator/go/output_tests/slice/ptr_map/int32_string/types.go 00e90409eb199fbc004e6d1f0eddd3156fd89950
2022/09/23 03:11:00 ...rvices/pull/patch.go:127:attemptMerge() [T] [632d23bd-56] Attempt to merge:
        1 100644 go/src/github.com/json-iterator/go/output_tests/slice/ptr_int32/json_test.go 3c35ffe5f176d233700d5bbd0346520b9466e683
        <nil>
        3 100644 go/src/github.com/json-iterator/go/output_tests/slice/ptr_int32/json_test.go 3c35ffe5f176d233700d5bbd0346520b9466e683
2022/09/23 03:11:00 ...dules/git/command.go:153:Run() [D] [632d23bd-56] /raid0/gitea/data/tmp/local-repo/pull.git607819695: /usr/bin/git -c protocol.version=2 -c uploadpack.allowfilter=true -c uploadpack.allowAnySHA1InWant=true -c credential.helper= -c filter.lfs.required= -c filter.lfs.smudge= -c filter.lfs.clean= update-index --remove -z --index-info
2022/09/23 03:11:00 ...ll/patch_unmerged.go:163:unmergedFiles() [T] [632d23bd-56] Got line: 3 100644 go/src/github.com/json-iterator/go/output_tests/slice/ptr_int32/types.go 522a5912d68b5d26d8aab0ba1429723e081316da Current State:
        1 100644 go/src/github.com/json-iterator/go/output_tests/slice/ptr_int32/types.go 522a5912d68b5d26d8aab0ba1429723e081316da
        <nil>
        <nil>
2022/09/23 03:11:00 ...ll/patch_unmerged.go:163:unmergedFiles() [T] [632d23bd-56] Got line: 1 100644 go/src/github.com/json-iterator/go/output_tests/slice/ptr_map/int32_string/json_test.go 3c35ffe5f176d233700d5bbd0346520b9466e683 Current State:
        1 100644 go/src/github.com/json-iterator/go/output_tests/slice/ptr_int32/types.go 522a5912d68b5d26d8aab0ba1429723e081316da
        <nil>
        3 100644 go/src/github.com/json-iterator/go/output_tests/slice/ptr_int32/types.go 522a5912d68b5d26d8aab0ba1429723e081316da
2022/09/23 03:11:00 ...dules/git/command.go:153:Run() [D] [632d23bd-74] /raid0/gitea/data/tmp/local-repo/pull.git1658027271: /usr/bin/git -c protocol.version=2 -c uploadpack.allowfilter=true -c uploadpack.allowAnySHA1InWant=true -c credential.helper= -c filter.lfs.required= -c filter.lfs.smudge= -c filter.lfs.clean= update-index --add --replace --cacheinfo 100644 f1005b898a5f4568501fba8d998b207436c7cb8c app/src/main/res/values/colors.xml

Gitea Version

1.17.2

Can you reproduce the bug on the Gitea demo site?

No

Log Gist

No response

Screenshots

No response

Git Version

2.36.1.507.g7c58a9bb42

Operating System

Debian 11

How are you running Gitea?


Database

MySQL
GiteaMirror added the issue/needs-feedback label 2025-11-02 08:44:18 -06:00
Author
Owner

@wxiaoguang commented on GitHub (Sep 23, 2022):

I guess the problem is caused by:

  1. There is a PR merge check task in the queue
  2. Something wrong happens during the merge check
  3. The check task is re-queued then run again

But these ten lines of logs are insufficient to get more clues.
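The re-queue loop described in the steps above can be sketched in Go. This is purely illustrative and assumes a simple in-memory queue; the names (`runChecks`, `task`, `maxRetries`) are hypothetical and do not correspond to Gitea's actual queue code. It shows how a check task that always fails would run forever unless re-queues are capped:

```go
package main

import "fmt"

// runChecks simulates a merge-check queue in which every check fails
// (step 2 above) and is re-queued (step 3). Without the retry cap the
// loop would never terminate, and each run would re-read and rewrite
// the repository on disk. Returns the number of check runs performed.
func runChecks(maxRetries int) int {
	type task struct{ prID, retries int }
	queue := []task{{prID: 42}} // one hypothetical PR awaiting its merge check
	runs := 0
	for len(queue) > 0 {
		t := queue[0]
		queue = queue[1:]
		runs++
		// Simulate the check failing every time; re-queue only while
		// under the cap, instead of unconditionally as in the bug.
		if t.retries+1 < maxRetries {
			queue = append(queue, task{t.prID, t.retries + 1})
		}
	}
	return runs
}

func main() {
	fmt.Println(runChecks(3)) // prints 3: the cap bounds the loop
}
```

With an unconditional re-queue (no cap), the same loop would spin indefinitely, which matches the hours of sustained disk writes reported here.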

Author
Owner

@wxiaoguang commented on GitHub (Oct 7, 2022):

Inactive for 14 days, so I will close this issue. Feel free to re-open if you have more logs and clues.

Author
Owner

@zeripath commented on GitHub (Oct 8, 2022):

The provided log snippet does not actually show anything incorrect happening at all. All it shows is that a PR is being checked. All PRs are checked on startup.

That there are a lot of logs created when running as Trace should not be surprising.

Please run with debug, give us longer logs and show us where the error is.

Author
Owner

@Qubitium commented on GitHub (Oct 10, 2022):

@wxiaoguang @zeripath It was indeed caused by a PR merge check that went into a loop. We removed the branch in question to "fix" this issue.

Is there protection code in place to prevent such PR tasks from going astray and looping?

Author
Owner

@wxiaoguang commented on GitHub (Oct 10, 2022):

I think it's feasible to add some protection code to avoid looping checks.

It would be better to resolve the underlying problem if we knew why the PR gets stuck in looping checks.

I haven't read the related code yet; maybe zeripath could suggest something?

Author
Owner

@zeripath commented on GitHub (Oct 10, 2022):

As I said above, the provided log snippet is inadequate to properly understand what is going on.

We need more logs.


Reference: github-starred/gitea#9606