Container registry fails with duplicate key (HTTP 500) #9840

Closed
opened 2025-11-02 08:51:00 -06:00 by GiteaMirror · 13 comments

Originally created by @rmannibucau on GitHub (Nov 15, 2022).

Description

Regularly on container updates (~10 apps per build) we get HTTP 500 errors because a duplicate key is found:

```
2022/11/15 17:02:45 ...ntainer/container.go:84:apiError() [E] [6373c635-17] pq: duplicate key value violates unique constraint "UQE_package_version_s"
2022/11/15 17:02:45 [6373c635-17] router: completed PUT /v2/org/app/blobs/uploads/12345?digest=sha256:12345 for 10.244.0.1:4937, 500 Internal Server Error in 19.4ms @ packages/api.go:292(packages.ContainerRoutes.func2.2)
```

The reason seems to be that the container id is generated using util.Crypto methods and the entropy in the Gitea pod is not high enough to be really random.
It would be neat to use a more advanced algorithm mixing randomness, a timestamp and the thread id to ensure uniqueness, at least in a single-instance deployment.
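The mixing scheme suggested here could look roughly like the Go sketch below. `newUploadID` is a hypothetical name for illustration, not Gitea's actual function: it prefixes random bytes with a nanosecond timestamp so that even a weak random source cannot repeat an id within one process.

```go
package main

import (
	"crypto/rand"
	"encoding/hex"
	"fmt"
	"time"
)

// newUploadID is a hypothetical generator along the lines suggested
// above: it mixes a nanosecond timestamp with 16 random bytes so that
// two IDs generated in the same process can never be identical.
func newUploadID() string {
	buf := make([]byte, 16)
	// crypto/rand reads from the OS CSPRNG; the timestamp prefix
	// hedges against any doubt about its quality.
	if _, err := rand.Read(buf); err != nil {
		panic(err)
	}
	return fmt.Sprintf("%x-%s", time.Now().UnixNano(), hex.EncodeToString(buf))
}

func main() {
	fmt.Println(newUploadID())
}
```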

Gitea Version

1.17

Can you reproduce the bug on the Gitea demo site?

No

Log Gist

No response

Screenshots

No response

Git Version

No response

Operating System

No response

How are you running Gitea?

Kubernetes

Database

PostgreSQL

GiteaMirror added the topic/packages and type/bug labels 2025-11-02 08:51:00 -06:00

@KN4CK3R commented on GitHub (Nov 16, 2022):

Should be a duplicate of #19586

The problem should be gone if you limit uploads with [`max-concurrent-uploads`](https://docs.docker.com/engine/reference/commandline/dockerd/).


@rmannibucau commented on GitHub (Nov 16, 2022):

@KN4CK3R it may be the same ticket - not sure from reading it, but it looks close - but please note that `max-concurrent-uploads` is not always an option (concretely, in my case I don't have Docker at all, and there is *no* concurrency on the platform when it fails). I'd also like to emphasize that this is a bug in the Gitea id generation which looks quite easy to fix, so I think it is worth fixing the code where possible instead of relying on workarounds outside of Gitea, no?


@KN4CK3R commented on GitHub (Nov 16, 2022):

It has nothing to do with the upload id.


@rmannibucau commented on GitHub (Nov 16, 2022):

@KN4CK3R how can you get the same id twice when a single build pushes images sequentially? The only cause I found was that the entropy on the machine was way too low (almost 0), so the crypto generator was not that random. If there is another cause, the error should probably point to something more relevant, since nothing is logged except that the id is already taken.


@KN4CK3R commented on GitHub (Nov 16, 2022):

`pq: duplicate key value violates unique constraint "UQE_package_version_s"` tells us the same package version gets inserted twice. That's what the other issue is about.


@rmannibucau commented on GitHub (Nov 16, 2022):

@KN4CK3R ok, but a container can be pushed N (>= 1) times with the same tag - the Gitea code supports that, and it matches my tests, where we can *often* do it smoothly without any errors. So the error shouldn't occur, right? The only option I saw from a code standpoint - for this issue, maybe not the other one - is that the id generation is not random enough. Is it that hard to solve? It is really bothering to have to run the same CI job ~10 times and get 9 failures because of it. I installed an entropy generator on the machine yesterday, but I'm not sure it will be sufficient.


@KN4CK3R commented on GitHub (Nov 16, 2022):

I know you don't want to hear it, but it has nothing to do with the upload id. `crypto/rand` blocks until there is enough entropy. It will not generate duplicates.
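This claim is easy to check empirically. The helper below (`drawDistinct` is an illustrative name) draws 100,000 16-byte values from `crypto/rand` and looks for a repeat; with 128 bits of randomness, collision odds at that scale are on the order of 10^-29.

```go
package main

import (
	"crypto/rand"
	"fmt"
)

// drawDistinct draws n 16-byte values from crypto/rand and reports
// whether they were all distinct.
func drawDistinct(n int) bool {
	seen := make(map[[16]byte]bool, n)
	for i := 0; i < n; i++ {
		var id [16]byte
		if _, err := rand.Read(id[:]); err != nil {
			panic(err) // only fails if the OS RNG itself is broken
		}
		if seen[id] {
			return false
		}
		seen[id] = true
	}
	return true
}

func main() {
	fmt.Println(drawDistinct(100000)) // prints true
}
```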


@rmannibucau commented on GitHub (Nov 16, 2022):

@KN4CK3R ok, I have to admit I don't care much what the solution is, but I need one - and concurrency control is not sufficient.


@KN4CK3R commented on GitHub (Nov 16, 2022):

Sure, we all want a solution and that's the attitude the open source community needs. If you can propose a solution we will happily implement it.


@rmannibucau commented on GitHub (Nov 16, 2022):

@KN4CK3R I have to admit that if the issue is not the id, I'm not sure what it is, so it's hard to help :s


@KN4CK3R commented on GitHub (Nov 16, 2022):

https://github.com/go-gitea/gitea/issues/19586#issuecomment-1317191633


@rmannibucau commented on GitHub (Nov 16, 2022):

I'm using jib (with `serialize=true` to get a single thread) and I observe the same behavior, but the files (layers) are likely not the same. Would using a single-threaded queue be an option in Gitea, if your analysis is right? It should be easy to test.
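The single-threaded-queue idea mentioned here could be sketched as one worker goroutine that serializes all version inserts, so only one insert is ever in flight. The names below (`insertRequest`, `processInserts`) are illustrative only, not Gitea's API.

```go
package main

import "fmt"

// insertRequest carries one version-insert request to the worker.
type insertRequest struct {
	key  string
	done chan error
}

// processInserts is the single worker: because it alone touches the
// "table", check-then-insert cannot race with another request.
func processInserts(reqs <-chan insertRequest) {
	seen := map[string]bool{} // stands in for the package_version table
	for r := range reqs {
		if seen[r.key] {
			r.done <- nil // already created, nothing to do
			continue
		}
		seen[r.key] = true
		r.done <- nil
	}
}

func main() {
	reqs := make(chan insertRequest)
	go processInserts(reqs)
	for i := 0; i < 3; i++ {
		done := make(chan error)
		reqs <- insertRequest{key: "org/app:latest", done: done}
		fmt.Println(<-done) // <nil> each time: duplicates are absorbed
	}
	close(reqs)
}
```

The trade-off is throughput: every push in the instance funnels through one goroutine, which is why per-key locking or insert-and-retry are more common fixes.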


@wxiaoguang commented on GitHub (May 4, 2023):

I think this issue is related to #19586 and has likely been resolved.

Feel free to reopen if there is still any problem.

Reference: github-starred/gitea#9840