Upgrade from v1.7 to v1.8 removed all deploy keys. #3530

Closed
opened 2025-11-02 05:15:59 -06:00 by GiteaMirror · 9 comments
Owner

Originally created by @jakimfett on GitHub (Jun 30, 2019).

  • Gitea version (or commit ref):
    v1.7.0+dev-196-g7f38e2d0d built with go1.11.2 : bindata, pam, sqlite, sqlite_unlock_notify
  • Git version: v2.11.0
  • Operating system: Debian v9.9
  • Database (use [x]):
    • [ ] PostgreSQL
    • [ ] MySQL
    • [ ] MSSQL
    • [x] SQLite
  • Can you reproduce the bug at https://try.gitea.io:
    • [ ] Yes (provide example URL)
    • [ ] No
    • [x] Not relevant
  • Log gist:
    https://build.jakimfett.com/jobs/gitea/23

Description

When upgrading from gitea.1.7.0+dev-196-g7f38e2d0d to v1.8.3+1-ge94a84248, all deploy keys were wiped from the repositories in my instance.

Screenshots

n/a - no screenshots exist of the keys existing, and taking a screenshot of an empty interface seems redundant.

GiteaMirror added the issue/stale and type/bug labels 2025-11-02 05:15:59 -06:00

@zeripath commented on GitHub (Jun 30, 2019):

This is weird, as I don't think there's any good reason why that might happen. I've checked both trees and my PR #5939 is in both of them.

Did you have a backup of your db? I would be interested to find out what happened.


@jakimfett commented on GitHub (Jul 1, 2019):

Sadly, no.

I have the un-modified git-tracked directory that I compiled the 1.7 version with, if you'd like a tarball of that.
I was using the release v1.8 tag, but the 1.7 build was off master, if that makes any difference?

Honestly, I was more confused than anything else; it's a new-ish instance with mostly mirrors, so there's nothing lost or really needing to be fixed, and I even debated reporting it.


@zeripath commented on GitHub (Jul 1, 2019):

OK, looking at the diff between these I can't see any reason for this to happen. I literally have no idea what could cause this!

Could you check your sqlite db to see if there is another table that could be your old deploy key table?


@jakimfett commented on GitHub (Jul 4, 2019):

@zeripath commented:
Could you check your sqlite db...

Yup, absolutely.

...

How, exactly, would I go about doing that?
(I normally play with mariaDB or similar.)

Also, what table would I be looking for?
Is there a nomenclature for tables being migrated in a minor version (e.g., v1.7-->v1.8) bump?


@zeripath commented on GitHub (Jul 4, 2019):

Either use `sqlite3 db_file` or get hold of sqlitebrowser.

The table name should always be deploy_key. I just wonder if somehow another table has been created - if so it would have deploy_key as a prefix. I don't know how such a table would be created, though.
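The check described above can also be scripted. Below is a minimal sketch in Python's standard-library sqlite3 module that lists every table whose name starts with deploy_key, along with its row count; the function name and the idea of counting rows are my own additions, not anything from this thread.

```python
import sqlite3

def list_deploy_key_tables(db_path):
    """Return (table_name, row_count) for every table whose name starts
    with deploy_key -- e.g. a stray copy left behind by a migration."""
    con = sqlite3.connect(db_path)
    try:
        # sqlite_master lists every object in the database file.
        names = [row[0] for row in con.execute(
            "SELECT name FROM sqlite_master "
            "WHERE type = 'table' AND name LIKE 'deploy_key%'")]
        return [(name,
                 con.execute(f'SELECT COUNT(*) FROM "{name}"').fetchone()[0])
                for name in names]
    finally:
        con.close()
```

Pointed at the reporter's database this would be called as `list_deploy_key_tables("/home/gitea/run/data/gitea.db")`; a healthy instance should report only the single deploy_key table.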


@jakimfett commented on GitHub (Jul 4, 2019):

Here are my tables:

sqlite> .databases
main: /home/gitea/run/data/gitea.db
sqlite> .tables
access                     oauth2_grant
access_token               oauth2_session
action                     org_user
attachment                 protected_branch
collaboration              public_key
comment                    pull_request
commit_status              reaction
deleted_branch             release
deploy_key                 repo_indexer_status
email_address              repo_redirect
external_login_user        repo_topic
follow                     repo_unit
gpg_key                    repository
hook_task                  review
issue                      star
issue_assignees            stopwatch
issue_dependency           team
issue_label                team_repo
issue_user                 team_unit
issue_watch                team_user
label                      topic
lfs_lock                   tracked_time
lfs_meta_object            two_factor
login_source               u2f_registration
milestone                  upload
mirror                     user
notice                     user_open_id
notification               version
oauth2_application         watch
oauth2_authorization_code  webhook
sqlite>

And the table:

sqlite> .schema deploy_key
CREATE TABLE `deploy_key` (
`id` INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL, 
`key_id` INTEGER NULL, 
`repo_id` INTEGER NULL, 
`name` TEXT NULL, 
`fingerprint` TEXT NULL, 
`mode` INTEGER DEFAULT 1 NOT NULL, 
`created_unix` INTEGER NULL, 
`updated_unix` INTEGER NULL
);
CREATE UNIQUE INDEX `UQE_deploy_key_s` ON `deploy_key` (`key_id`,`repo_id`);
CREATE INDEX `IDX_deploy_key_key_id` ON `deploy_key` (`key_id`);
CREATE INDEX `IDX_deploy_key_repo_id` ON `deploy_key` (`repo_id`);
sqlite> SELECT * FROM deploy_key;
1|10|2|user@server|SHA256:<snip>|<mode>|1559508132|1560435910
2|11|2|user@server|SHA256:<snip>|<mode>|1559520995|1559520995
4|13|2|user@server|SHA256:<snip>|<mode>|1560027077|1560179849
5|14|18|user@server|SHA256:<snip>|<mode>|1561823160|1561823160
6|14|21|user@server|SHA256:<snip>|<mode>|1561908025|1561920159
sqlite>

Hopefully this tells you more than it tells me.


@zeripath commented on GitHub (Jul 4, 2019):

OK, that implies you should have some deploy keys: in particular, 3 keys for repo 2 and 1 key that covers repos 18 and 21.

It would be interesting to know which of those keys have different fingerprints. I would hope that all 3 keys for repo 2 have different fingerprints and are different from key 14.
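The fingerprint comparison suggested here can be done with a single GROUP BY over the deploy_key table shown earlier. A minimal sketch, again using Python's sqlite3 module (the function name is hypothetical, and the column set is taken from the schema dumped above):

```python
import sqlite3

def deploy_key_fingerprints(db_path):
    """Group deploy_key rows by fingerprint so shared keys stand out.

    Each result row is (fingerprint, key_ids, repo_ids); a fingerprint
    appearing with more than one repo_id is one key reused across repos.
    """
    con = sqlite3.connect(db_path)
    try:
        return con.execute(
            "SELECT fingerprint, "
            "       GROUP_CONCAT(DISTINCT key_id), "
            "       GROUP_CONCAT(repo_id) "
            "FROM deploy_key "
            "GROUP BY fingerprint").fetchall()
    finally:
        con.close()
```

On the data above, the expectation voiced in this comment would translate to: the three rows for repo 2 (key_ids 10, 11, 13) each show a distinct fingerprint, while key_id 14 shows one fingerprint spanning repos 18 and 21.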


@stale[bot] commented on GitHub (Sep 2, 2019):

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs during the next 2 weeks. Thank you for your contributions.


@jakimfett commented on GitHub (Sep 7, 2019):

I haven't had time to try to reproduce this, and the server I was developing on is in storage for a move.

Closing this until/unless I run into the issue again.


Reference: github-starred/gitea#3530