Actions workflows stall out cloning docker scout action #13971

Open
opened 2025-11-02 10:58:46 -06:00 by GiteaMirror · 4 comments
Owner

Originally created by @cyberops7 on GitHub (Jan 12, 2025).

Description

When I include a workflow step for docker/scout-action@v1 (https://github.com/docker/scout-action), the run stalls while trying to git clone the necessary repository. It is a large repository (several GB) compared to the other actions I use. The Set up Job step runs and successfully clones the other actions it is going to use, but when it gets to cloning the docker scout action, it is never able to move past that step, remaining (seemingly indefinitely) at:

☁  git clone 'https://github.com/docker/scout-action' # ref=v1
  cloning https://github.com/docker/scout-action to /root/.cache/act/754ec737f035b7cc4d77b1b2cb4e77fb44b95373b198fcf3cc596ab23f50d816

It never gets to the steps where the logs state "Cloned https://github.com/..." and then "Checked out v{target_tag_here}".
Even after letting the action run sit there for 30m, this is the contents of that directory:

~/.cache/act/754ec737f035b7cc4d77b1b2cb4e77fb44b95373b198fcf3cc596ab23f50d816/.git # ls -lah
total 24K
drwxr-xr-x    4 root     root        4.0K Jan 12 18:14 .
drwxr-xr-x    3 root     root        4.0K Jan 12 18:14 ..
-rw-r--r--    1 root     root          23 Jan 12 18:14 HEAD
-rw-r--r--    1 root     root         130 Jan 12 18:14 config
drwxr-xr-x    4 root     root        4.0K Jan 12 18:14 objects
drwxr-xr-x    4 root     root        4.0K Jan 12 18:14 refs

It did download a lot, evidenced by:

~/.cache/act/754ec737f035b7cc4d77b1b2cb4e77fb44b95373b198fcf3cc596ab23f50d816/.git/objects/pack # ls -lah
total 6G
drwxr-xr-x    2 root     root        4.0K Jan 12 18:14 .
drwxr-xr-x    4 root     root        4.0K Jan 12 18:14 ..
-rw-------    1 root     root        6.1G Jan 12 18:18 tmp_pack_3230832379

but it never unpacks what it downloaded.

I can manually run git clone 'https://github.com/docker/scout-action' inside the pod/container right into /root, and that completes just fine:

~ # git clone 'https://github.com/docker/scout-action'
Cloning into 'scout-action'...
remote: Enumerating objects: 1899, done.
remote: Counting objects: 100% (22/22), done.
remote: Compressing objects: 100% (19/19), done.
remote: Total 1899 (delta 9), reused 3 (delta 3), pack-reused 1877 (from 4)
Receiving objects: 100% (1899/1899), 6.26 GiB | 13.64 MiB/s, done.
Resolving deltas: 100% (1539/1539), done.
Updating files: 100% (16/16), done.
~ #

with this file structure:

~/scout-action/.git # ls -lah
total 52K
drwxr-xr-x    8 root     root        4.0K Jan 12 18:45 .
drwxr-xr-x    5 root     root        4.0K Jan 12 18:45 ..
-rw-r--r--    1 root     root          21 Jan 12 18:45 HEAD
drwxr-xr-x    2 root     root        4.0K Jan 12 18:32 branches
-rw-r--r--    1 root     root         259 Jan 12 18:45 config
-rw-r--r--    1 root     root          73 Jan 12 18:32 description
drwxr-xr-x    2 root     root        4.0K Jan 12 18:32 hooks
-rw-r--r--    1 root     root        1.6K Jan 12 18:45 index
drwxr-xr-x    2 root     root        4.0K Jan 12 18:32 info
drwxr-xr-x    3 root     root        4.0K Jan 12 18:45 logs
drwxr-xr-x    4 root     root        4.0K Jan 12 18:32 objects
-rw-r--r--    1 root     root        2.7K Jan 12 18:45 packed-refs
drwxr-xr-x    5 root     root        4.0K Jan 12 18:45 refs
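One mitigation worth noting (not something the runner exposed here, as far as this thread shows) is a shallow clone pinned to the tag, which transfers a single commit instead of the multi-gigabyte history. A minimal, self-contained sketch, using a throwaway local repository as a stand-in for the real scout-action remote:

```shell
#!/bin/sh
set -eu

# Throwaway "remote" with two commits and a v1 tag, standing in for
# https://github.com/docker/scout-action (hypothetical stand-in repo).
remote=$(mktemp -d)
git -C "$remote" init -q
git -C "$remote" -c user.name=t -c user.email=t@example.com \
    commit -q --allow-empty -m "initial"
git -C "$remote" -c user.name=t -c user.email=t@example.com \
    commit -q --allow-empty -m "second"
git -C "$remote" tag v1

# --depth 1 --branch v1 fetches only the tagged commit; the file:// URL
# forces git's network transport, which is what honors --depth
# (a plain local path would ignore it with a warning).
work=$(mktemp -d)
git clone -q --depth 1 --branch v1 "file://$remote" "$work/scout-action"

# Only one commit was transferred, regardless of the full history size.
count=$(git -C "$work/scout-action" rev-list --count HEAD)
echo "commits fetched: $count"
```

Against the real repository this would still download the binaries in the tagged tree, but not the packed history behind the 6.1G tmp_pack seen above.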

Given the difference in directory sizes here, it does seem like the workflow's download never finishes (there are a few other, much smaller, directories in the .cache as well):
[Screenshot: directory sizes under ~/.cache/act]

There seems to be ample disk space:

❯ kubectl -n gitea exec -it gitea-act-runner-0 -c act-runner -- /bin/ash
/data # df -h /
Filesystem                Size      Used Available Use% Mounted on
overlay                  98.2G     32.3G     61.6G  34% /

Gitea Version

1.22.3

Can you reproduce the bug on the Gitea demo site?

No

Log Gist

No response

Screenshots

No response

Git Version

No response

Operating System

No response

How are you running Gitea?

I am running it in kubernetes using the Gitea helm chart.

Database

None

GiteaMirror added the topic/gitea-actions, issue/workaround, issue/not-a-bug labels 2025-11-02 10:58:46 -06:00
Author
Owner

@cyberops7 commented on GitHub (Jan 13, 2025):

Oh geez... well, apparently if you give it an hour (ok, 59 minutes), it finally finishes. Something is off, though, since manually running git clone for that repo (inside the same container, on the same host) takes significantly less time.

Jan 12, 2025, 11:14 AM
  ☁  git clone 'https://github.com/docker/scout-action' # ref=v1
Jan 12, 2025, 11:14 AM
  cloning https://github.com/docker/scout-action to /root/.cache/act/754ec737f035b7cc4d77b1b2cb4e77fb44b95373b198fcf3cc596ab23f50d816
Jan 12, 2025, 12:13 PM
Cloned https://github.com/docker/scout-action to /root/.cache/act/754ec737f035b7cc4d77b1b2cb4e77fb44b95373b198fcf3cc596ab23f50d816
Jan 12, 2025, 12:14 PM
Checked out v1
Author
Owner

@crazy-max commented on GitHub (May 8, 2025):

That's because the git repo contains binaries so it takes a long time to checkout: https://github.com/docker/scout-action/tree/main/dist

I have created this composite action to avoid that: https://github.com/crazy-max/.github?tab=readme-ov-file#docker-scout

You can check its behavior here: https://github.com/docker/buildx/actions/runs/14847110731/job/41684892782#step:3:1
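For context, the reason the composite-action route is fast is that it skips cloning scout-action entirely and installs the standalone Scout CLI binary instead. A hedged sketch of the underlying idea as a CI step fragment (the install-script URL follows the docker/scout-cli README and should be verified there; the image name is a placeholder):

```shell
# Download and run the Scout CLI install script (a few MB) instead of
# cloning the ~6 GB scout-action repo. URL per docker/scout-cli's README;
# verify against that repo before relying on it.
curl -fsSL https://raw.githubusercontent.com/docker/scout-cli/main/install.sh -o install-scout.sh
sh install-scout.sh

# Then scan directly; "myorg/myimage:latest" is a placeholder image.
docker scout cves myorg/myimage:latest
```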

Author
Owner

@DaanSelen commented on GitHub (Aug 8, 2025):

> That's because the git repo contains binaries so it takes a long time to checkout: https://github.com/docker/scout-action/tree/main/dist
>
> I have created this composite action to avoid that: https://github.com/crazy-max/.github?tab=readme-ov-file#docker-scout
>
> You can check its behavior here: https://github.com/docker/buildx/actions/runs/14847110731/job/41684892782#step:3:1

Thank you! It looks to be working well! Thanks!

Author
Owner

@DaanSelen commented on GitHub (Aug 9, 2025):

> That's because the git repo contains binaries so it takes a long time to checkout: https://github.com/docker/scout-action/tree/main/dist
>
> I have created this composite action to avoid that: https://github.com/crazy-max/.github?tab=readme-ov-file#docker-scout
>
> You can check its behavior here: https://github.com/docker/buildx/actions/runs/14847110731/job/41684892782#step:3:1

Building on top of this, I was just missing the actual overview of the CVEs, so here is my custom version expanding on Max's work: https://github.com/DaanSelen/composite-actions/tree/main/docker-scout


Reference: github-starred/gitea#13971