[GH-ISSUE #2122] Cannot run ollama on my server using the docker image, error 132 #63252

Closed
opened 2026-05-03 12:44:56 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @GuiPoM on GitHub (Jan 21, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2122

Hello,

This is the first time I have faced such an issue: I cannot run the container at all, it crashes as soon as it is deployed.
I am not sure which information would be useful for debugging; my host is a Debian 12 server with Docker 25 CE.

I was first deploying using a compose file, but I switched back to the docker command line to double-check:
`docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama`
It creates the volume, but the container crashes with exit code 132:

```
State
Dead        false
Error
ExitCode    132
FinishedAt  2024-01-21T10:24:09.726297577Z
OOMKilled   false
Paused      false
Pid         0
Restarting  false
Running     false
StartedAt   2024-01-21T10:24:09.724212624Z
Status      exited
```

Beyond that I have no clue what is going on; I was not able to find a reference to error 132 in the source code that could help me do further checks.
Maybe you will have some ideas! Thanks!
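For context (this decoding is not stated in the thread, but is the usual Unix convention): exit codes above 128 mean the process was killed by signal (code − 128), so 132 corresponds to signal 4, SIGILL (illegal instruction) — consistent with a binary using CPU instructions the host does not support:

```shell
# Exit codes above 128 conventionally mean "killed by signal (code - 128)".
code=132
sig=$((code - 128))                    # 4
echo "signal $sig: $(kill -l "$sig")"  # prints the signal name (ILL on Linux)
```

This is why no reference to "error 132" exists in the Ollama source: the code comes from the shell/runtime convention, not from the application.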


@Shadoweee77 commented on GitHub (Jan 21, 2024):

I have the same exact issue


@GuiPoM commented on GitHub (Jan 22, 2024):

I guess it has something to do with support for AVX instructions. I am using an Intel Gold 6400, a socket 1200, Comet Lake generation chip that only supports SSE 4.1 and 4.2, unlike the i5 I also have (same socket and generation), which supports AVX.
If someone can confirm ... thanks!
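One quick way to check this theory on each host (an illustrative sketch, not from the thread) is to look for AVX among the CPU flags the kernel reports:

```shell
# Empty output means the CPU does not advertise any AVX variant.
grep -o 'avx[0-9a-z_]*' /proc/cpuinfo | sort -u

# The same extraction works on any flags line, e.g.:
echo 'flags : fpu sse4_1 sse4_2 avx avx2' | grep -o 'avx[0-9a-z_]*' | sort -u
# -> avx
#    avx2
```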


@dhiltgen commented on GitHub (Jan 26, 2024):

@GuiPoM can you try running without daemon mode (drop the `-d` flag) to see if there is any output before the exit/crash?

Also pull the image (`docker pull ollama/ollama`) to make sure you get the latest version.
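Concretely, the two suggestions combine into the following (the `--rm` flag is added here for convenience and is not part of the original advice):

```shell
# Refresh the local image from Docker Hub first.
docker pull ollama/ollama

# Run attached (no -d) so any output before the crash lands on the terminal;
# --rm removes the dead container afterwards.
docker run --rm -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
```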


@GuiPoM commented on GitHub (Jan 27, 2024):

> @GuiPoM can you try running without daemon mode (drop the `-d` flag) to see if there is any output before the exit/crash?
>
> Also make sure to pull the image (`docker pull ollama/ollama`) to make sure you get the latest version.

Thank you for your answer. I do not know if you made the link with the other conversation we had in issue #1279 about support for CPUs without AVX, but the rc image you shared with me is working fine.
I got it working on this platform (CPU without AVX, no GPU), on another one (CPU with AVX, no GPU), and on a final one (CPU with AVX and an NVIDIA GPU); all three start fine.

So I guess the "latest" ollama image is now old and does not provide the latest enhancements needed to deploy it.
I can do the check without `-d` if you think it is useful, but as the rc image works, I guess we can say my issue is closed, right?


@dhiltgen commented on GitHub (Jan 28, 2024):

Great to hear the latest release is working for you!

> So I guess the "latest" ollama image is now old and does not provide the latest enhancements needed to deploy it.

We do update the `latest` tag on every release, but depending on your container runtime and how you run the container, "latest" can grow stale on your system. Running `docker pull ollama/ollama` will ensure you're picking up the actual latest image from Docker Hub.

It sounds like we can close this now.

Reference: github-starred/ollama#63252