[GH-ISSUE #3730] Startup error after upgrading to the latest version - windows subprocess crash on 0.1.32 #28055

Closed
opened 2026-04-22 05:48:00 -05:00 by GiteaMirror · 7 comments

Originally created by @hyanqing1 on GitHub (Apr 18, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3730

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

After upgrading to the latest version, 0.1.32, startup fails with the following error:
Error: llama runner process no longer running: 3221225785
Reinstalling 0.1.31 restored normal startup.
I am on Windows 10.
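As an aside (not from the original report): large decimal exit statuses like 3221225785 on Windows are NTSTATUS values in disguise, and converting them to hex usually makes the crash searchable. A minimal sketch:

```python
# Decode a Windows process exit status into its hexadecimal NTSTATUS form.
# 3221225785 is the decimal status from the error above; its hex form is
# 0xC0000139, which corresponds to STATUS_ENTRYPOINT_NOT_FOUND (typically
# a DLL/export mismatch) -- worth cross-checking against ntstatus.h.
def decode_exit_status(status: int) -> str:
    """Return the exit status formatted as an 8-digit hex NTSTATUS code."""
    return f"0x{status & 0xFFFFFFFF:08X}"

print(decode_exit_status(3221225785))  # 0xC0000139
```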

OS

Windows

GPU

Intel

CPU

Intel

Ollama version

0.1.32

GiteaMirror added the bug, windows labels 2026-04-22 05:48:00 -05:00

@remy415 commented on GitHub (Apr 18, 2024):

Can you please post your log files? Please refer to [this FAQ](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md?plain=1#L24)


@chenshenghao commented on GitHub (Apr 19, 2024):

I have a similar issue. It just returns: Error: llama runner process no longer running: 1
Because I'm running on a remote node, I do not have access to the ollama log:
Hint: You are currently not seeing messages from other users and the system.
Users in the 'systemd-journal' group can see all messages. Pass -q to
turn off this notice.

Is there any other way to get the log about the error?
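One possible way around the journal-group restriction, sketched here under the assumption that the remote node runs ollama as the systemd unit `ollama`:

```shell
# Read the ollama service log with root privileges, bypassing the
# systemd-journal group restriction mentioned in the hint above
sudo journalctl -u ollama --no-pager -n 200

# Or have an administrator add your user to the journal group
# (takes effect after logging in again)
sudo usermod -aG systemd-journal "$USER"
journalctl -u ollama -e
```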


@remy415 commented on GitHub (Apr 19, 2024):

What is your configuration? You said you are using a remote node, how are you accessing the remote node?

The error message suggests you are trying to execute `ollama run <model>` without the service running. If you’re running Windows 10, Ollama would be running as a background service on the desktop. Is the remote node running Windows 10?


@chenshenghao commented on GitHub (Apr 22, 2024):

> What is your configuration? You said you are using a remote node, how are you accessing the remote node?
>
> The error message suggests you are trying to execute `ollama run <model>` without the service running. If you’re running Windows 10, Ollama would be running as a background service on the desktop. Is the remote node running windows 10?

On a Debian 10 remote node. I tried to manually start the service using the following commands and confirmed that the ollama process started.
sudo systemctl daemon-reload
sudo systemctl enable ollama
sudo systemctl start ollama

Then I execute `ollama run llama3`, but the above error persists.


@remy415 commented on GitHub (Apr 22, 2024):

You will need two terminals or use something like tmux or screen:

You can stop the daemon with `systemctl stop ollama`, then in one terminal run `OLLAMA_DEBUG=1 ollama serve` and in the other run `ollama run llama3`. The logs we need are from the `ollama serve` window. You can pipe it to `tee` if you want to save it to a file too: `OLLAMA_DEBUG=1 ollama serve | tee -a /PATH/TO/FILE`
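Collected as a shell sketch (the unit name `ollama` and the log path are assumptions):

```shell
# Stop the managed service so the server can be run in the foreground
sudo systemctl stop ollama

# Terminal 1: start the server with debug logging, also saving it to a file
OLLAMA_DEBUG=1 ollama serve 2>&1 | tee -a ~/ollama-serve.log

# Terminal 2: reproduce the failure while the server above is running
ollama run llama3
```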


@dhiltgen commented on GitHub (May 5, 2024):

@hyanqing1 sorry that you ran into problems with 0.1.32 on Windows. We've fixed a number of bugs in 0.1.33 related to Windows subprocess handling, which should hopefully resolve your problem. Please try upgrading to 0.1.33 and if you still see problems, please share your server log.


@dhiltgen commented on GitHub (May 21, 2024):

If you're still having problems, please upgrade to the latest version and if that doesn't resolve it, please share your server log and I'll reopen the issue.


Reference: github-starred/ollama#28055