[GH-ISSUE #7905] ollama 0.4.7 results in: 127 #5059

Closed
opened 2026-04-12 16:09:06 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @bucovaina on GitHub (Dec 2, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7905

What is the issue?

On a freshly installed RHEL8 host, ollama 0.4.7 does not start. It does when I install 0.4.6. The only change I made to /etc/systemd/system/ollama.service is setting OLLAMA_HOST to 0.0.0.0.
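For reference, the usual way to set OLLAMA_HOST without editing the unit file that the installer manages is a systemd drop-in (a sketch; the author edited /etc/systemd/system/ollama.service directly, which the installer may overwrite on upgrade):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
# (create with `systemctl edit ollama`, then `systemctl daemon-reload`
#  and `systemctl restart ollama`)
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
```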

[root@host ~]# curl -X POST http://localhost:11434/api/generate -d '{
>   "model": "phi3",
>   "prompt":"Why is the sky blue?"
>  }'
{"error":"llama runner process has terminated: exit status 127"}[root@host ~]# ollama run phi3
Error: llama runner process has terminated: exit status 127
[root@host ~]# curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.4.6 sh
>>> Installing ollama to /usr/local
>>> Downloading Linux amd64 bundle
######################################################################## 100.0%
>>> Adding ollama user to render group...
>>> Adding ollama user to video group...
>>> Adding current user to ollama group...
>>> Creating ollama systemd service...
>>> Enabling and starting ollama service...
>>> NVIDIA GPU installed.
[root@host ~]# curl -X POST http://localhost:11434/api/generate -d '{
  "model": "phi3",
  "prompt":"Why is the sky blue?"
 }'
{"model":"phi3","created_at":"2024-12-02T08:51:31.215490419Z","response":"The","done":false}
{"model":"phi3","created_at":"2024-12-02T08:51:31.243051448Z","response":" sky","done":false}
{"model":"phi3","created_at":"2024-12-02T08:51:31.270859148Z","response":" appears","done":false}
{"model":"phi3","created_at":"2024-12-02T08:51:31.299653528Z","response":" pre","done":false}
{"model":"phi3","created_at":"2024-12-02T08:51:31.329058248Z","response":"domin","done":false}
{"model":"phi3","created_at":"2024-12-02T08:51:31.359087445Z","response":"antly","done":false}
{"model":"phi3","created_at":"2024-12-02T08:51:31.389054145Z","response":" blue","done":false}
{"model":"phi3","created_at":"2024-12-02T08:51:31.418885377Z","response":" to","done":false}
{"model":"phi3","created_at":"2024-12-02T08:51:31.448781861Z","response":" the","done":false}
...
...
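An aside on the error itself: exit status 127 is the conventional POSIX shell code for "command not found" (or a binary whose loader/interpreter is missing), which is why it often points at a runner binary or shared library that cannot be located rather than a crash. A minimal demonstration:

```shell
# Invoking a path that does not exist yields exit status 127,
# the same code the llama runner reports in the log above.
/nonexistent-binary 2>/dev/null
echo "exit status: $?"   # prints: exit status: 127
```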

My GPU is officially not supported, so this might be because of that. If you feel like closing this as "won't fix" because it's highly likely due to the Tesla M6, feel free to do so.

[root@host ~]# lshw -class video
  *-display                 
       description: VGA compatible controller
       product: bochs-drmdrmfb
       physical id: 1
       bus info: pci@0000:00:01.0
       logical name: /dev/fb0
       version: 02
       width: 32 bits
       clock: 33MHz
       capabilities: vga_controller bus_master rom fb
       configuration: depth=32 driver=bochs-drm latency=0 resolution=1280,800
       resources: irq:0 memory:80000000-80ffffff memory:8304b000-8304bfff memory:c0000-dffff
  *-display
       description: VGA compatible controller
       product: GM204GL [Tesla M6]
       vendor: NVIDIA Corporation
       physical id: 0
       bus info: pci@0000:01:00.0
       logical name: /dev/fb0
       version: a1
       width: 64 bits
       clock: 33MHz
       capabilities: pm msi pciexpress vga_controller bus_master cap_list fb
       configuration: depth=32 driver=nvidia latency=0 mode=1280x800 visual=truecolor xres=1280 yres=800
       resources: iomemory:38000-37fff iomemory:38000-37fff irq:44 memory:81000000-81ffffff memory:380000000000-38000fffffff memory:380010000000-380011ffffff ioport:7000(size=128)
[root@host ~]# 

This is also a VM on a proxmox host running on a dual Intel e5-2667v3.

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.4.7

GiteaMirror added the needs more info and bug labels 2026-04-12 16:09:07 -05:00
Author
Owner

@rick-github commented on GitHub (Dec 2, 2024):

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) would aid in debugging.
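On a systemd install the server logs can be pulled with `journalctl -u ollama`, per the linked troubleshooting doc. Since exit status 127 frequently means a runner dependency failed to resolve, `ldd` on the runner binary is another quick check (a sketch; `/bin/sh` stands in here for the actual runner path, which varies by install):

```shell
# ldd lists a binary's shared-library dependencies and prints
# "not found" for any that the loader cannot resolve.
# Substitute the real runner path for /bin/sh when debugging.
ldd /bin/sh > /dev/null && echo "all libraries resolved"
```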

Reference: github-starred/ollama#5059