[GH-ISSUE #4184] Warning: could not connect to a running Ollama instance #49115

Closed
opened 2026-04-28 10:45:51 -05:00 by GiteaMirror · 8 comments
Owner

Originally created by @rkuo2000 on GitHub (May 5, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4184

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

curl -fsSL https://ollama.com/install.sh | sh

ollama -v
Warning: could not connect to a running Ollama instance
Warning: client version is 0.1.33

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.1.33

GiteaMirror added the bug, nvidia, linux labels 2026-04-28 10:45:51 -05:00

@dhiltgen commented on GitHub (May 5, 2024):

Can you share your server log?

https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md


@rkuo2000 commented on GitHub (May 5, 2024):

> Can you share your server log?
>
> https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md

There is no running Ollama instance, so how can I get a log if it is not running at all?
The binary is downloaded to /usr/local/bin/ollama, but it runs nothing and only shows the version.
ollama run llama3 → nothing runs

v0.1.32 was running OK. How can I fall back to the previous version?


@dhiltgen commented on GitHub (May 5, 2024):

If you ran the install script, then it should be set up to run as a system service. Running systemctl status might explain more. Here's what I see on a system where I shut it down manually:

% sudo systemctl status ollama
○ ollama.service - Ollama Service
     Loaded: loaded (/etc/systemd/system/ollama.service; enabled; vendor preset: enabled)
     Active: inactive (dead) since Sat 2024-05-04 09:06:13 PDT; 1 day 6h ago
    Process: 3398 ExecStart=/usr/bin/ollama serve (code=exited, status=0/SUCCESS)
   Main PID: 3398 (code=exited, status=0/SUCCESS)
        CPU: 1min 12.848s

Apr 07 15:45:52 daniel-laptop systemd[1]: Started Ollama Service.
Apr 07 15:45:52 daniel-laptop ollama[3398]: 2024/04/07 15:45:52 images.go:996: total blobs: 10
Apr 07 15:45:52 daniel-laptop ollama[3398]: 2024/04/07 15:45:52 images.go:1003: total unused blobs removed: 0
Apr 07 15:45:52 daniel-laptop ollama[3398]: 2024/04/07 15:45:52 routes.go:564: Listening on 127.0.0.1:11434
May 04 09:06:13 daniel-laptop systemd[1]: Stopping Ollama Service...
May 04 09:06:13 daniel-laptop systemd[1]: ollama.service: Deactivated successfully.
May 04 09:06:13 daniel-laptop systemd[1]: Stopped Ollama Service.
May 04 09:06:13 daniel-laptop systemd[1]: ollama.service: Consumed 1min 12.848s CPU time.

If there's nothing obvious in that output, then try checking the log:

% sudo journalctl -u ollama > /tmp/server.log

If it is crashing or having some other problem, there should typically be information in the log.

To work around this by downgrading, you can install older versions by following the instructions here: https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#installing-older-or-pre-release-versions-on-linux
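As the linked troubleshooting doc describes, the install script honors an OLLAMA_VERSION override, so a downgrade can be sketched as a small helper like this (the function name is just for illustration):

```shell
# Hypothetical helper: pin an older Ollama release on Linux by passing
# OLLAMA_VERSION through to the official install script.
install_ollama_version() {
  ver="$1"
  curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION="$ver" sh
}

# usage: install_ollama_version 0.1.32
```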


@rkuo2000 commented on GitHub (May 6, 2024):

It works if I keep ollama serve running:
<Terminal 1> ollama serve
<Terminal 2> ollama run llama3

systemctl status ollama

Unit ollama.service could not be found.

sudo journalctl -u ollama > /tmp/server.log

From server.log, I found:
Error: could not create directory mkdir /usr/share/ollama: permission denied

May 06 04:35:29 rkuo-Z790-UD systemd[1]: Started Ollama Service.
May 06 04:35:29 rkuo-Z790-UD ollama[11979]: Couldn't find '/usr/share/ollama/.ollama/id_ed25519'. Generating new private key.
May 06 04:35:29 rkuo-Z790-UD ollama[11979]: Error: could not create directory mkdir /usr/share/ollama: permission denied
May 06 04:35:29 rkuo-Z790-UD systemd[1]: ollama.service: Main process exited, code=exited, status=1/FAILURE
May 06 04:35:29 rkuo-Z790-UD systemd[1]: ollama.service: Failed with result 'exit-code'.
May 06 04:35:30 rkuo-Z790-UD systemd[1]: Stopped Ollama Service.
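Given the "permission denied" in that log, a likely fix is to hand the data directory back to the ollama user the install script normally creates, then restart the service. A sketch (the function name and default path are assumptions based on the log above):

```shell
# Hypothetical fix: recreate /usr/share/ollama with the right owner so the
# systemd service (running as user "ollama") can write its keys and models.
fix_ollama_home() {
  dir="${1:-/usr/share/ollama}"
  sudo mkdir -p "$dir/.ollama"
  sudo chown -R ollama:ollama "$dir"
  sudo systemctl restart ollama
}

# usage: fix_ollama_home
```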


@rkuo2000 commented on GitHub (May 7, 2024):

systemctl status ollama
Unit ollama.service could not be found.

but ollama serve and ollama run llama3 both run fine in separate terminals?


@dhiltgen commented on GitHub (May 7, 2024):

@rkuo2000 it sounds like the install script didn't work correctly. What Linux Distro are you running? Did you see any warnings or errors when you tried to run the install script?

Ollama is a client-server architecture. The Linux install script sets up the server as a systemd system service, running as user ollama, and should create the directories with the right permissions. As a workaround you can follow the manual install instructions here: https://github.com/ollama/ollama/blob/main/docs/linux.md#manual-install
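For reference, the systemd unit that the manual install doc describes looks roughly like this (a sketch of /etc/systemd/system/ollama.service; check the linked doc for the exact current contents):

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3

[Install]
WantedBy=default.target
```

After creating the unit, sudo systemctl daemon-reload && sudo systemctl enable --now ollama should bring the service up.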


@rkuo2000 commented on GitHub (May 7, 2024):

Ubuntu 22.04.4 LTS
linux kernel = 6.5.0-28-generic

> @rkuo2000 it sounds like the install script didn't work correctly. What Linux Distro are you running? Did you see any warnings or errors when you tried to run the install script?
>
> Ollama is a client-server architecture. The linux install script sets up the service as a systemctl system service, running as user ollama and should create the directories with the right permissions. As a workaround you can follow the manual install instructions here https://github.com/ollama/ollama/blob/main/docs/linux.md#manual-install


@rkuo2000 commented on GitHub (May 7, 2024):

After manually removing and reinstalling, it now works as well as v0.1.32 did.
Thanks!

Reference: github-starred/ollama#49115