[GH-ISSUE #3814] Error: could not connect to ollama app, is it running? #2358

Closed
opened 2026-04-12 12:40:32 -05:00 by GiteaMirror · 8 comments
Owner

Originally created by @userandpass on GitHub (Apr 22, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3814

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

1. Modify the ollama.service file
2. systemctl daemon-reload
3. systemctl start ollama

OS

Linux

GPU

Nvidia

CPU

No response

Ollama version

ollama --version
Warning: could not connect to a running Ollama instance
Warning: client version is 0.1.32

GiteaMirror added the bug label 2026-04-12 12:40:32 -05:00

@swoh816 commented on GitHub (Apr 22, 2024):

If you recently upgraded Ollama to version 0.1.32, the cause is probably the naming of the model blobs. On a Linux machine, go to /usr/share/ollama/.ollama/models/blobs/ and check whether the blob files start with sha256:<BLOB_HASH>. If so, rename them so the colon : becomes a hyphen -, i.e. sha256-<BLOB_HASH>. After renaming, don't forget to set the user and group back to ollama:

sudo -i
cd /usr/share/ollama/.ollama/models/blobs/
for f in sha256:*; do mv "$f" "${f//:/-}"; done
cd ..
chown -R ollama blobs
chgrp -R ollama blobs

I experienced this error, and created an issue https://github.com/ollama/ollama/issues/3828.
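The rename loop above relies on bash's pattern-substitution parameter expansion. A minimal, self-contained sketch of just the substitution (the hash is made up for illustration):

```shell
#!/usr/bin/env bash
# Demonstrate the colon-to-hyphen substitution used in the rename loop.
# "sha256:abc123" stands in for a real blob name.
f="sha256:abc123"
g="${f//:/-}"   # replace every ":" in $f with "-"
echo "$g"       # prints sha256-abc123
```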


@dhiltgen commented on GitHub (May 1, 2024):

@userandpass can you share your server log?

https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md


@Mr0519 commented on GitHub (May 5, 2024):

I encountered this error on the Windows platform, and after multiple attempts I finally pinpointed the cause. The error appeared after updating ollama: the model storage location had been changed in the Windows environment settings, but after reinstalling, ollama could not find the configuration file, which caused the error. The fix is straightforward: change the model storage location back to the original setting, then copy only the model files over. If the copied models don't work in the new location, you will have to download them again. (Perhaps my experience can help you.)


@Animal-Machine commented on GitHub (May 14, 2024):

@dhiltgen here are more details for an issue that may (or may not) be the same.

How I got there

I realized I did not have enough space on my root partition to download a model, so I tried to change the model folder:

sudo systemctl edit ollama.service

adding these lines:

[Service]
Environment="OLLAMA_MODELS=~/.ollama/models/"

then reloading the service:

$ sudo systemctl daemon-reload
$ sudo systemctl restart ollama
$ ollama run llama3
Error: could not connect to ollama app, is it running?

My server log

$ sudo journalctl -u ollama | tail
May 14 16:50:03 mymachine systemd[1]: Started ollama.service - Ollama Service.
May 14 16:50:03 mymachine ollama[1558737]: 2024/05/14 16:50:03 routes.go:1006: INFO server config env="map[OLLAMA_DEBUG:false OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:1 OLLAMA_MAX_QUEUE:512 OLLAMA_MAX_VRAM:0 OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:*] OLLAMA_RUNNERS_DIR: OLLAMA_TMPDIR:]"
May 14 16:50:03 mymachine ollama[1558737]: Error: mkdir ~: permission denied
May 14 16:50:03 mymachine systemd[1]: ollama.service: Main process exited, code=exited, status=1/FAILURE
May 14 16:50:03 mymachine systemd[1]: ollama.service: Failed with result 'exit-code'.
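The "mkdir ~: permission denied" line is the key: tilde expansion is performed by a shell, not by systemd, so the service received the literal path ~/.ollama/models. A self-contained sketch of the difference, using a plain string to simulate both behaviours:

```shell
#!/usr/bin/env bash
# systemd passes Environment= values through verbatim; only a shell
# expands "~". Compare the literal string with what a shell makes of it.
dir='~/.ollama/models'
echo "$dir"                    # literal ~/.ollama/models (what systemd saw)
expanded=$(eval echo "$dir")   # what a shell would do with the same text
echo "$expanded"               # /home/<user>/.ollama/models
```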

My solution

Edit ollama.service again, replacing ~ with /home/myusername:

[Service]
Environment="OLLAMA_MODELS=/home/myusername/.ollama/models/"

Change the ownership of .ollama/models to avoid a permission denied error:

cd ~
sudo chown -R ollama:ollama .ollama/models

Now I can go back to downloading the model, and it works:

sudo systemctl daemon-reload
sudo systemctl restart ollama
ollama run llama3

@ergosumdre commented on GitHub (May 16, 2024):

sudo chown -R ollama:ollama

Thanks, this helped a lot. Since I created a different folder, I had to give ollama permissions to the new folder with:
sudo chown -R ollama:ollama FOLDER_PATH


@Animal-Machine commented on GitHub (May 16, 2024):

sudo chown -R ollama:ollama

Thanks, this helped a lot. Since I created a different folder, I had to give ollama permissions to the new folder with: sudo chown -R ollama:ollama FOLDER_PATH

You're welcome! Glad I could help.

Also, I just updated my solution; this matters if you choose to put your models into /home/username/.ollama/models. The correct command is not sudo chown -R ollama:ollama .ollama, as I first stated, but sudo chown -R ollama:ollama .ollama/models.

Indeed, the .ollama directory in my home also contains the history file, which must stay owned by my user, not ollama. Otherwise I get the following error plus a terminal freeze:

$ ollama run llama3
Error: open /home/username/.ollama/history: permission denied
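The resulting ownership split (user owns .ollama and the history file; the service user owns only the models subtree) can be sketched with a throwaway directory, so no root is needed; the real chown appears only as a comment:

```shell
#!/usr/bin/env bash
set -euo pipefail
# Throwaway stand-in for the real ~/.ollama, to show which paths the
# chown should (and should not) touch.
home=$(mktemp -d)
mkdir -p "$home/.ollama/models"   # real setup: sudo chown -R ollama:ollama ~/.ollama/models
touch "$home/.ollama/history"     # must stay owned by the login user
listing=$(cd "$home/.ollama" && ls | sort)
echo "$listing"                   # prints "history" then "models"
rm -rf "$home"
```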

@dhiltgen commented on GitHub (May 31, 2024):

@userandpass if you're still having trouble, please make sure to upgrade to the latest version and if that doesn't resolve it, please share the server logs and I'll reopen this issue.


@P-ek commented on GitHub (Dec 25, 2024):

You need to launch Ollama before you can pull or run models. For a local setup, after installing Ollama, simply run:

ollama serve

Then open a new prompt, and you can run your model:

ollama run <your model>
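Before running a model, you can probe whether the server is actually listening; 11434 is Ollama's documented default port. A dependency-free sketch using bash's /dev/tcp (adjust the port if OLLAMA_HOST is set):

```shell
#!/usr/bin/env bash
# Probe the default Ollama port without curl, via bash's /dev/tcp.
port=11434
if (exec 3<>"/dev/tcp/127.0.0.1/$port") 2>/dev/null; then
  msg="server is up"
else
  msg="server not running; start it with: ollama serve"
fi
echo "$msg"
```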

Reference: github-starred/ollama#2358