[GH-ISSUE #2727] Error: could not connect to ollama app, is it running? on Windows 10 #1638

Closed
opened 2026-04-12 11:34:42 -05:00 by GiteaMirror · 20 comments
Owner

Originally created by @Alias4D on GitHub (Feb 24, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2727

Error: could not connect to ollama app, is it running? on Windows 10

![image](https://github.com/ollama/ollama/assets/27604791/c67859cb-0233-49aa-a03a-eea3394d848e)

log file 👍

```
time=2024-02-24T14:24:23.004+03:00 level=WARN source=server.go:113 msg="server crash 1 - exit code 2 - respawning"
time=2024-02-24T14:24:23.513+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-24T14:24:24.208+03:00 level=WARN source=server.go:113 msg="server crash 24 - exit code 2 - respawning"
time=2024-02-24T14:24:24.528+03:00 level=WARN source=server.go:113 msg="server crash 2 - exit code 2 - respawning"
time=2024-02-24T14:24:24.717+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-24T14:24:25.036+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-24T14:24:27.039+03:00 level=WARN source=server.go:113 msg="server crash 3 - exit code 2 - respawning"
time=2024-02-24T14:24:27.545+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-24T14:24:29.295+03:00 level=WARN source=server.go:113 msg="server crash 11 - exit code 2 - respawning"
time=2024-02-24T14:24:29.796+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-24T14:24:30.556+03:00 level=WARN source=server.go:113 msg="server crash 4 - exit code 2 - respawning"
time=2024-02-24T14:24:30.807+03:00 level=WARN source=server.go:113 msg="server crash 15 - exit code 2 - respawning"
time=2024-02-24T14:24:31.072+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-24T14:24:31.309+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-24T14:24:33.097+03:00 level=INFO source=lifecycle.go:87 msg="Waiting for ollama server to shutdown..."
time=2024-02-24T14:24:34.350+03:00 level=INFO source=lifecycle.go:87 msg="Waiting for ollama server to shutdown..."
time=2024-02-24T14:24:35.086+03:00 level=INFO source=lifecycle.go:91 msg="Ollama app exiting"
time=2024-02-24T14:24:36.017+03:00 level=INFO source=lifecycle.go:87 msg="Waiting for ollama server to shutdown..."
time=2024-02-24T14:24:40.805+03:00 level=INFO source=lifecycle.go:91 msg="Ollama app exiting"
time=2024-02-24T14:24:46.324+03:00 level=INFO source=lifecycle.go:91 msg="Ollama app exiting"
time=2024-02-24T14:24:48.722+03:00 level=WARN source=server.go:113 msg="server crash 25 - exit code 2 - respawning"
time=2024-02-24T14:24:49.228+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
```


@Alias4D commented on GitHub (Feb 25, 2024):

I tried reinstalling ollama but get the same error message 😌
App.log below 👇

```
time=2024-02-25T17:50:52.780+03:00 level=WARN source=server.go:113 msg="server crash 1 - exit code 2 - respawning"
time=2024-02-25T17:50:53.293+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-25T17:50:53.451+03:00 level=WARN source=server.go:113 msg="server crash 30 - exit code 2 - respawning"
time=2024-02-25T17:50:53.957+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-25T17:50:54.308+03:00 level=WARN source=server.go:113 msg="server crash 2 - exit code 2 - respawning"
time=2024-02-25T17:50:54.815+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-25T17:50:56.822+03:00 level=WARN source=server.go:113 msg="server crash 3 - exit code 2 - respawning"
time=2024-02-25T17:50:57.327+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-25T17:51:00.328+03:00 level=WARN source=server.go:113 msg="server crash 4 - exit code 2 - respawning"
time=2024-02-25T17:51:00.836+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-25T17:51:04.841+03:00 level=WARN source=server.go:113 msg="server crash 5 - exit code 2 - respawning"
time=2024-02-25T17:51:05.348+03:00 level=ERROR source=server.go:116 msg="failed to restart server exec: already started"
time=2024-02-25T17:51:10.358+03:00 level=WARN source=server.go:113 msg="server crash 6 - exit code 2 - respawning"
```

![image](https://github.com/ollama/ollama/assets/27604791/dd240a31-b3ea-497d-ae77-27511986efa4)

What is the solution, please? ☺


@shreyahegde18 commented on GitHub (Feb 28, 2024):

Try this: after installing ollama, run

```bash
ollama serve
```

Let that keep running. Open another shell and run ollama commands, for example:

```bash
ollama run llama2
```

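If you're not sure whether the server from the first shell is actually up, a quick way to check is to test whether anything is listening on the default port 11434. This is a minimal sketch, not part of ollama itself:

```python
import socket

def server_reachable(host: str = "127.0.0.1", port: int = 11434, timeout: float = 1.0) -> bool:
    """Return True if a TCP listener accepts connections on host:port."""
    try:
        # create_connection completes the TCP handshake, so this confirms
        # a listener exists without sending any application data.
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print("ollama reachable:", server_reachable())
```

If this prints `False` while `ollama serve` is running, the server is probably bound to a different address or port (see the `OLLAMA_HOST` discussion further down the thread).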

@Alias4D commented on GitHub (Feb 28, 2024):

Thanks, problem fixed


@MartianInGreen commented on GitHub (Mar 12, 2024):

You might have to enable the systemctl service manually; after that, `ollama serve` should be redundant.

```bash
systemctl enable ollama
systemctl start ollama
```

@roshdwivedi commented on GitHub (Mar 21, 2024):

> You might have to enable the systemctl service manually; after that, `ollama serve` should be redundant.
>
> ```bash
> systemctl enable ollama
> systemctl start ollama
> ```

I tried this and I am getting this error, not sure what it means:

```
└─$ systemctl enable ollama
systemctl start ollama
System has not been booted with systemd as init system (PID 1). Can't operate.
Failed to connect to bus: Host is down
```

@mybr4inhurts commented on GitHub (Mar 25, 2024):

You started on a Windows host with the Windows backend, but your posted error message suggests that you are using WSL with a Linux distribution.

My wild guess is that you use the default distro, which would be Ubuntu LTS. Luckily systemd is supported and can now be activated for your WSL distro: https://learn.microsoft.com/en-us/windows/wsl/systemd

After activating systemd, you should be able to `systemctl enable ollama`.

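For reference, enabling systemd per the Microsoft doc above means adding this to `/etc/wsl.conf` inside the distro, then restarting WSL with `wsl --shutdown` from a Windows shell:

```ini
[boot]
systemd=true
```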

@DeepNeurons commented on GitHub (Apr 2, 2024):

Just to mention that the ollama service has to be launched first (on Ubuntu, verify with `systemctl status ollama.service`); then you can start working with it.


@Toppbeatz commented on GitHub (Apr 8, 2024):

Getting the "is ollama running" error right after updating to the latest version of ollama.


@jayrodathome commented on GitHub (May 21, 2024):

> Try this after installing ollama, run `ollama serve`
>
> Let that keep running. Open another shell and run ollama commands, for example: `ollama run llama2`

TY! This worked. Just had to open another shell. TY


@danydin commented on GitHub (Jun 30, 2024):

> You might have to enable the systemctl service manually; after that, `ollama serve` should be redundant.
>
> ```bash
> systemctl enable ollama
> systemctl start ollama
> ```

That's not working on Mac; also `launchctl` wasn't recognized by the zsh terminal.


@lmaddox commented on GitHub (Jul 28, 2024):

Mine are listening on specific addresses, it would seem:

```bash
netstat -tupln | grep -F :11434
```

This is what worked for me:

```bash
OLLAMA_HOST=my-host:11434 ollama run my-model
```


@GunjaGupta1233 commented on GitHub (Aug 3, 2024):

Simple problem: just run `ollama serve` and this will solve the error.


@gabriel-batistuta commented on GitHub (Jan 4, 2025):

same issue here


@rakete commented on GitHub (Jan 11, 2025):

I am having this issue right now, the small tray application won't start an ollama server, and any ollama command that I run will actually add another instance of the tray application to my systray:

![invasion-of-the-ollama](https://github.com/user-attachments/assets/a594a69d-2065-4955-a409-4da2f0168fc5)

The reason is that suddenly (after an update I guess?) there is a port exclusion range configured on my Windows 11 that includes port 11434:

```
C:\Users\Rakete› netsh interface ipv4 show excludedportrange protocol=tcp

Protocol tcp Port Exclusion Ranges

Start Port    End Port
----------    --------
     10097       10196
     10197       10296
     10297       10396
     10397       10496
     10497       10596
     10597       10696
     10697       10796
     10797       10896
     10897       10996
     11198       11297
     11298       11397
     11398       11497 <- 11434 included here
     11498       11597
     11598       11697
     11698       11797
     11798       11897
     11898       11997
     11998       12097
     50000       50059     *

* - Administered port exclusions.
```

and because of that I can't actually start ollama serve:

```
→ C:\Users\Rakete› ollama serve
Error: listen tcp 127.0.0.1:11434: bind: An attempt was made to access a socket in a way forbidden by its access permissions.
```

(same thing is also logged in server.log)

It seems it won't start even as admin.

So to get it to work with port 11434, either change the default port, or remove the exclusion range:

```
→ C:\Users\Rakete› netsh int ipv4 delete excludedportrange protocol=tcp startport=11398 numberofports=100
Access is denied.
```

but I can't!

Well, I think it's easier to just change the default port tbh, but you could also try disabling Hyper-V, then reserving that port range (so Hyper-V can't reserve it for itself), then re-enable Hyper-V. But I am not going to bother right now.

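To check programmatically whether a given port falls inside one of these exclusion ranges, here is a small hypothetical helper that parses the `netsh ... show excludedportrange` output shown above (the parsing is a sketch keyed to that output format, not an official tool):

```python
import re
import subprocess

def excluded_ranges(netsh_output: str) -> list[tuple[int, int]]:
    """Parse 'Start Port / End Port' pairs from netsh excludedportrange output."""
    ranges = []
    for line in netsh_output.splitlines():
        # Range rows look like "     11398       11497" (two integers).
        m = re.match(r"\s*(\d+)\s+(\d+)", line)
        if m:
            ranges.append((int(m.group(1)), int(m.group(2))))
    return ranges

def port_excluded(port: int, netsh_output: str) -> bool:
    return any(lo <= port <= hi for lo, hi in excluded_ranges(netsh_output))

if __name__ == "__main__":
    # On Windows, feed it the live output:
    out = subprocess.run(
        ["netsh", "interface", "ipv4", "show", "excludedportrange", "protocol=tcp"],
        capture_output=True, text=True).stdout
    print("11434 excluded:", port_excluded(11434, out))
```

If 11434 turns out to be excluded, setting `OLLAMA_HOST=127.0.0.1:<free-port>` for both the server and the client (e.g. via `setx OLLAMA_HOST ...`) sidesteps the reservation without touching Hyper-V.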

@Rainweic commented on GitHub (Feb 14, 2025):

Two ways:

1: default way

```bash
sudo systemctl start ollama
sudo systemctl status ollama
```

2: set ENV

```bash
sudo systemctl edit ollama
```

```
[Service]
Environment="OLLAMA_MODELS=/path/to/download/model"
Environment="OLLAMA_HOST=0.0.0.0:8000"
```

```bash
sudo systemctl start ollama
sudo systemctl status ollama
OLLAMA_HOST=my-host:8000 ollama run xxxx
```

@hczs commented on GitHub (Feb 17, 2025):

> Two ways:
>
> 1: default way
>
> ```bash
> sudo systemctl start ollama
> sudo systemctl status ollama
> ```
>
> 2: set ENV
>
> ```bash
> sudo systemctl edit ollama
> ```
>
> ```
> [Service]
> Environment="OLLAMA_MODELS=/path/to/download/model"
> Environment="OLLAMA_HOST=0.0.0.0:8000"
> ```
>
> ```bash
> sudo systemctl start ollama
> sudo systemctl status ollama
> OLLAMA_HOST=my-host:8000 ollama run xxxx
> ```

Thank you for your help! Your method solved the problem.


@Rainweic commented on GitHub (Feb 25, 2025):

> Thank you for your help! Your method solved the problem.

You're welcome, glad it helped, hhhh


@dklobucaric commented on GitHub (Apr 4, 2025):

Please keep in mind: if ollama won't start, check your disk space. I just had an issue where ollama runs as a systemd service and I use Docker for Open WebUI; I saw no models and ollama wouldn't start. Just my experience.

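Disk space is easy to sanity-check before starting the service or pulling a model. A minimal sketch (the 5 GiB threshold here is an arbitrary example, not an ollama requirement):

```python
import shutil

def free_gib(path: str = ".") -> float:
    """Free space at `path` in GiB."""
    return shutil.disk_usage(path).free / 2**30

if __name__ == "__main__":
    if free_gib() < 5:
        print("Low disk space: ollama may fail to start or to pull models")
```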

@xmesaj2 commented on GitHub (Apr 12, 2025):

For me it was an issue on Windows: the OLLAMA_MODELS variable in Advanced System Settings was pointing to my network share, which was offline.

Reference: github-starred/ollama#1638