[GH-ISSUE #7524] Error: could not connect to ollama app, is it running? #66841

Open
opened 2026-05-04 08:20:53 -05:00 by GiteaMirror · 21 comments

Originally created by @BongozGoBOOM on GitHub (Nov 6, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7524

Originally assigned to: @dhiltgen on GitHub.

### What is the issue?

Tried versions v0.4.0, v0.3.14, and v0.3.13; all yielded the exact same results.

![image_2024-11-06_031207237](https://github.com/user-attachments/assets/0984f689-2351-48f7-a3bd-5bfd98deb484)

Attempted to start the app through the Start menu, File Explorer, and the `ollama serve` command (in separate windows); all yielded the same results.

I checked the app.log, and every single launch was showing the same thing:

"time=2024-11-06T03:02:50.988-05:00 level=WARN source=server.go:163 msg="server crash 1 - exit code 2 - respawning""

And it'd go so on for server crash 1 - 30 so and so.
I hate making my own issue posts if I can help it but I've spent all night trying to fix this so I'd like some help.

### OS

Windows

### GPU

Nvidia

### CPU

Intel

### Ollama version

0.4.0-rc8

GiteaMirror added the bug, windows, nvidia, needs more info labels 2026-05-04 08:20:57 -05:00
@rick-github commented on GitHub (Nov 6, 2024):

Is there anything earlier in the log that indicates why the server crashed? If you post a full log it might give some insight.

@40740 commented on GitHub (Nov 6, 2024):

I also encountered the same bug

@rick-github commented on GitHub (Nov 6, 2024):

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) will help in debugging.

@40740 commented on GitHub (Nov 6, 2024):

```
Error: listen tcp 127.0.0.1:11434: bind: An attempt was made to access a socket in a way forbidden by its access permissions.
time=2024-11-06T21:01:28.391+08:00 level=ERROR source=server.go:145 msg="failed to start server failed to start server context canceled"
```

@rick-github commented on GitHub (Nov 6, 2024):

This is not a full log, and important context may be missing, which may slow the resolution of your problem.

> `An attempt was made to access a socket in a way forbidden by its access permissions`

This usually means another process is already listening on port 11434, often because the server has been started twice. What do you see if you run `tasklist`?
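
(For reference, a minimal sketch of how to check what is holding the port, using standard Windows tools; `1234` is a placeholder for whatever PID the first command reports:)

```powershell
# Find the PID of whatever is listening on 11434, if anything
netstat -ano | findstr :11434

# Look up that PID in the process list (replace 1234 with the reported PID)
tasklist /FI "PID eq 1234"
```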

@dhiltgen commented on GitHub (Nov 6, 2024):

It might also be a dup of #3362

@BongozGoBOOM commented on GitHub (Nov 6, 2024):

> [Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) will help in debugging.

[server.log](https://github.com/user-attachments/files/17651955/server.log)

Pretty much all the server logs seem to say the same thing; I'm unable to make sense of it, though.

@mattharvill commented on GitHub (Nov 6, 2024):

I'm experiencing the same issue after updating. Using Windows 11.

```
2024/11/06 11:37:46 routes.go:1158: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:C:\\Users\\XYZh\\.ollama\\models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
time=2024-11-06T11:37:46.488-08:00 level=INFO source=images.go:754 msg="total blobs: 6"
time=2024-11-06T11:37:46.488-08:00 level=INFO source=images.go:761 msg="total unused blobs removed: 0"
time=2024-11-06T11:37:46.489-08:00 level=INFO source=routes.go:1205 msg="Listening on 127.0.0.1:11434 (version 0.3.14)"
time=2024-11-06T11:37:46.490-08:00 level=ERROR source=common.go:279 msg="empty runner dir"
time=2024-11-06T11:37:46.490-08:00 level=INFO source=common.go:49 msg="Dynamic LLM libraries" runners=[]
Error: unable to initialize llm runners unable to locate runners in any search path [C:\Users\XYZh\AppData\Local\Programs\Ollama C:\Users\XYZh\AppData\Local\Programs\Ollama\windows-amd64 C:\Users\XYZh\AppData\Local\Programs\Ollama\dist\windows-amd64 C:\Users\XYZh\AppData\Local\Programs\Ollama C:\Users\XYZh\AppData\Local\Programs\Ollama\windows-amd64 C:\Users\XYZh\AppData\Local\Programs\Ollama\dist\windows-amd64 C:\Users\XYZh C:\Users\XYZh\windows-amd64 C:\Users\XYZh\dist\windows-amd64]
```

@dhiltgen commented on GitHub (Nov 6, 2024):

@BongozGoBOOM your crash is when we're trying to initialize the nvcuda library. What version of the nvidia driver are you running? My suspicion is it may be a very old version and that's causing us to hit a bug in the GPU discovery code.

@mattharvill your issue is unrelated. The runners appear to have disappeared. Can you check your AV logs to see if the binaries were incorrectly flagged as potentially malicious and deleted perhaps? If not, try uninstalling and re-installing to get the binaries back.
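
(A quick way to verify whether the runner binaries are still on disk is to list the install directory from the search paths in the error above; this is only a sketch, and the exact file layout varies by Ollama version:)

```powershell
# List everything under the default install location; runner binaries
# normally live in a windows-amd64 or dist\windows-amd64 subdirectory
Get-ChildItem -Recurse "$env:LOCALAPPDATA\Programs\Ollama" |
    Select-Object -ExpandProperty FullName
```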

@BongozGoBOOM commented on GitHub (Nov 6, 2024):

> @BongozGoBOOM your crash is when we're trying to initialize the nvcuda library. What version of the nvidia driver are you running? My suspicion is it may be a very old version and that's causing us to hit a bug in the GPU discovery code.

I'm on the latest game-ready NVIDIA driver version for my GPU, 566.03. Should I try falling back?

@dhiltgen commented on GitHub (Nov 6, 2024):

@BongozGoBOOM thanks. Let me install a matching set of software and see if I can repro. Downgrading might be a viable workaround. This looks very similar to #5625 but up to this point we had only seen it on very old drivers.

@dhiltgen commented on GitHub (Nov 6, 2024):

I wasn't able to reproduce on Win11 with 566.03.

@BongozGoBOOM can you try to capture debug logs to see if there's any additional insight there? Quit the tray application, then in PowerShell set `$env:OLLAMA_DEBUG="1"` and then try to run a model. Then you can go to the tray menu, View logs, and grab the latest server log.
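
(Those steps as a PowerShell sketch, assuming `ollama` is on `PATH`; the variable only affects processes started from that session, and the model name is just an example:)

```powershell
$env:OLLAMA_DEBUG = "1"        # enable debug logging for this shell session
ollama serve                   # run the server in the foreground
# In a second terminal, load any model to reproduce the crash, e.g.:
# ollama run llama3.2 "hello"
```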

@BongozGoBOOM commented on GitHub (Nov 6, 2024):

> I wasn't able to reproduce on Win11 with 566.03.
>
> @BongozGoBOOM can you try to capture debug logs to see if there's any additional insight there? Quit the tray application, then in PowerShell set `$env:OLLAMA_DEBUG="1"` and then try to run a model. Then you can go to the tray menu, View logs, and grab the latest server log.

[serverwithdebug.log](https://github.com/user-attachments/files/17652732/serverwithdebug.log)

I also downgraded my driver to 565.90 and am still getting the same error.

@dhiltgen commented on GitHub (Nov 6, 2024):

Unfortunately nothing obvious in the debug log. If I can't find it via code inspection, I'll get some additional debug logging added to the init routine so at least in the next release we'll be able to narrow down where we're crashing during nvcuda.dll initialization.

@mattharvill commented on GitHub (Nov 6, 2024):

> @BongozGoBOOM your crash is when we're trying to initialize the nvcuda library. What version of the nvidia driver are you running? My suspicion is it may be a very old version and that's causing us to hit a bug in the GPU discovery code.
>
> @mattharvill your issue is unrelated. The runners appear to have disappeared. Can you check your AV logs to see if the binaries were incorrectly flagged as potentially malicious and deleted perhaps? If not, try uninstalling and re-installing to get the binaries back.

Thanks for the quick reply @dhiltgen. Couldn't find anything blocked within anti-virus notes. Did a full uninstall, reinstall and got ollama back up and running. Thx for the support!

@40740 commented on GitHub (Nov 6, 2024):

Already solved, Win11: run the command `net stop Winnat`.

@40740 commented on GitHub (Nov 6, 2024):

> Solved, Win11, run the command `net stop Winnat`

This should be the problem:

https://github.com/ollama/ollama/issues/3362#issuecomment-2081299370

@dhiltgen commented on GitHub (Nov 13, 2024):

@BongozGoBOOM please try out 0.4.1 which will have more debug logging to try to help isolate where the bug is.

After updating, quit the tray app, and in a PowerShell terminal:

```powershell
$env:OLLAMA_DEBUG="1"
ollama serve
```

Then share the output.

@hanfangyuan4396 commented on GitHub (Jan 12, 2025):

> `Error: listen tcp 127.0.0.1:11434: bind: An attempt was made to access a socket in a way forbidden by its access permissions.` `time=2024-11-06T21:01:28.391+08:00 level=ERROR source=server.go:145 msg="failed to start server failed to start server context canceled"`

Solution:
You can resolve this issue by setting the `OLLAMA_HOST` environment variable to `127.0.0.1:5005`.

Root Cause:
When running `ollama serve` on Windows and encountering the error
`Error: listen tcp 127.0.0.1:11434: bind: An attempt was made to access a socket in a way forbidden by its access permissions`,
I believe this isn't an Ollama-specific issue. I tested the port's availability using `python -m http.server 11434` and encountered the same problem, and I discovered that many ports around 11434 have similar issues.

I suspect this is because Windows reserves certain ports for dynamic allocation. You can check Windows' dynamic port range (start port and number of ports) using `netsh int ipv4 show dynamicport tcp`. Port 11434 is likely within Windows' reserved dynamic port range.
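
(A diagnostic sketch for checking this, run from an elevated prompt; both are standard `netsh` commands:)

```powershell
# Show the TCP dynamic port range Windows allocates to applications
netsh int ipv4 show dynamicport tcp

# Show port ranges reserved (excluded) by Windows/Hyper-V; if 11434 falls
# inside one of these ranges, binding to it fails with the error above
netsh int ipv4 show excludedportrange protocol=tcp
```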

@rick-github commented on GitHub (Jan 13, 2025):

```
net stop winnat
netsh int ipv4 add excludedportrange protocol=tcp startport=11434 numberofports=1
net start winnat
```

@neuwcodebox commented on GitHub (Feb 22, 2025):

I'm using Windows, but suddenly I have the same problem.
`Error: listen tcp 127.0.0.1:11434: bind: An attempt was made to access a socket in a way forbidden by its access permissions.`

It was solved by using this method:

```
net stop winnat
netsh int ipv4 add excludedportrange protocol=tcp startport=11434 numberofports=1
net start winnat
```
Reference: github-starred/ollama#66841