[GH-ISSUE #3978] Error: Head "http://127.0.0.1:11434/": dial tcp 127.0.0.1:11434: connectex: #48973

Closed
opened 2026-04-28 10:22:15 -05:00 by GiteaMirror · 11 comments

Originally created by @jannoname on GitHub (Apr 27, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3978

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

The latest Windows version of Ollama doesn't work on my laptop. The client can't connect to port 11434 anymore, whether the port is free or Ollama has bound it itself.

```
C:\Users\XXX>ollama list
Error: Head "http://127.0.0.1:11434/": dial tcp 127.0.0.1:11434: connectex: No connection could be made because the target machine actively refused it.

C:\Users\XXX>netstat -ano | find "11434"

C:\Users\XXX>ollama list
Error: Head "http://127.0.0.1:11434/": dial tcp 127.0.0.1:11434: connectex: No connection could be made because the target machine actively refused it.
```

The older version worked ... until it self-updated.
The new Ollama (0.1.32) doesn't work in Docker either.
Chat in the console window sometimes works, but the serve command never does.

OS

Windows

GPU

AMD onboard

CPU

Ryzen 7000 Mobile APU

Ollama version

0.1.32 (non-docker)
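A quick way to confirm whether the server is actually up before digging further (a sketch; assumes `curl` is available, as it is on Windows 10 1803 and later):

```shell
:: If the Ollama server is listening, its root endpoint answers
:: with a plain-text banner ("Ollama is running").
curl http://127.0.0.1:11434/

:: Check whether anything at all is bound to the port.
netstat -ano | findstr :11434
```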

GiteaMirror added the bug and windows labels 2026-04-28 10:22:15 -05:00

@dcasota commented on GitHub (Apr 29, 2024):

+1 same issue


@dhiltgen commented on GitHub (May 1, 2024):

Please give the latest 0.1.33 RC build a try; hopefully your issue will be resolved. We've fixed a number of bugs related to Windows subprocess handling recently.

https://github.com/ollama/ollama/releases


@jannoname commented on GitHub (May 2, 2024):

Thanks for the feedback.

`ollama list` works now,
but the serve function still doesn't. The only app on the port is Ollama itself.
If I quit the Ollama app in the taskbar via right-click, its process is gone and I can start serve in the console.
But my web interface still can't connect to Ollama.


@dhiltgen commented on GitHub (May 2, 2024):

Ollama is a client-server architecture, and binds to a TCP port. Only one process can bind to a port at a time, so you can't run two servers on the same default port. You can run it on a different port with the [OLLAMA_HOST variable](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-expose-ollama-on-my-network), but I don't think that's what you're trying to do. If you're trying to run a web UI and it is failing to connect, this might be cross-origin related. Check out the [FAQ here](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-allow-additional-web-origins-to-access-ollama) for guidance.
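For reference, moving the server to a different port is a one-line change via the OLLAMA_HOST variable (a sketch; 11435 is an arbitrary example port):

```shell
:: Terminal 1: bind the server to a non-default port for this session.
set OLLAMA_HOST=127.0.0.1:11435
ollama serve

:: Terminal 2: point the client at the same address.
set OLLAMA_HOST=127.0.0.1:11435
ollama list
```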


@jannoname commented on GitHub (May 4, 2024):

I only use one instance. The problem didn't occur with older versions of Ollama; it started with 0.1.32 (Win 11 laptop). I tested it today on a Win 10 Pro desktop PC that never had Ollama installed, and hit the same port-binding problem with the newer version.


@dhiltgen commented on GitHub (May 4, 2024):

@jannoname I'm not quite understanding your scenario. If `ollama list` works, that requires the client to talk to the server to produce output. That indicates the server *IS* running on your system. It sounds like you are trying to run `ollama serve` again while the existing one is running, and that is expected to fail, as you can't have two servers running on the same port. If the system tray icon is present, that indicates the Ollama server is running. On Windows and macOS, we try to make the server run automatically.

If that doesn't help answer your question, can you elaborate a little more what you're trying to do? Can you check Task Manager for running Ollamas, and explain if the tray app is running or not?
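The Task Manager check above can also be done from the console (a sketch; 1234 is a placeholder for the PID reported by netstat):

```shell
:: List any running Ollama processes.
tasklist | findstr /i "ollama"

:: Find the PID of whatever is holding the default port...
netstat -ano | findstr :11434

:: ...and, if it is a stale Ollama server, stop it by PID.
taskkill /PID 1234 /F
```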


@dcasota commented on GitHub (May 4, 2024):

@dhiltgen On Windows, the user experience should be improved.

Here is a Windows 11 example with Docker Desktop, pulling the latest Ollama.

![image](https://github.com/ollama/ollama/assets/14890243/f5bad2d6-8793-44b5-a16d-30044d5b61c2)

The issue has been reported in https://github.com/ollama/ollama/issues/4146


@dhiltgen commented on GitHub (May 21, 2024):

I believe the problem has been resolved here on the latest version(s) of Ollama. @jannoname if you're still having problems, please share more information on what's going wrong.


@trivalik commented on GitHub (Oct 17, 2024):

I ran into this as well. The solution is to start Ollama with `ollama serve`, which hosts the server, then run your desired `ollama run <model name>`.
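The suggested sequence, spelled out (llama3 is only an example model name):

```shell
:: Terminal 1: host the server in the foreground. If another server is
:: already bound to 11434, this typically fails with a bind error instead.
ollama serve

:: Terminal 2: run a model against that server.
ollama run llama3
```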


@shiboys commented on GitHub (Feb 4, 2025):

> I ran into this as well. The solution is to start Ollama with `ollama serve`, which hosts the server, then run your desired `ollama run <model name>`.

This solved my problem on macOS. Thanks!


@krishenriksen commented on GitHub (Nov 22, 2025):

I ran into this issue today.

```
netstat -ano | findstr :11434
  TCP    0.0.0.0:11434    0.0.0.0:0    LISTENING    5064

tasklist /svc /FI "PID eq 5064"

Image Name                     PID Services
========================= ======== ============================================
svchost.exe                   5064 iphlpsvc
```

Apparently iphlpsvc was hogging the port, something I've never seen before.
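Two possible ways to deal with a Windows service squatting on the port (a sketch, assuming an elevated prompt; the OLLAMA_HOST value is an arbitrary example):

```shell
:: Check whether 11434 falls inside a reserved port range
:: (often Hyper-V related) before blaming a specific service.
netsh int ipv4 show excludedportrange protocol=tcp

:: Option 1: move Ollama to a free port instead of fighting over 11434.
setx OLLAMA_HOST 127.0.0.1:11435

:: Option 2: restart the offending service so it releases the port.
net stop iphlpsvc
net start iphlpsvc
```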

Reference: github-starred/ollama#48973