[GH-ISSUE #2540] Error: listen tcp 127.0.0.1:11434 in windows #27248

Closed
opened 2026-04-22 04:24:55 -05:00 by GiteaMirror · 5 comments

Originally created by @razvanab on GitHub (Feb 16, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2540

I get this error in the Windows Ollama preview when I try to run "ollama serve":

Error: listen tcp 127.0.0.1:11434: bind: Only one usage of each socket address (protocol/network address/port) is normally permitted.
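
This bind error means another process is already listening on 127.0.0.1:11434; on Windows that is usually the Ollama tray app, which starts its own server at login. A quick way to confirm from a command prompt, a sketch using the stock netstat and tasklist tools (the PID 4321 below is hypothetical):

:: Find the PID of whatever is listening on port 11434
netstat -ano | findstr :11434
:: Example output: TCP 127.0.0.1:11434 0.0.0.0:0 LISTENING 4321
:: Look up the process name for that PID
tasklist /FI "PID eq 4321"

If you actually want a second server on another port, ollama serve also honors the OLLAMA_HOST environment variable:

set OLLAMA_HOST=127.0.0.1:11435
ollama serve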

@joostshao commented on GitHub (Feb 16, 2024):

me too

@razvanab commented on GitHub (Feb 16, 2024):

Ok, I think I got it.
Ollama is already running in the background as a server on Windows at http://localhost:11434 (see the tray bar).
Just put that address in your browser and you'll see that it's running.
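
You can check the same thing from a command prompt; a minimal sketch, assuming curl is available (it ships with recent Windows 10/11 builds):

:: The root endpoint returns a plain-text status
curl http://localhost:11434/
:: Prints: Ollama is running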

@Zxyy-mo commented on GitHub (Mar 18, 2024):

Thanks, brother.

@JerryOver commented on GitHub (Apr 5, 2024):

I got this problem. How do you exit Ollama?
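
One way on Windows is the tray icon (right-click the Ollama icon and choose Quit). From a command prompt you can also kill the processes; a sketch assuming the default executable names ("ollama app.exe" for the tray app, "ollama.exe" for the server; these may vary by version):

:: Stop the tray application
taskkill /F /IM "ollama app.exe"
:: Stop any leftover server process
taskkill /F /IM ollama.exe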

@JerryOver commented on GitHub (Apr 5, 2024):

So I exited Ollama from the taskbar, and "http://localhost:11434/" was no longer reachable (it had been showing the message "Ollama is running"), so it was off. Then I started it from my command prompt with "C:\Users\hinso>ollama serve" and got this output:

time=2024-04-05T07:13:53.844-05:00 level=INFO source=images.go:804 msg="total blobs: 6"
time=2024-04-05T07:13:53.851-05:00 level=INFO source=images.go:811 msg="total unused blobs removed: 0"
time=2024-04-05T07:13:53.852-05:00 level=INFO source=routes.go:1118 msg="Listening on 127.0.0.1:11434 (version 0.1.30)"
time=2024-04-05T07:13:53.881-05:00 level=INFO source=payload_common.go:113 msg="Extracting dynamic libraries to C:\Users\hinso\AppData\Local\Temp\ollama1623166627\runners ..."
time=2024-04-05T07:13:54.024-05:00 level=INFO source=payload_common.go:140 msg="Dynamic LLM libraries [cpu_avx cpu rocm_v5.7 cpu_avx2 cuda_v11.3]"
[GIN] 2024/04/05 - 07:14:08 | 200 | 0s | 127.0.0.1 | GET "/"

I checked the taskbar; Ollama isn't open there, but "http://localhost:11434/" is available again, saying "Ollama is running". I just want to get back to chatting with it in my command prompt, but I can't seem to get back there. I closed the command prompt because I thought I could reopen it without a model loaded. I was trying to install something for ComfyUI.
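
The server (whether from the tray or from "ollama serve") only exposes the API; the chat prompt comes from running a model in a second command prompt while the server is up. A sketch, with llama2 standing in for whichever model was pulled earlier:

:: In another command prompt, with the server already running
ollama run llama2
:: Type messages at the >>> prompt; /bye leaves the chat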
