[GH-ISSUE #8261] Error: listen tcp 127.0.0.1:11434: bind: Only one usage of each socket address #51792

Closed
opened 2026-04-28 20:57:32 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @davincitr on GitHub (Dec 28, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/8261

What is the issue?

Hello, I have tried everything I found on the internet; I even reformatted my PC.

Error: listen tcp 127.0.0.1:11434: bind: Only one usage of each socket address (protocol/network address/port) is normally permitted. *(Windows error message, translated from Turkish.)*
![image](https://github.com/user-attachments/assets/0ab05a6b-ffaa-41e1-bc87-d70ad41a7747)

![image](https://github.com/user-attachments/assets/9f04a3ef-e92b-4c9a-a38d-4d772dc7002c)

![image](https://github.com/user-attachments/assets/b7005ccb-5f69-4740-b508-dd0eb6b6e804)

![image](https://github.com/user-attachments/assets/c263a6fd-39ac-4bcc-96ac-617124395f68)
I tried killing it from the taskbar too, and I tried using other ports. I also added one set to 0.0.0.0, but that didn't work either.

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.5.0 and 0.5.4

GiteaMirror added the "needs more info" and "bug" labels 2026-04-28 20:57:32 -05:00

@rick-github commented on GitHub (Dec 28, 2024):

You don't need to run `ollama serve`; ollama is already running as a service. If you want to run `ollama serve`, you need to stop the ollama service: right-click on the ollama icon in the toolbar and select "Quit Ollama".
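The underlying cause can be sketched with a minimal, Ollama-agnostic example: a TCP address/port pair can have only one listener at a time, so a second `ollama serve` fails while the background service already holds 127.0.0.1:11434. (The port below is chosen by the OS rather than hard-coding 11434, purely so the sketch runs anywhere.)

```python
import socket

# First listener: bind to an OS-chosen free port on loopback,
# standing in for the Ollama service already holding 127.0.0.1:11434.
first = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
first.bind(("127.0.0.1", 0))  # port 0 lets the OS pick a free port
first.listen()
port = first.getsockname()[1]

# Second listener on the same address/port, standing in for a
# manually launched `ollama serve`: the bind fails with OSError
# (EADDRINUSE on Unix; "Only one usage of each socket address"
# is the Windows wording of the same condition).
second = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
try:
    second.bind(("127.0.0.1", port))
except OSError as exc:
    print("second bind failed:", exc)
finally:
    second.close()
    first.close()
```

Quitting the tray app releases the port, after which a manual `ollama serve` binds successfully.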


@pdevine commented on GitHub (Dec 29, 2024):

@davincitr what does `ollama run llama3.2` do? As @rick-github mentioned, it looks like everything is running correctly.


@pdevine commented on GitHub (Jan 8, 2025):

I'll go ahead and close this as I think it's working correctly. Feel free to keep commenting.


@davincitr commented on GitHub (Jan 15, 2025):

@rick-github, thanks for your comment. It works, thanks!

Reference: github-starred/ollama#51792