[GH-ISSUE #3575] Error: listen tcp 127.0.0.1:11434: bind: Only one usage of each socket address (protocol/network address/port) is normally permitted. #48719

Closed
opened 2026-04-28 09:08:04 -05:00 by GiteaMirror · 16 comments
Owner

Originally created by @Coder-Vishali on GitHub (Apr 10, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3575

What is the issue?

When I execute `ollama serve`, I get the error below:

Error: listen tcp 127.0.0.1:11434: bind: Only one usage of each socket address (protocol/network address/port) is normally permitted.

![image](https://github.com/ollama/ollama/assets/60731083/38b5ce1d-e039-4066-9fc0-7fa18b027722)

![image](https://github.com/ollama/ollama/assets/60731083/86c5173e-4a1b-4270-8c89-145f055b60dd)

Things I have tried:

  1. Restarted my machine
  2. Stopped and started the Ollama server
  3. Killed the process on the port using: `netstat -ano | findstr :<PORT>`, then `taskkill /PID <PID> /F`
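The bind error means something is already listening on 127.0.0.1:11434 (usually a previously started Ollama instance). A quick cross-platform check, sketched in Python — the port is Ollama's documented default; adjust it if you have changed `OLLAMA_HOST`:

```python
import socket

def port_in_use(host: str = "127.0.0.1", port: int = 11434) -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 when the connection succeeds,
        # i.e. a server is already bound to that address.
        return s.connect_ex((host, port)) == 0

if __name__ == "__main__":
    if port_in_use():
        print("Port 11434 is taken; another Ollama instance is likely running.")
    else:
        print("Port 11434 is free; `ollama serve` should be able to bind.")
```

If this reports the port as taken but `netstat` shows nothing, the listener may be running under another user account (e.g. a Windows service), which `netstat -ano` only reveals when run from an elevated prompt.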

What did you expect to see?

No response

Steps to reproduce

No response

Are there any recent changes that introduced the issue?

No response

OS

Windows

Architecture

x86

Platform

No response

Ollama version

No response

GPU

No response

GPU info

No response

CPU

No response

Other software

No response

GiteaMirror added the bug label 2026-04-28 09:08:04 -05:00

@muxixi727 commented on GitHub (Apr 11, 2024):

You can close the Ollama instance that is already running locally.


@4G3NTR0LLC4G3 commented on GitHub (Apr 19, 2024):

To close the "local" Ollama on Windows, go to the bottom right of the taskbar, click the up arrow, and quit Ollama from the tiny Ollama app icon in that tray menu. SO CONFUSING!

If you then run `ollama serve` again, it should work.


@cumtsd commented on GitHub (May 21, 2024):

I've got the same problem, and it is still unsolved.


@smspgh commented on GitHub (May 31, 2024):

There are two processes running when you launch the Ollama client on Windows: `ollama` and `ollama app`. One is the server that listens on the localhost endpoint at port 11434. The other, `ollama app`, will instantly restart the server on port 11434 if you kill only the first one. So kill both, or at least the `ollama app` process, and that should take care of the issue. Here is your shortcut: `Get-Process ollama* | Stop-Process -Force | ollama serve`
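The two-process situation described above can also be scripted outside PowerShell. A hedged Python sketch that builds the Windows `taskkill` invocations for both processes — the image names are taken from the comment above and may differ on your install:

```python
import subprocess
import sys

# Process image names assumed from the comment above; adjust if yours differ.
OLLAMA_PROCESSES = ["ollama app.exe", "ollama.exe"]

def build_kill_commands(names):
    """Build one taskkill invocation per named process.

    /IM selects by image name, /F forces termination, /T also kills
    child processes, so the tray app cannot respawn the server.
    """
    return [["taskkill", "/IM", name, "/F", "/T"] for name in names]

def stop_ollama():
    # taskkill exists only on Windows, so guard the actual invocation.
    if sys.platform != "win32":
        raise RuntimeError("taskkill is Windows-only")
    for cmd in build_kill_commands(OLLAMA_PROCESSES):
        subprocess.run(cmd, check=False)  # ignore "process not found" errors

if __name__ == "__main__":
    for cmd in build_kill_commands(OLLAMA_PROCESSES):
        print(" ".join(cmd))
```

Killing `ollama app.exe` first matters for the reason the comment gives: if only the server process dies, the tray app restarts it on port 11434 immediately.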


@Shoaib5136 commented on GitHub (Jun 10, 2024):

> You can close the Ollama instance that is already running locally.

Thanks, dude. I was searching for this for many hours.


@YuChenXin-ZJU commented on GitHub (Jul 28, 2024):

Are there any other ports I can use?


@smspgh commented on GitHub (Jul 28, 2024):

@YuChenXin-ZJU If you mean whether you can set a different port to call, then yes. Use `setx OLLAMA_HOST "127.0.0.1:YOUR_DESIRED_PORT" /M`, then just restart Ollama.
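`OLLAMA_HOST` is the environment variable the server reads at startup, and clients need to derive the matching endpoint from it. A minimal sketch of that derivation in Python — the fallback logic here is an assumption for illustration, not Ollama's exact parsing code:

```python
import os

def ollama_endpoint(env=None) -> str:
    """Derive the base URL a client should call from OLLAMA_HOST.

    Falls back to Ollama's documented default of 127.0.0.1:11434.
    NOTE: simplified sketch; Ollama's real parser also accepts
    scheme prefixes like http://, which this does not handle.
    """
    env = env if env is not None else os.environ
    host = env.get("OLLAMA_HOST", "127.0.0.1:11434")
    if ":" not in host:          # bare hostname: append the default port
        host = f"{host}:11434"
    return f"http://{host}"

if __name__ == "__main__":
    print(ollama_endpoint())
```

The `/M` flag on `setx` writes the variable machine-wide, which is why the comment above says a restart of Ollama is needed: the running process keeps its old environment.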


@PrithivJith commented on GitHub (Jul 30, 2024):

Hello, I had this same issue; a simple computer restart fixed it for me. Hope this helps :D


@EagleEye2010 commented on GitHub (Sep 28, 2024):

> To close the "local" Ollama on Windows, go to the bottom right of the taskbar, click the up arrow, and quit Ollama from the tiny Ollama app icon in that tray menu. SO CONFUSING!
>
> If you then run `ollama serve` again, it should work.

Saving my insanity.


@hakeemsalman commented on GitHub (Jan 22, 2025):

> To close the "local" Ollama on Windows, go to the bottom right of the taskbar, click the up arrow, and quit Ollama from the tiny Ollama app icon in that tray menu. SO CONFUSING!
>
> If you then run `ollama serve` again, it should work.

It's SOLVED, thank you so much.


@ESTAS-crypto commented on GitHub (Jan 31, 2025):

I also have almost the same problem: `Error: listen tcp 127.0.0.1:11434: bind: Only one usage of each socket address (protocol/network address/port) is normally permitted.`


@meqasim commented on GitHub (Feb 3, 2025):

> You can close the Ollama instance that is already running locally.

It works. Sometimes the solution is in front of us, but we are searching somewhere else.


@manav-a-thinkwik commented on GitHub (Feb 12, 2025):

> To close the "local" Ollama on Windows, go to the bottom right of the taskbar, click the up arrow, and quit Ollama from the tiny Ollama app icon in that tray menu. SO CONFUSING!
>
> If you then run `ollama serve` again, it should work.

Thanks, brother.



@Technorocker commented on GitHub (May 25, 2025):

What about when there is no local Ollama running and I'm trying to set up Ollama as a service, but the only error I keep getting in the logs is `Error: listen tcp 127.0.0.1:11434: socket: The requested service provider could not be loaded or initialized.`? I've checked using netstat and nothing is using that port.


@hemaranjani02 commented on GitHub (Jun 17, 2025):

Just go to Task Manager and end the background `ollama` / `ollama.exe` processes. It works.


@heyu58 commented on GitHub (Dec 21, 2025):

> To close the "local" Ollama on Windows, go to the bottom right of the taskbar, click the up arrow, and quit Ollama from the tiny Ollama app icon in that tray menu. SO CONFUSING!
>
> If you then run `ollama serve` again, it should work.

Best answer for a rookie just like me!!!

Reference: github-starred/ollama#48719