[GH-ISSUE #707] 127.0.0.1:11434: bind: address already in use #26087

Closed
opened 2026-04-22 02:02:13 -05:00 by GiteaMirror · 34 comments
Owner

Originally created by @Nivek92 on GitHub (Oct 5, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/707

When I run ollama serve I get

Error: listen tcp 127.0.0.1:11434: bind: address already in use

After checking what's running on the port with sudo lsof -i :11434

I see that ollama is already running

ollama 2233 ollama 3u IPv4 37563 0t0 TCP localhost:11434 (LISTEN)

I killed the process and ran the serve command again and got the same error. So it seems that it tries to start the server twice.

GiteaMirror added the question label 2026-04-22 02:02:13 -05:00

@BruceMacD commented on GitHub (Oct 5, 2023):

Is this on Mac or Linux?

On Mac the app (running in the toolbar) will automatically restart the server when it stops. Exit the toolbar app to stop the server.

On Linux the Ollama server is added as a system service. To stop it you can run $ systemctl stop ollama.
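
The two cases above can be sketched as a small helper that only prints the appropriate stop command for the platform (a hedged sketch: the `stop_hint` name is made up, it assumes the service/formula is called ollama, and it deliberately prints rather than runs anything):

```shell
#!/bin/sh
# Sketch: print the command that would stop the background Ollama server
# on this platform. It prints a hint only; it does not stop anything itself.
stop_hint() {
  if command -v systemctl >/dev/null 2>&1; then
    # Linux: the install script registers Ollama as a systemd service
    echo "sudo systemctl stop ollama"
  elif command -v brew >/dev/null 2>&1; then
    # macOS with a Homebrew install
    echo "brew services stop ollama"
  else
    # macOS app install: the toolbar app restarts the server, so quit the app
    echo "quit the toolbar app"
  fi
}
stop_hint
```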


@xyproto commented on GitHub (Oct 6, 2023):

brew services restart ollama might also be helpful, if on macOS.


@orpic commented on GitHub (Oct 8, 2023):

Same happened with me, but only after the initial install.
After running systemctl stop ollama, it worked fine from then on.


@Nivek92 commented on GitHub (Oct 9, 2023):

Yeah, it's on Linux, Ubuntu specifically. It's not really a problem since the service is working, but it's very confusing on the initial install: you don't expect the service to already be running, so it looks like another service is blocking the port.


@Yadheedhya06 commented on GitHub (Oct 21, 2023):

https://github.com/jmorganca/ollama/pull/872


@technovangelist commented on GitHub (Dec 4, 2023):

It looks like this issue was solved already as per the linked issue as well as Bruce's comment. I will go ahead and close it now. If you think there is anything we left out, reopen and we can address. Thanks for being part of this great community.


@ibehnam commented on GitHub (Dec 11, 2023):

I'm on macOS and still have the issue even after doing brew services restart ollama. I used to see ollama's icon in menu bar but not anymore.


@DoLife commented on GitHub (Dec 12, 2023):

I'm having the same issue, but I'm on Windows using Docker.


@xyproto commented on GitHub (Dec 12, 2023):

@ibehnam @DoLife Are you sure you are not running Ollama twice? Have you also tried checking with docker ps that Ollama is not already running within Docker?


@buts101 commented on GitHub (Dec 22, 2023):

Disable firewalld or any other IP filtering.


@ClaudeRobbinCR commented on GitHub (Dec 23, 2023):

@technovangelist Actually, I was just running it on WSL2 ubuntu 22.04, and it showed the same error.


@shivrajjadhav733 commented on GitHub (Jan 8, 2024):

OS - Apple M1 Pro chip

I tried to install ollama on my machine. Installation was successful. I can see the Ollama icon in the menu bar at the top.

When I try to run a model using the command

ollama run laama2
or
ollama run mistral

I get the attached error: operation timed out.

![BC48A4D0-AA86-41AA-B611-961467611449](https://github.com/jmorganca/ollama/assets/35407279/6532b007-2626-4f4a-87fc-499aa96e9d2f)

I tried to run brew services restart ollama and got an error saying: Error: Formula 'ollama' is not installed.

How do I fix the errors and run models using ollama?


@inteligenciamilgrau commented on GitHub (Feb 5, 2024):

For me it's happening because ollama was already running.

try:

systemctl status ollama

And you will get something telling you it's running (active):

    ollama.service - Ollama Service
    Loaded: loaded (/etc/systemd/system/ollama.service; enabled; vendor preset: enabled)
    Active: active (running) since Mon 2024-02-05 08:33:09 -03; 25min ago
    Main PID: 3525 (ollama)
    Tasks: 51 (limit: 9459)
    Memory: 4.6G
    CGroup: /system.slice/ollama.service
            └─3525 /usr/local/bin/ollama serve

To start/stop this service use:

systemctl stop ollama
systemctl start ollama


@lovincyrus commented on GitHub (Feb 15, 2024):

If you're still encountering this on mac,

lsof -i :11434
kill <PID>

@HPUhushicheng commented on GitHub (Mar 19, 2024):

ubuntu/debian
sudo apt update
sudo apt install lsof
lsof -i :11434
kill <PID>
ollama serve

centos
sudo yum update
sudo yum install lsof
lsof -i :11434
kill <PID>
ollama serve


@abdurahmanadilovic commented on GitHub (Apr 1, 2024):

when I run lsof -i :11434 I get no results, but if I do ollama serve I still get the address already in use error


@DeepNeurons commented on GitHub (Apr 2, 2024):

I encountered the same issue on Ubuntu 20.04.5 LTS.
Solution: export OLLAMA_HOST=localhost:8888
Source: https://pypi.org/project/ollamac/
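
A sketch of that workaround (port 8888 is just an example; any free port works, and OLLAMA_HOST must be set in every shell that talks to this server):

```
# Run the server on an alternate port instead of the default 11434.
export OLLAMA_HOST=localhost:8888
ollama serve &

# Client commands in the same shell read OLLAMA_HOST too,
# so they talk to the server on the new port.
ollama list
```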


@Bayout commented on GitHub (Apr 3, 2024):

@DeepNeurons Thanks brother


@Tchez commented on GitHub (Apr 6, 2024):

> When I run ollama serve I get
>
> Error: listen tcp 127.0.0.1:11434: bind: address already in use
>
> After checking what's running on the port with sudo lsof -i :11434
>
> I see that ollama is already running
>
> ollama 2233 ollama 3u IPv4 37563 0t0 TCP localhost:11434 (LISTEN)
>
> I killed the process and ran the serve command again and got the same error. So it seems that it tries to start the server twice.

Try this:

    sudo systemctl stop ollama

> I wasn't able to do it without 'sudo'


@svr123456789 commented on GitHub (Apr 18, 2024):

Looks like you need to add
Environment="OLLAMA_HOST=0.0.0.0:11434"
to /etc/systemd/system/ollama.service
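
One way to sketch that change without editing the unit file in place is a systemd drop-in (a hedged sketch: `sudo systemctl edit ollama` creates the override file below; the address is the one from the comment above):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
```

After saving, apply it with sudo systemctl daemon-reload && sudo systemctl restart ollama.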


@AmmariAbdelmounaim commented on GitHub (Apr 29, 2024):

Ran into a bit of a snag with WSL where I was still getting errors even after stopping ollama using systemctl stop ollama. Turns out the listener port was still active. Here's a quick fix I found that worked for me on Windows:

  • Fire up your Command Prompt as admin and run the following to grab the PID of the problematic port: netstat -aon | findstr :11434
  • With the PID in hand, kill the process: taskkill /F /PID <PID> (replace <PID> with your actual process ID).
  • After that's done, just restart ollama: ollama serve


@Pathsis commented on GitHub (Apr 30, 2024):

> when I run lsof -i :11434 I get no results, but if I do ollama serve I still get the address already in use error

I have the same problem.


@Pathsis commented on GitHub (Apr 30, 2024):

> Ran into a bit of a snag with WSL where I was still getting errors even after stopping ollama using systemctl stop ollama. Turns out the listener port was still active. Here's a quick fix I found that worked for me on Windows:
>
> Fire up your Command Prompt as admin and run the following to grab the PID of the problematic port: netstat -aon | findstr :11434. With the PID in hand, kill the process: taskkill /F /PID <PID>. Make sure to replace <PID> with your actual process ID. After that's done, just restart ollama: ollama serve.

I killed all the PIDs to the point where running lsof -i :11434 gives no output, but running ollama serve still gives me this error.


@BruceMacD commented on GitHub (May 1, 2024):

Hey all, not seeing ollama in the output of lsof could be a permissions issue. When you install Ollama on Linux via the install script, it creates a service user for the background process. You may need to stop the process via systemctl in that case.

Here are some troubleshooting steps that will hopefully help:

  • Stop the background service: sudo systemctl stop ollama
  • Run lsof as sudo to rule out permissions issues: sudo lsof -i :11434
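
The port check can be sketched as a small POSIX-shell helper (a hedged sketch: the `port_in_use` name is made up, and it assumes either lsof or ss is installed):

```shell
#!/bin/sh
# Sketch: succeed if something is listening on the given TCP port.
# Note: without sudo, lsof may not see a listener owned by another user
# (such as the ollama service user) -- that is the permissions issue above.
port_in_use() {
  port="$1"
  if command -v lsof >/dev/null 2>&1; then
    lsof -nP -iTCP:"$port" -sTCP:LISTEN >/dev/null 2>&1
  else
    ss -ltn 2>/dev/null | grep -q ":$port "
  fi
}

if port_in_use 11434; then
  echo "port 11434 is in use"
else
  echo "port 11434 is free"
fi
```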

@RichardScottOZ commented on GitHub (May 9, 2024):

That seems to have worked, thanks Bruce.


@justingolden21 commented on GitHub (May 11, 2024):

Same problem here (win10)

![image](https://github.com/ollama/ollama/assets/30274440/5b673cac-aec8-44aa-bc51-5326cb06d394)

I only run ollama through the command prompt and it doesn't let me kill the process

![image](https://github.com/ollama/ollama/assets/30274440/2b166fb3-3ee0-4c03-96b9-94c9e53d8a8f)


@fcalabrow commented on GitHub (May 23, 2024):

On Windows, make sure the Ollama app is not already running. Right-click its tray icon and quit it, then run it from the terminal with ollama serve.

![image](https://github.com/ollama/ollama/assets/87214888/4b18497a-45a8-4edc-bade-84f51b393a86)


@paralyser commented on GitHub (Jun 7, 2024):

Fresh install.
System: Synology.
Installed with both Docker and Portainer.
Wiped/restarted/updated/reinstalled several times. Tried around 10 different ports (8888, 8778, 11434, 11535, 11636, 11433, etc.), but every port gives the same error.
Specified the port (OLLAMA_HOST=0.0.0.0:11434, and likewise for each port) and also changed 0.0.0.0 to 127.0.0.1. Same error:
Error: listen tcp 0.0.0.0:11434: bind: address already in use.
Have no idea how to fix it.


@SebastianoF commented on GitHub (Jul 9, 2024):

On Mac the first install was naughty for me.

Even with the app quit (from the top bar), the port was in use and the process was re-created immediately after killing it with kill <PID>.

So:

  • I started the ollama app again, double-clicking it in the applications list.
  • I quit it again via the right-click icon, and the problem solved itself (I also disabled the app from running at startup in the system settings, for next time).

Now the ollama CLI from the terminal works as expected, with the app still quit.


@loveplay1983 commented on GitHub (Feb 20, 2025):

> When I run ollama serve I get
>
> Error: listen tcp 127.0.0.1:11434: bind: address already in use
>
> After checking what's running on the port with sudo lsof -i :11434
>
> I see that ollama is already running
>
> ollama 2233 ollama 3u IPv4 37563 0t0 TCP localhost:11434 (LISTEN)
>
> I killed the process and ran the serve command again and got the same error. So it seems that it tries to start the server twice.

On Ubuntu, Ollama is typically installed as a systemd service that starts automatically at system startup. This means the Ollama API server is already running in the background, so you generally don't need to run ollama serve manually. If you do, the system complains with "127.0.0.1:11434: bind: address already in use."

To solve the issue, we can define a custom port in the service file as we do above. However, commands like ollama list will complain after we set up a custom port since they are bound to the default 11434. We can explicitly export the environment variable OLLAMA_HOST and use the same port defined in the service file. For example, export OLLAMA_HOST=0.0.0.0:11450 # 11450 is the custom port defined in the service file. Then, the command ollama list will work again.


@Jaykumaran commented on GitHub (Feb 24, 2025):

For Ubuntu

Step 1. sudo nano /etc/systemd/system/ollama.service

Add Environment="OLLAMA_HOST=0.0.0.0:11434"

Step 2: source ~/.bashrc

Step 3: systemctl stop ollama

If a warning pops up like:
Warning: The unit file, source configuration file or drop-ins of ollama.service changed on disk. Run 'systemctl daemon-reload' to reload units.

Finally,

Step 4: systemctl daemon-reload

It should work now,

ollama serve


@Haseeeb21 commented on GitHub (Mar 19, 2025):

> For Ubuntu
>
> Step 1. sudo nano /etc/systemd/system/ollama.service
>
> Add Environment="OLLAMA_HOST=0.0.0.0:11434"
>
> Step 2: source ~/.bashrc
>
> Step 3: systemctl stop ollama
>
> If a warning pops up like: Warning: The unit file, source configuration file or drop-ins of ollama.service changed on disk. Run 'systemctl daemon-reload' to reload units.
>
> Finally,
>
> Step 4: systemctl daemon-reload
>
> It should work now,
>
> ollama serve

Worked, thanks!


@eniodev commented on GitHub (Aug 31, 2025):

this worked for me https://github.com/ollama/ollama/issues/2194


@ghost commented on GitHub (Nov 22, 2025):

This model is the best to work for Bittensor.com

Reference: github-starred/ollama#26087