[GH-ISSUE #5008] Can't connect from WSL Ubuntu to the Windows 11 host system #28928

Closed
opened 2026-04-22 07:29:11 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @PayteR on GitHub (Jun 12, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5008

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

Hi, I installed Ollama on a Windows 11 machine and want to access it from my WSL Ubuntu installation. I opened port `11434` on the host machine, and when I try `nc` it works:

```
nc -zv 172.23.16.1 11434
Connection to 172.23.16.1 11434 port [tcp/*] succeeded!
```

but when I try `wget`, it fails:

```
wget 172.23.16.1:11434 -v
--2024-06-12 23:56:14--  http://172.23.16.1:11434/
Connecting to 172.23.16.1:11434... connected.
HTTP request sent, awaiting response... No data received.
Retrying.

--2024-06-12 23:56:15--  (try: 2)  http://172.23.16.1:11434/
Connecting to 172.23.16.1:11434... connected.
HTTP request sent, awaiting response... Read error (Connection reset by peer) in headers.
Retrying.

--2024-06-12 23:56:17--  (try: 3)  http://172.23.16.1:11434/
Connecting to 172.23.16.1:11434... connected.
HTTP request sent, awaiting response... No data received.
Retrying.

--2024-06-12 23:56:20--  (try: 4)  http://172.23.16.1:11434/
Connecting to 172.23.16.1:11434... connected.
HTTP request sent, awaiting response... No data received.
Retrying.
```

It keeps hitting that error and I'm unable to reach the API from Ubuntu. I tried disabling the firewall, but that didn't help. Thanks for any advice.
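This pattern — a raw TCP connect succeeds while an HTTP request gets no reply — means something accepts the handshake but never speaks HTTP (e.g. a stray port proxy, or a server bound to a different interface). A minimal, Ollama-independent Python sketch that reproduces the same failure mode against a dummy listener:

```python
import socket
import threading

def close_after_read(srv):
    # Accept one connection, read the request, then close without
    # replying -- mimicking a forwarder that completes the TCP
    # handshake but never produces an HTTP response.
    conn, _ = srv.accept()
    conn.recv(4096)
    conn.close()

srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
threading.Thread(target=close_after_read, args=(srv,), daemon=True).start()

# Step 1: the nc -zv equivalent -- a bare TCP connect succeeds.
probe = socket.create_connection(("127.0.0.1", port), timeout=5)
print("TCP connect: ok")

# Step 2: the wget equivalent -- send a request and wait for a reply.
probe.sendall(b"GET / HTTP/1.0\r\n\r\n")
try:
    reply = probe.recv(4096)          # b"" when the peer closes silently
except ConnectionResetError:
    reply = b""                       # wget reports this as "Connection reset by peer"
probe.close()
print("HTTP reply:", reply.decode() if reply else "(none: peer closed without sending data)")
```

So a successful `nc -zv` only proves the port is open, not that the Ollama server itself is answering on it.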

OS

Linux, Windows

GPU

Nvidia

CPU

AMD

Ollama version

0.1.42

GiteaMirror added the wsl, needs more info and windows labels 2026-04-22 07:29:11 -05:00
Author
Owner

@JerrettDavis commented on GitHub (Jun 12, 2024):

By default, Ollama doesn't listen on all interfaces. Try setting the `OLLAMA_HOST` environment variable to `0.0.0.0` to instruct it to listen on all interfaces, if you haven't already.
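One way to do that on the Windows host (a sketch; `setx` is the standard Windows tool for persisting environment variables, and the need to relaunch is an assumption about when Ollama reads the variable):

```shell
# On the Windows host, persist the variable for future processes
# (setx does not affect already-running programs):
setx OLLAMA_HOST 0.0.0.0

# Then quit and relaunch the Ollama app so the server re-reads
# OLLAMA_HOST and binds to all interfaces instead of loopback only.
```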

Author
Owner

@PayteR commented on GitHub (Jun 13, 2024):

> By default, Ollama doesn't listen on all interfaces. Try setting the `OLLAMA_HOST` environment variable to `0.0.0.0` to instruct it to listen on all interfaces, if you haven't already.

Hmm, then I get another error and it doesn't even run:

```
ollama run llama3
Error: Head "http://0.0.0.0:11434/": read tcp 127.0.0.1:56139->127.0.0.1:11434: wsarecv: An existing connection was forcibly closed by the remote host.
```
Author
Owner

@JerrettDavis commented on GitHub (Jun 13, 2024):

@PayteR can you check the Ollama server logs and post them? That error would seem to indicate that the server isn't running as expected.

Author
Owner

@dhiltgen commented on GitHub (Jun 18, 2024):

@PayteR you'll need different `OLLAMA_HOST` settings for the client and server: `0.0.0.0` works for a server to tell it to bind to all IPv4 addresses/interfaces on the system, but it is ambiguous for a client, which needs a concrete IP to connect to.

Be aware that binding the server to `0.0.0.0` does expose it on your network, so other machines on the same subnet can access the server (depending on how your firewalls are configured). You may want to bind it to a specific address on your host, and then use that address within WSL for the client to connect to.
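A sketch of that server/client split (the `172.23.16.1` address is the WSL-facing host address from the report above; substitute your own):

```shell
# Windows host (server side): bind Ollama to the WSL-facing address
# instead of all interfaces, then restart the Ollama app.
setx OLLAMA_HOST 172.23.16.1

# Inside WSL (client side): tell the ollama CLI which server to use.
export OLLAMA_HOST=172.23.16.1:11434
ollama list    # should now reach the server on the Windows host
```

This keeps the server off your other network interfaces while still reachable from WSL.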

Author
Owner

@PayteR commented on GitHub (Jun 20, 2024):

Thank you guys for the help. I had multiple issues here, and the last one was a proxy I had configured via `netsh`, which I needed to remove as described here: https://github.com/ollama/ollama/issues/2560#issuecomment-1950690705

Setting `OLLAMA_HOST=0.0.0.0` was definitely the first thing (and, in my case, the only thing) I should have configured, and then everything would have worked.
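Assuming the proxy in question was a `netsh interface portproxy` rule shadowing port 11434 (an assumption; the linked comment has the specifics), such rules can be inspected and removed from an elevated Windows prompt:

```shell
# List any v4-to-v4 port-forwarding rules currently configured:
netsh interface portproxy show v4tov4

# Delete a rule on port 11434 (listenaddress must match the value
# shown by the command above):
netsh interface portproxy delete v4tov4 listenport=11434 listenaddress=0.0.0.0
```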


Reference: github-starred/ollama#28928