[GH-ISSUE #3089] Error when requesting ollama api from another pc (windows) #48408

Closed
opened 2026-04-28 08:07:19 -05:00 by GiteaMirror · 16 comments
Owner

Originally created by @insooneelife on GitHub (Mar 13, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3089

I plan to run Ollama on another PC and do my work from my current PC.
However, when I send a request to Ollama using that PC's IP address, I get no reply.
Can you tell me what the problem is?

request url
http://localhost:11434/api/chat -> http://172.168.10.1:11434/api/chat

GiteaMirror added the question label 2026-04-28 08:07:19 -05:00

@BruceMacD commented on GitHub (Mar 13, 2024):

What error do you see? Is the host accessible on that port?


@insooneelife commented on GitHub (Mar 13, 2024):

Ah, I think this is an issue on my side.
I suspect it is caused by our security software, since there is no ping response. Sorry.
I think this can be closed.


@insooneelife commented on GitHub (Mar 13, 2024):

The missing ping response turned out to be caused by the company's security software, so I thought it was my issue. However, even after solving that problem, I am unable to connect to Ollama on the other PC.

  1. ping test: OK
  2. curl test: failed

I suspected this was because Ollama is not on the firewall's allowed-apps list, so I added it as an allowed app through the Control Panel, but the problem was not resolved.

I also tried adding an inbound rule for the port in Windows Firewall, but that did not solve it either.


@bobir01 commented on GitHub (Mar 13, 2024):

When I ran it with Docker and exposed the port, it worked fine for me on Ubuntu 22.04.


@insooneelife commented on GitHub (Mar 13, 2024):

I'm working on Windows.
I have two Windows PCs, A and B.
I started Ollama on PC A.
From PC B, I tested the connection to A with
test-netconnection <IP> -port 11434
in Windows PowerShell, but it failed.

So, to check whether there was a networking problem between my PCs, I created a custom server, ran it on PC A, and connected to it with the same command:

test-netconnection <IP> -port 11434

And it works well.

>> test-netconnection <IP> -port 11434
..
RemotePort       : 11434
InterfaceAlias   : Ethernet
TcpTestSucceeded : True
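The same reachability check can also be sketched in Python (the helper name is mine, not from the thread); it simply attempts a TCP connect, which is what Test-NetConnection's TcpTestSucceeded reports:

```python
import socket

def tcp_port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout.

    Rough stand-in for PowerShell's Test-NetConnection TcpTestSucceeded.
    """
    try:
        # create_connection resolves the host and performs the TCP handshake
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and unreachable hosts
        return False
```

A True here only proves the port accepts TCP connections; a firewall that drops packets silently shows up as a timeout.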

Here's my C# server example.


using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

namespace SimpleServer
{
    class Program
    {
        static void Main(string[] args)
        {
            Int32 port = 11434;
            IPAddress localAddr = IPAddress.Any; // Listen on all IP addresses

            TcpListener server = new TcpListener(localAddr, port);
            server.Start();

            Console.WriteLine("Waiting for a connection... ");

            TcpClient client = server.AcceptTcpClient();
            Console.WriteLine("Connected!");

            NetworkStream stream = client.GetStream();

            string response = "Hello, client! Welcome to the server!";
            byte[] msg = Encoding.ASCII.GetBytes(response);
            stream.Write(msg, 0, msg.Length);
            Console.WriteLine("Sent: {0}", response);

            client.Close();
            server.Stop();

            Console.WriteLine("\nPress ENTER to continue...");
            Console.ReadLine();
        }
    }
}


What I suspect: when I run this server example on Windows, a dialog pops up asking whether to allow the app through the firewall, and it worked because I allowed it there. In Ollama's case, the firewall permission settings do not seem to take effect properly.


@pdevine commented on GitHub (Mar 13, 2024):

@insooneelife what did you set the OLLAMA_HOST variable to when starting ollama serve? It should be set to OLLAMA_HOST=0.0.0.0:11434


@insooneelife commented on GitHub (Mar 14, 2024):

> @insooneelife what did you set the OLLAMA_HOST variable to when starting ollama serve? It should be set to OLLAMA_HOST=0.0.0.0:11434

https://github.com/ollama/ollama/issues/703
I read this issue and tried this with powershell:

set OLLAMA_HOST=0.0.0.0:11434
ollama serve

I tested the connection the same way as before, but it didn't work.
test-netconnection <IP> -port 11434

And I don't understand how ollama serve is supposed to work. Only this log appears, and in Task Manager, ollama.exe is not running.

PS C:\WINDOWS\system32> set OLLAMA_HOST=0.0.0.0
PS C:\WINDOWS\system32> ollama serve
time=2024-03-14T10:51:28.018+09:00 level=INFO source=images.go:710 msg="total blobs: 5"
time=2024-03-14T10:51:28.110+09:00 level=INFO source=images.go:717 msg="total unused blobs removed: 0"
time=2024-03-14T10:51:28.115+09:00 level=INFO source=routes.go:1021 msg="Listening on 127.0.0.1:11434 (version 0.1.28)"
time=2024-03-14T10:51:28.115+09:00 level=INFO source=payload_common.go:107 msg="Extracting dynamic libraries..."
time=2024-03-14T10:51:28.778+09:00 level=INFO source=payload_common.go:146 msg="Dynamic LLM libraries [cuda_v11.3 cpu_avx cpu cpu_avx2]"


@qknight commented on GitHub (Mar 14, 2024):

@insooneelife make sure you can access the REST interface of Ollama.

In cmd.exe, do this (note: this does not work in PowerShell):

set OLLAMA_HOST=0.0.0.0
ollama serve
time=2024-03-14T13:49:11.157Z level=INFO source=images.go:710 msg="total blobs: 8"
time=2024-03-14T13:49:11.159Z level=INFO source=images.go:717 msg="total unused blobs removed: 0"
time=2024-03-14T13:49:11.160Z level=INFO source=routes.go:1021 msg="Listening on [::]:11434 (version 0.1.28)"
time=2024-03-14T13:49:11.160Z level=INFO source=payload_common.go:107 msg="Extracting dynamic libraries..."
time=2024-03-14T13:49:11.300Z level=INFO source=payload_common.go:146 msg="Dynamic LLM libraries [cuda_v11.3 cpu cpu_avx2 cpu_avx]"

This MUST read as:

msg="Listening on [::]:11434 (version 0.1.28)"

and not as:

msg="Listening on 127.0.0.1:11434 (version 0.1.28)"

Next, use curl to probe whether the port can be reached from the other machine.
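The distinction above can also be checked mechanically. A small sketch (the regex and helper name are mine, not part of Ollama) that classifies a "Listening on" log line:

```python
import re

def listens_on_all_interfaces(log_line: str) -> bool:
    """True if an 'ollama serve' log line shows a bind to all interfaces."""
    m = re.search(r'msg="Listening on (.+):\d+', log_line)
    if not m:
        return False
    # "[::]" (all IPv6/IPv4 interfaces) or "0.0.0.0" (all IPv4 interfaces)
    # accept remote clients; "127.0.0.1" is loopback-only.
    return m.group(1) in ("[::]", "0.0.0.0")
```

With the two log lines from this thread, the `[::]:11434` line passes and the `127.0.0.1:11434` line fails, matching qknight's rule.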


@insooneelife commented on GitHub (Mar 15, 2024):

@qknight
Okay! Thank you!
It works fine now.


@ItsNoted commented on GitHub (Mar 15, 2024):

I've been trying to get this working but keep getting this error:

[screenshot: powershell_ZYEGq8d116]

If I run them separately, it still listens on the local 127 address:

[screenshot: powershell_gSCgin1wB1]

Also, any idea why it wants to use my CPU and not my GPU? I have a 4060.


@wouterverduin commented on GitHub (Mar 16, 2024):

@qknight

Thank you so much for this help! I was running into the same issue and can now access it from my other devices.

However, when I close cmd.exe I can't connect anymore. Is there a way to make this persistent, so that when I start my PC it automatically serves at 0.0.0.0:11434 without my starting cmd.exe manually every time?


@qknight commented on GitHub (Mar 16, 2024):

@wouterverduin the easiest way is to set it as a global environment variable (see https://superuser.com/questions/949560/how-do-i-set-system-environment-variables-in-windows-10), then restart Ollama: click the Ollama icon in the task bar, exit it, and start it again from the Windows menu.


@wouterverduin commented on GitHub (Mar 16, 2024):

@qknight Thanks man; this was the cherry on the cake!

Now it serves on 0.0.0.0 every time Ollama starts. It would have been great if there were some configuration method at installation or something, but considering it is in preview, this works just fine!


@alexdibattista commented on GitHub (Mar 18, 2024):

> Been trying to get this working but keep getting this error
>
> [screenshot: powershell_ZYEGq8d116]
>
> If I run them separately, it still runs it on the local 127 address.
>
> [screenshot: powershell_gSCgin1wB1]
>
> Also, any idea why it wants to use my CPU and not my GPU? I have a 4060.

@ItsNoted In PowerShell you need to set the variable like this $env:OLLAMA_HOST = '0.0.0.0'
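For completeness, the fallback behavior being configured here can be illustrated with a small sketch (this is illustrative, not Ollama's actual code): an unset OLLAMA_HOST falls back to loopback, and a bare host like 0.0.0.0 gets the default port appended.

```python
import os

DEFAULT_PORT = "11434"

def resolve_bind_address(env=None) -> str:
    """Illustrative sketch of OLLAMA_HOST-style resolution (not Ollama's code)."""
    env = os.environ if env is None else env
    raw = env.get("OLLAMA_HOST", "").strip()
    if not raw:
        return f"127.0.0.1:{DEFAULT_PORT}"   # default: loopback only
    host, sep, port = raw.rpartition(":")
    if not sep:                              # no port given, e.g. "0.0.0.0"
        return f"{raw}:{DEFAULT_PORT}"
    return f"{host}:{port}"
```

This is why setting the variable in one shell (or with the wrong shell syntax) has no effect on an Ollama instance started elsewhere: the value is read from the server process's own environment at startup.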


@gustavoeenriquez commented on GitHub (May 13, 2024):

I had the same issue. I tried several combinations until I finally found out that the variable name is case sensitive, so it should be written as OLLAMA_HOST = '0.0.0.0'.


@pdevine commented on GitHub (May 13, 2024):

> Now it serves on 0.0.0.0 every time Ollama starts. It would have been great if there were some configuration method at installation or something, but considering it is in preview, this works just fine!

There is actually UX that @jmorganca was working on to allow you to do this from the tray icon, which will be nice.

Reference: github-starred/ollama#48408