[GH-ISSUE #7286] httpcore.ConnectError: [WinError 10061] #66686

Closed
opened 2026-05-04 07:48:04 -05:00 by GiteaMirror · 8 comments
Owner

Originally created by @RXZAN on GitHub (Oct 21, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7286

What is the issue?

I'm running the Ollama service on a server. I ran into a problem running the following code from my local machine, which does not have Ollama installed.

My code:

```python
import os
os.environ["USER_AGENT"] = "MyCustomUserAgent/1.0"
os.environ['OLLAMA_API_KEY'] = 'none'
os.environ['OLLAMA_BASE_URL'] = 'http://10.4.(my_server_ip):11434/'

from langchain_ollama import ChatOllama

llm = ChatOllama(model='llama3.1:8b', temperature=0)
messages = [
    ("human", "Return the words Hello World!"),
]
for chunk in llm.stream(messages):
    print(chunk)
```

[Screenshot 2024-10-22 093552]

Problem:

```
Traceback (most recent call last):
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpx\_transports\default.py", line 72, in map_httpcore_exceptions
    yield
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpx\_transports\default.py", line 236, in handle_request
    resp = self._pool.handle_request(req)
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpcore\_sync\connection_pool.py", line 216, in handle_request
    raise exc from None
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpcore\_sync\connection_pool.py", line 196, in handle_request
    response = connection.handle_request(
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpcore\_sync\connection.py", line 99, in handle_request
    raise exc
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpcore\_sync\connection.py", line 76, in handle_request
    stream = self._connect(request)
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpcore\_sync\connection.py", line 122, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpcore\_backends\sync.py", line 205, in connect_tcp
    with map_exceptions(exc_map):
  File "D:\anaconda3\envs\RAG_extra\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(value)
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpcore\_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [WinError 10061] 由于目标计算机积极拒绝,无法连接。
```

The above exception was the direct cause of the following exception:

```
Traceback (most recent call last):
  File "D:\python_projects\Whole\text.py", line 13, in <module>
    for chunk in llm.stream(messages):
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\langchain_core\language_models\chat_models.py", line 420, in stream
    raise e
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\langchain_core\language_models\chat_models.py", line 400, in stream
    for chunk in self._stream(messages, stop=stop, **kwargs):
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\langchain_ollama\chat_models.py", line 665, in _stream
    for stream_resp in self._create_chat_stream(messages, stop, **kwargs):
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\langchain_ollama\chat_models.py", line 527, in _create_chat_stream
    yield from self._client.chat(
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\ollama\_client.py", line 80, in _stream
    with self._client.stream(method, url, **kwargs) as r:
  File "D:\anaconda3\envs\RAG_extra\Lib\contextlib.py", line 137, in __enter__
    return next(self.gen)
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpx\_client.py", line 880, in stream
    response = self.send(
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpx\_client.py", line 926, in send
    response = self._send_handling_auth(
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpx\_client.py", line 954, in _send_handling_auth
    response = self._send_handling_redirects(
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpx\_client.py", line 991, in _send_handling_redirects
    response = self._send_single_request(request)
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpx\_client.py", line 1027, in _send_single_request
    response = transport.handle_request(request)
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpx\_transports\default.py", line 235, in handle_request
    with map_httpcore_exceptions():
  File "D:\anaconda3\envs\RAG_extra\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(value)
  File "C:\Users\dell\AppData\Roaming\Python\Python312\site-packages\httpx\_transports\default.py", line 89, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: [WinError 10061] 由于目标计算机积极拒绝,无法连接。
```

(The Chinese message for WinError 10061 translates to: "No connection could be made because the target machine actively refused it.")

I can also curl http://10.4.(my_server_ip):11434 from the local machine.

[Screenshot of curl output]

How do I solve this problem?
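Since curl succeeds, a useful first diagnostic (a sketch, not part of the original thread; the function name and URL are illustrative) is to issue the same GET that curl does, but from Python's standard library, so the request runs in the same interpreter and environment as the failing script:

```python
import urllib.request

def ollama_reachable(base_url: str, timeout: float = 5.0) -> bool:
    """GET the server root (Ollama replies "Ollama is running") using only
    the standard library. If this fails while curl succeeds, the two are
    taking different paths (e.g. proxy settings) to the server."""
    try:
        with urllib.request.urlopen(base_url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

# Illustrative URL -- substitute the real server address:
# ollama_reachable("http://10.4.0.2:11434/")
```

If this returns False too, the problem is in the Python environment's networking (proxy environment variables, VPN split tunneling), not in langchain_ollama.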

OS

Linux, Windows

GPU

Nvidia

CPU

AMD

Ollama version

0.3.13

GiteaMirror added the python, bug labels 2026-05-04 07:48:04 -05:00
Author
Owner

@rick-github commented on GitHub (Oct 21, 2024):

It's difficult to debug a problem when the code is incomplete. Post the full content of the test script, 10.0.0.0/8 is private IP space.

Author
Owner

@RXZAN commented on GitHub (Oct 22, 2024):

That's all there is to the code above, but I run the Ollama service on my private server, and I want to call the server-side Ollama API from my local machine.

> It's difficult to debug a problem when the code is incomplete. Post the full content of the test script, 10.0.0.0/8 is private IP space.

Author
Owner

@rick-github commented on GitHub (Oct 22, 2024):

How are you routing packets from your local machine to the server in 10.0.0.0/8 space?

Author
Owner

@RXZAN commented on GitHub (Oct 22, 2024):

Sorry, I don't understand what you mean. I'm new to this and not very familiar with the topic.
I just want to use Ollama from a machine that doesn't have it installed, by calling another machine that does have Ollama installed.

> How are you routing packets from your local machine to the server in 10.0.0.0/8 space?

Author
Owner

@rick-github commented on GitHub (Oct 22, 2024):

Your server is in a private IP space and packets (internet connections) cannot be routed to it over the public internet. If the server is not on the same LAN as your local machine, there must be a mechanism to deliver those packets - VPN, proxy, etc. If curl works and your python script doesn't, it means that your python script is not using the same mechanism as curl. If the server is on the same LAN as your local machine, then something else is broken. A description of your network setup might shed some light.
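The "curl works but the script doesn't" situation above can be narrowed down with a raw TCP probe from Python (a diagnostic sketch, not part of the original thread; the address in the comment is illustrative). WinError 10061 / ECONNREFUSED specifically means a machine answered but nothing was listening on that port, whereas a routing failure would surface as a timeout or "unreachable" error:

```python
import socket

def can_connect(host: str, port: int, timeout: float = 3.0) -> bool:
    """Attempt a plain TCP connection and report success.

    ConnectionRefusedError (WinError 10061 on Windows) means the target
    host was reached but refused the connection -- i.e. no server is
    listening on that host:port combination.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Illustrative address -- substitute the real server IP:
# can_connect("10.4.0.2", 11434)
```

If this returns True from the local machine while the script still fails, the script is not actually connecting to that address, which points at a configuration problem rather than a network one.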

Author
Owner

@RXZAN commented on GitHub (Oct 22, 2024):

My local machine and server are on the same LAN. I was able to ping the server from my local machine, and the firewalls and ports on both the server and the local machine are open. What other network settings should I review?

> Your server is in a private IP space and packets (internet connections) cannot be routed to it over the public internet. If the server is not on the same LAN as your local machine, there must be a mechanism to deliver those packets - VPN, proxy, etc. If curl works and your python script doesn't, it means that your python script is not using the same mechanism as curl. If the server is on the same LAN as your local machine, then something else is broken. A description of your network setup might shed some light.

Author
Owner

@rick-github commented on GitHub (Nov 5, 2024):

```diff
--- 7286.py.orig        2024-11-05 02:30:01.765323596 +0100
+++ 7286.py     2024-11-05 02:30:12.158985343 +0100
@@ -1,7 +1,7 @@
 import os
 os.environ["USER_AGENT"] = "MyCustomUserAgent/1.0"
 os.environ['OLLAMA_API_KEY'] = 'none'
-os.environ['OLLAMA_BASE_URL'] = 'http://10.4.(my_server_ip):11434/'
+os.environ['OLLAMA_HOST'] = 'http://10.4.(my_server_ip):11434/'
 
 from langchain_ollama import ChatOllama
```
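The one-line rename works because the ollama Python client (which langchain_ollama wraps) reads the OLLAMA_HOST environment variable, not OLLAMA_BASE_URL. With the wrong variable set, the client fell back to its default of http://127.0.0.1:11434, and since no Ollama server runs on the local machine, Windows refused the connection with WinError 10061. A small sketch that approximates this resolution logic (the helper function and the 10.4.0.2 address are illustrative, not from the thread):

```python
import os

def resolve_ollama_host(default: str = "http://127.0.0.1:11434") -> str:
    """Approximate how the ollama Python client picks its server URL:
    it honors OLLAMA_HOST and otherwise falls back to localhost.
    OLLAMA_BASE_URL is simply ignored."""
    return os.environ.get("OLLAMA_HOST", default).rstrip("/")

os.environ.pop("OLLAMA_HOST", None)
# Setting the wrong variable leaves the client on localhost -> WinError 10061:
os.environ["OLLAMA_BASE_URL"] = "http://10.4.0.2:11434/"  # illustrative address
assert resolve_ollama_host() == "http://127.0.0.1:11434"

# Setting OLLAMA_HOST (before the client is constructed) fixes it:
os.environ["OLLAMA_HOST"] = "http://10.4.0.2:11434/"
assert resolve_ollama_host() == "http://10.4.0.2:11434"
```

Depending on the installed version, ChatOllama may also accept a base_url argument directly, which avoids relying on environment variables altogether; check the langchain_ollama documentation for your version.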
Author
Owner

@RXZAN commented on GitHub (Nov 6, 2024):

> ```diff
> --- 7286.py.orig        2024-11-05 02:30:01.765323596 +0100
> +++ 7286.py     2024-11-05 02:30:12.158985343 +0100
> @@ -1,7 +1,7 @@
>  import os
>  os.environ["USER_AGENT"] = "MyCustomUserAgent/1.0"
>  os.environ['OLLAMA_API_KEY'] = 'none'
> -os.environ['OLLAMA_BASE_URL'] = 'http://10.4.(my_server_ip):11434/'
> +os.environ['OLLAMA_HOST'] = 'http://10.4.(my_server_ip):11434/'
>  
>  from langchain_ollama import ChatOllama
> ```

It works, thanks!


Reference: github-starred/ollama#66686