[GH-ISSUE #8918] ollama.generate(model='deepseek-r1:1.5b', prompt='hello') httpcore.RemoteProtocolError: Server disconnected without sending a response. #5783

Closed
opened 2026-04-12 17:07:26 -05:00 by GiteaMirror · 2 comments

Originally created by @HZPHuangZePeng on GitHub (Feb 7, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/8918

What is the issue?

D:\Python310\python.exe D:\python-personal\lintcode\0206.py
Traceback (most recent call last):
File "D:\Python310\lib\site-packages\httpx_transports\default.py", line 101, in map_httpcore_exceptions
yield
File "D:\Python310\lib\site-packages\httpx_transports\default.py", line 250, in handle_request
resp = self._pool.handle_request(req)
File "D:\Python310\lib\site-packages\httpcore_sync\connection_pool.py", line 216, in handle_request
raise exc from None
File "D:\Python310\lib\site-packages\httpcore_sync\connection_pool.py", line 196, in handle_request
response = connection.handle_request(
File "D:\Python310\lib\site-packages\httpcore_sync\http_proxy.py", line 207, in handle_request
return self._connection.handle_request(proxy_request)
File "D:\Python310\lib\site-packages\httpcore_sync\connection.py", line 101, in handle_request
return self._connection.handle_request(request)
File "D:\Python310\lib\site-packages\httpcore_sync\http11.py", line 143, in handle_request
raise exc
File "D:\Python310\lib\site-packages\httpcore_sync\http11.py", line 113, in handle_request
) = self._receive_response_headers(**kwargs)
File "D:\Python310\lib\site-packages\httpcore_sync\http11.py", line 186, in _receive_response_headers
event = self._receive_event(timeout=timeout)
File "D:\Python310\lib\site-packages\httpcore_sync\http11.py", line 238, in _receive_event
raise RemoteProtocolError(msg)
httpcore.RemoteProtocolError: Server disconnected without sending a response.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "D:\python-personal\lintcode\0206.py", line 2, in
ollama.generate(model='deepseek-r1:1.5b',prompt='你好啊')
File "D:\Python310\lib\site-packages\ollama_client.py", line 242, in generate
return self._request(
File "D:\Python310\lib\site-packages\ollama_client.py", line 178, in _request
return cls(**self._request_raw(*args, **kwargs).json())
File "D:\Python310\lib\site-packages\ollama_client.py", line 118, in _request_raw
r = self._client.request(*args, **kwargs)
File "D:\Python310\lib\site-packages\httpx_client.py", line 825, in request
return self.send(request, auth=auth, follow_redirects=follow_redirects)
File "D:\Python310\lib\site-packages\httpx_client.py", line 914, in send
response = self._send_handling_auth(
File "D:\Python310\lib\site-packages\httpx_client.py", line 942, in _send_handling_auth
response = self._send_handling_redirects(
File "D:\Python310\lib\site-packages\httpx_client.py", line 979, in _send_handling_redirects
response = self._send_single_request(request)
File "D:\Python310\lib\site-packages\httpx_client.py", line 1014, in _send_single_request
response = transport.handle_request(request)
File "D:\Python310\lib\site-packages\httpx_transports\default.py", line 249, in handle_request
with map_httpcore_exceptions():
File "D:\Python310\lib\contextlib.py", line 153, in exit
self.gen.throw(typ, value, traceback)
File "D:\Python310\lib\site-packages\httpx_transports\default.py", line 118, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.RemoteProtocolError: Server disconnected without sending a response.

Process finished with exit code 1
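
For reference, the failing script reduces to the call below; the Chinese prompt 你好啊 in the traceback simply means "hello". A minimal sketch, assuming a default Ollama server on localhost:11434:

```python
import ollama

# Minimal reproduction of the call in the traceback above.
# The original prompt '你好啊' means 'hello'.
result = ollama.generate(model='deepseek-r1:1.5b', prompt='hello')
print(result['response'])
```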

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-04-12 17:07:26 -05:00

@rick-github commented on GitHub (Feb 7, 2025):

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) may aid in debugging.
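
(On a Windows setup like the one in the traceback, the server log can usually be read directly. A minimal sketch, assuming the default install location described in the linked troubleshooting doc:)

```python
import os

# Default Ollama server log location on Windows per the troubleshooting
# doc linked above; adjust the path if Ollama is installed elsewhere.
log_path = os.path.expandvars(r"%LOCALAPPDATA%\Ollama\server.log")
with open(log_path, encoding="utf-8", errors="replace") as f:
    print("".join(f.readlines()[-50:]))  # show the last 50 lines
```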


@cambrianlee commented on GitHub (Feb 9, 2025):

Are you running Python in a proxy environment?

I had the same issue as you. I tested the following curl command in the terminal:

curl localhost:11434/api/embed -d "{\"model\":\"nomic-embed-text\",\"input\":\"why is the sky blue?\"}"

It worked because curl does not use a proxy by default. However, Python code might be going through a proxy, which could be causing the issue.
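
A quick way to verify is to print the proxy-related environment variables the Python process actually inherits; if any are set, httpx (which the ollama client uses) will route requests through that proxy unless told otherwise:

```python
import os

# Print every proxy-related variable this Python process sees.
for var in ("http_proxy", "https_proxy", "all_proxy", "no_proxy",
            "HTTP_PROXY", "HTTPS_PROXY", "ALL_PROXY", "NO_PROXY"):
    print(f"{var}={os.environ.get(var)!r}")
```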

To fix this, I executed the following commands in the terminal:

export http_proxy=''
export https_proxy=''
export all_proxy=''

After doing this, my Python code worked as expected: no errors, and it successfully generated text using DeepSeek.

However, there's still an issue: the proxy apparently can't be fully disabled from within the code itself. I've tried multiple approaches, but none worked. This might be a bug in how the Ollama Python SDK handles proxies.

For example, I tried:

os.environ['ALL_PROXY'] = ''
os.environ['all_proxy'] = ''
os.environ['https_proxy'] = ''
os.environ['http_proxy'] = ''

or

os.environ.pop('ALL_PROXY', None)
os.environ.pop('all_proxy', None)
os.environ.pop('https_proxy', None)
os.environ.pop('http_proxy', None)

But the issue persists.
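
A plausible explanation, offered as a sketch rather than a confirmed diagnosis: the module-level ollama.generate() uses an httpx client that is created at import time, and httpx reads the proxy variables when the client is constructed, so clearing os.environ afterwards changes nothing. Two workarounds under that assumption:

```python
import os

# Option 1: clear the proxy variables *before* importing ollama, so the
# module-level client is constructed without them.
for var in ("HTTP_PROXY", "HTTPS_PROXY", "ALL_PROXY",
            "http_proxy", "https_proxy", "all_proxy"):
    os.environ.pop(var, None)

import ollama  # imported only after the environment is cleaned

# Option 2: build an explicit client that ignores proxy environment
# variables entirely. This assumes ollama.Client forwards extra keyword
# arguments to the underlying httpx.Client, where trust_env=False
# disables environment-based proxy lookup.
client = ollama.Client(host='http://localhost:11434', trust_env=False)
print(client.generate(model='deepseek-r1:1.5b', prompt='hello')['response'])
```

Alternatively, setting NO_PROXY=localhost,127.0.0.1 in the environment excludes only the local Ollama server from the proxy while leaving it in place for everything else.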


Reference: github-starred/ollama#5783