[GH-ISSUE #13136] python not detecting ollama #55206

Closed
opened 2026-04-29 08:30:18 -05:00 by GiteaMirror · 20 comments
Owner

Originally created by @CStone6 on GitHub (Nov 18, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13136

### What is the issue?

I'm trying to use https://ollama.com/blog/web-search, but I get this error. I have a workaround, but it is really annoying to do, and my OLLAMA_HOST is set to 0.0.0.0.

### Relevant log output

```shell
Traceback (most recent call last):
  File "b:\neo\tools.py", line 8, in <module>
    response = chat(
               ^^^^^
  File "B:\neo\.venv\Lib\site-packages\ollama\_client.py", line 365, in chat
    return self._request(
           ^^^^^^^^^^^^^^
  File "B:\neo\.venv\Lib\site-packages\ollama\_client.py", line 189, in _request
    return cls(**self._request_raw(*args, **kwargs).json())
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "B:\neo\.venv\Lib\site-packages\ollama\_client.py", line 135, in _request_raw
    raise ConnectionError(CONNECTION_ERROR_MESSAGE) from None
ConnectionError: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible. https://ollama.com/download
```
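Editor's note: this `ConnectionError` means the client's TCP connection never succeeded. Before digging into the Python client, it can help to confirm from the same environment that something is actually listening where the client will connect. A minimal stdlib probe (`port_open` is my own helper, not part of ollama-python):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# The Ollama server listens on port 11434 by default; check the loopback
# address first, then whatever OLLAMA_HOST points at if it is set:
# print(port_open("127.0.0.1", 11434))
```

If the probe succeeds on 127.0.0.1 but the client still fails, the client is almost certainly connecting somewhere else (for example, to the value of OLLAMA_HOST).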

### OS

Windows

### GPU

AMD

### CPU

AMD

### Ollama version

0.12.11

GiteaMirror added the bug label 2026-04-29 08:30:18 -05:00

@rick-github commented on GitHub (Nov 18, 2025):

So, is ollama downloaded, running and accessible?


@CStone6 commented on GitHub (Nov 18, 2025):

> So, is ollama downloaded, running and accessible?

Yes, it still works in the CLI and in the app.


@rick-github commented on GitHub (Nov 18, 2025):

Then you'll have to provide more information to debug this. Can you duplicate this with a minimal standalone script? Some details about the environment you are running this in will also help.


@CStone6 commented on GitHub (Nov 18, 2025):

I am using Python 3.11.9, and with the default script from the ollama Python GitHub, I still get the error


@rick-github commented on GitHub (Nov 18, 2025):

What default script?


@CStone6 commented on GitHub (Nov 19, 2025):

from here https://github.com/ollama/ollama-python


@rick-github commented on GitHub (Nov 19, 2025):

Which script in particular?


@CStone6 commented on GitHub (Nov 19, 2025):

The first one.


@rick-github commented on GitHub (Nov 19, 2025):

```console
$ cat ./13136.py
from ollama import chat
from ollama import ChatResponse

response: ChatResponse = chat(model='gemma3', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
print(response['message']['content'])
# or access fields directly from the response object
print(response.message.content)
```

```console
$ python ./13136.py
Okay, let's break down why the sky is blue – it's a fascinating phenomenon called **Rayleigh scattering**. Here's the explanation:

...
*   How this affects different colors at different times of day?
```

@CStone6 commented on GitHub (Nov 20, 2025):

i run it and i get the error

<!-- gh-comment-id:3555274062 --> @CStone6 commented on GitHub (Nov 20, 2025): i run it and i get the error
Author
Owner

@rick-github commented on GitHub (Nov 20, 2025):

What error, exactly? And what's the output of:

```
ollama -v
ollama list
ollama run gemma3 hello
```

@CStone6 commented on GitHub (Nov 20, 2025):

```console
$ ollama -v
0.12.11

$ ollama run gemma3 hello
Hello there! How’s it going today? Is there anything you’d like to chat about, or were you just saying hello? 😊

Let me know if you need anything – I can answer questions, tell a story, help you brainstorm, or just listen.

$ ollama list
NAME                        ID              SIZE      MODIFIED
gpt-oss:20b                 aa4295ac10c3    13 GB     2 months ago
mistral-small3.2:latest     5a408ab55df5    15 GB     3 months ago
llama3.2:3b                 a80c4f17acd5    2.0 GB    3 months ago
nomic-embed-text:latest     0a109f422b47    274 MB    3 months ago
deepseek-r1:14b             c333b7232bdb    9.0 GB    3 months ago
gpt-oss:latest              f2b8351c629c    13 GB     3 months ago
gemma3n:e4b                 15cb39fd9394    7.5 GB    4 months ago
llava:latest                8dd30f6b0cb1    4.7 GB    4 months ago
qwen2.5vl:latest            5ced39dfa4ba    6.0 GB    5 months ago
phi4:latest                 ac896e5b8b34    9.1 GB    5 months ago
qwen3:14b                   7d7da67570e2    9.3 GB    6 months ago
gemma3:latest               a2af6cc3eb7f    3.3 GB    7 months ago
minicpm-v:latest            c92bfad01205    5.5 GB    7 months ago
qwen2.5:14b                 7cdf5a0187d5    9.0 GB    7 months ago
gemma3:12b                  6fd036cefda5    8.1 GB    8 months ago
gemma3:1b                   2d27a774bc62    815 MB    8 months ago
gemma3:27b                  30ddded7fba6    17 GB     8 months ago
qwq:latest                  cc1091b0e276    19 GB     8 months ago
qwen2.5-coder:latest        2b0496514337    4.7 GB    8 months ago
llama2:7b                   78e26419b446    3.8 GB    9 months ago
llava:13b                   0d0eb4d7f485    8.0 GB    10 months ago
deepseek-r1:32b             38056bbcbb2d    19 GB     10 months ago
llama3.1:latest             46e0c10c039e    4.9 GB    10 months ago
phi3:14b                    cf611a26b048    7.9 GB    10 months ago
llama2-uncensored:latest    44040b922233    3.8 GB    10 months ago
dolphin-mistral:latest      5dc8c5a2be65    4.1 GB    10 months ago
gemma2:27b                  53261bc9c192    15 GB     10 months ago
llama3.2:latest             a80c4f17acd5    2.0 GB    13 months ago
```

@rick-github commented on GitHub (Nov 20, 2025):

What error, exactly?


@CStone6 commented on GitHub (Nov 20, 2025):

```shell
Traceback (most recent call last):
  File "b:\neo\tools.py", line 8, in <module>
    response = chat(
               ^^^^^
  File "B:\neo\.venv\Lib\site-packages\ollama\_client.py", line 365, in chat
    return self._request(
           ^^^^^^^^^^^^^^
  File "B:\neo\.venv\Lib\site-packages\ollama\_client.py", line 189, in _request
    return cls(**self._request_raw(*args, **kwargs).json())
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "B:\neo\.venv\Lib\site-packages\ollama\_client.py", line 135, in _request_raw
    raise ConnectionError(CONNECTION_ERROR_MESSAGE) from None
ConnectionError: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible. https://ollama.com/download
```


@rick-github commented on GitHub (Nov 20, 2025):

That's not the same script. Line 8 is not `response = chat(`.


@CStone6 commented on GitHub (Nov 20, 2025):

I can’t use my computer right now but the error is the same


@rick-github commented on GitHub (Nov 20, 2025):

When you can use your computer, post the script you are trying to run and the exact error message when it fails.


@CStone6 commented on GitHub (Nov 20, 2025):

```python
from ollama import chat
from ollama import ChatResponse

response: ChatResponse = chat(model='gemma3', messages=[
  {
    'role': 'user',
    'content': 'Why is the sky blue?',
  },
])
print(response['message']['content'])
# or access fields directly from the response object
print(response.message.content)
```


@CStone6 commented on GitHub (Nov 20, 2025):

```shell
Traceback (most recent call last):
  File "b:\neo\tools.py", line 4, in <module>
    response: ChatResponse = chat(model='gemma3', messages=[
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "B:\neo\.venv\Lib\site-packages\ollama\_client.py", line 365, in chat
    return self._request(
           ^^^^^^^^^^^^^^
  File "B:\neo\.venv\Lib\site-packages\ollama\_client.py", line 189, in _request
    return cls(**self._request_raw(*args, **kwargs).json())
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "B:\neo\.venv\Lib\site-packages\ollama\_client.py", line 135, in _request_raw
    raise ConnectionError(CONNECTION_ERROR_MESSAGE) from None
ConnectionError: Failed to connect to Ollama. Please check that Ollama is downloaded, running and accessible. https://ollama.com/download
```


@CStone6 commented on GitHub (Nov 21, 2025):

When did I close this?

Reference: github-starred/ollama#55206