[GH-ISSUE #5097] Ollama Offline Connection Failed Using Open-webui (even though Ollama itself works in Command Prompt) #52530

Closed
opened 2026-05-05 13:37:42 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @john911f on GitHub (Sep 2, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/5097

In an offline environment (ethernet cable unplugged):

  • Ollama is running on http://localhost:11434/
  • Ollama responds inside the Command Prompt
  • Ollama is NOT reachable from open-webui (specifically, llama models are NOT available)

In an online environment (ethernet cable plugged):

  • Ollama is reachable from open-webui (specifically, llama models ARE available)

I am running Open WebUI manually in a Python environment, not through Docker.

Please help. Thanks a million!
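Not part of the original report, but a minimal way to check the symptom the log below points at: the repeated `getaddrinfo failed` errors are a hostname-resolution failure for `localhost`, not a refused connection to a running server. This sketch (hypothetical helper name) tests whether the OS can resolve each address the way aiohttp does before connecting:

```python
import socket

def can_resolve(host: str, port: int = 11434) -> bool:
    """Return True if the OS can turn host:port into a socket address."""
    try:
        socket.getaddrinfo(host, port)
        return True
    except socket.gaierror:
        # This is the same failure aiohttp surfaces as "getaddrinfo failed".
        return False

# The numeric loopback address never needs a name lookup, so it should
# resolve even with the ethernet cable unplugged.
print(can_resolve("127.0.0.1"))

# Whether "localhost" resolves offline depends on the hosts file and the
# resolver; if this prints False while offline, that matches the log below.
print(can_resolve("localhost"))
```

If `localhost` fails to resolve only while offline, the machine is likely delegating the `localhost` lookup to DNS instead of the hosts file.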

Discussed in https://github.com/open-webui/open-webui/discussions/1901

Originally posted by start-life May 2, 2024
Ollama offline connection fails; when the machine is online, it connects.

INFO: Started server process [8800]
INFO: Waiting for application startup.
INFO:apps.litellm.main:start_litellm_background
INFO:apps.litellm.main:run_background_process
INFO:apps.litellm.main:Executing command: ['litellm', '--port', '14365', '--host', '127.0.0.1', '--telemetry', 'False', '--config', 'C:\\zxcv\\open-webui\\backend\\data/litellm/config.yaml']
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
INFO:apps.litellm.main:Subprocess started successfully.
INFO: 127.0.0.1:51747 - "GET /api/config HTTP/1.1" 200 OK
INFO: 127.0.0.1:51748 - "GET /manifest.json HTTP/1.1" 200 OK
INFO: 127.0.0.1:51748 - "GET /api/v1/auths/ HTTP/1.1" 200 OK
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO: 127.0.0.1:51748 - "GET /ollama/api/tags HTTP/1.1" 200 OK
INFO:apps.openai.main:get_all_models()
INFO:apps.openai.main:get_all_models()
INFO: 127.0.0.1:51748 - "GET /openai/api/models HTTP/1.1" 200 OK
INFO: 127.0.0.1:51748 - "GET /litellm/api/v1/models HTTP/1.1" 200 OK
INFO: 127.0.0.1:51754 - "GET /api/v1/modelfiles/ HTTP/1.1" 200 OK
INFO: 127.0.0.1:51754 - "GET /api/v1/prompts/ HTTP/1.1" 200 OK
INFO: 127.0.0.1:51754 - "GET /api/v1/documents/ HTTP/1.1" 200 OK
INFO: 127.0.0.1:51754 - "GET /api/v1/chats/tags/all HTTP/1.1" 200 OK
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO: 127.0.0.1:51754 - "GET /ollama/api/tags HTTP/1.1" 200 OK
INFO: 127.0.0.1:51754 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
INFO: 127.0.0.1:51754 - "GET /api/changelog HTTP/1.1" 200 OK
INFO:apps.ollama.main:get_all_models()
INFO:apps.openai.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO:apps.openai.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO: 127.0.0.1:51754 - "GET /openai/api/models HTTP/1.1" 200 OK
INFO: 127.0.0.1:51759 - "GET /ollama/api/version HTTP/1.1" 500 Internal Server Error
INFO: 127.0.0.1:51754 - "GET /static/favicon.png HTTP/1.1" 200 OK
INFO: 127.0.0.1:51759 - "GET /litellm/api/v1/models HTTP/1.1" 200 OK
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO: 127.0.0.1:51761 - "GET /ollama/urls HTTP/1.1" 200 OK
INFO:apps.ollama.main:get_all_models()
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
ERROR:apps.ollama.main:Connection error: Cannot connect to host localhost:11434 ssl:default [getaddrinfo failed]
INFO: 127.0.0.1:51761 - "GET /ollama/api/version HTTP/1.1" 500 Internal Server Error
INFO: 127.0.0.1:51761 - "GET /litellm/api/model/info HTTP/1.1" 200 OK
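If the failure is indeed `localhost` name resolution (a known quirk when Windows has no usable hosts-file entry for `localhost` and no DNS server is reachable), one workaround sketch is to point Open WebUI at the numeric loopback address, which needs no lookup at all. `OLLAMA_BASE_URL` is Open WebUI's documented setting for the Ollama endpoint; the exact value below assumes the default Ollama port from this report:

```shell
:: Sketch for Windows Command Prompt, assuming Open WebUI honours OLLAMA_BASE_URL.
:: 127.0.0.1 is a numeric address, so no getaddrinfo lookup is performed.
set OLLAMA_BASE_URL=http://127.0.0.1:11434

:: Verify Ollama answers on the loopback address before starting Open WebUI:
curl http://127.0.0.1:11434/api/version
```

Alternatively, ensuring the hosts file maps `localhost` to `127.0.0.1` should make the original `localhost` URL work offline as well.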
