Can't select model without ollama server present #625

Closed
opened 2025-11-11 14:27:45 -06:00 by GiteaMirror · 2 comments
Owner

Originally created by @jmtatsch on GitHub (Apr 10, 2024).

Bug Report

Description

Bug Summary:
When no Ollama server is present, get_all_models does not seem to return properly,
and the models available from the OpenAI-compatible interface (/v1/models) are not queried.

```
open-webui          | INFO:apps.ollama.main:get_all_models()
open-webui          | ERROR:apps.ollama.main:Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection timed out]
open-webui          | INFO:apps.ollama.main:get_all_models()
open-webui          | ERROR:apps.ollama.main:Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection timed out]
open-webui          | INFO:apps.ollama.main:get_all_models()
open-webui          | ERROR:apps.ollama.main:Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection timed out]
open-webui          | INFO:apps.ollama.main:get_all_models()
open-webui          | ERROR:apps.ollama.main:Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection timed out]
open-webui          | ERROR:apps.ollama.main:Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection timed out]
open-webui          | ERROR:apps.ollama.main:Connection error: Cannot connect to host host.docker.internal:11434 ssl:default [Connection timed out]
```

Steps to Reproduce:
Stop your Ollama server and try to select a model.

Expected Behavior:
I expect the OpenAI-compatible models to still be shown when no Ollama server is present.
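The expected behavior above amounts to querying each backend independently, so that one unreachable Ollama host cannot block the OpenAI-compatible model list. A minimal sketch of that idea (this is not Open WebUI's actual code; the endpoint URLs and response shapes are assumptions based on the Ollama and OpenAI APIs):

```python
import json
import urllib.request
import urllib.error

def fetch_models(url, timeout=3.0):
    """Return one backend's model list, or [] if it is unreachable.

    Failing soft here means a dead backend contributes nothing
    instead of blocking the whole aggregation.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            data = json.load(resp)
    except (urllib.error.URLError, TimeoutError, json.JSONDecodeError, OSError):
        return []
    if not isinstance(data, dict):
        return []
    # Ollama's /api/tags returns {"models": [...]};
    # an OpenAI-style /v1/models returns {"data": [...]}.
    return data.get("models") or data.get("data") or []

def get_all_models(urls):
    """Aggregate models across all configured backends."""
    return [m for url in urls for m in fetch_models(url, timeout=2.0)]
```

With this structure, a connection timeout against `host.docker.internal:11434` would simply yield an empty list for that backend while the other backends' models still appear.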

Author
Owner

@jmtatsch commented on GitHub (Apr 10, 2024):

It was a case of the OpenAI base URL/key not being persisted properly. After putting them in the environment it works.
Nevertheless, the UI freezes or goes blank whenever models are being collected but there is nothing to collect.
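Persisting the OpenAI settings via the environment, as described above, might look like the following. This is a hypothetical invocation: `OPENAI_API_BASE_URL` and `OPENAI_API_KEY` are Open WebUI's documented environment variables, but the key and port values are placeholders.

```shell
# Pass the OpenAI-compatible endpoint and key as environment variables
# so they survive restarts instead of relying on UI-side persistence.
docker run -d \
  -e OPENAI_API_BASE_URL="https://api.openai.com/v1" \
  -e OPENAI_API_KEY="sk-..." \
  -p 3000:8080 \
  ghcr.io/open-webui/open-webui:main
```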

Author
Owner

@simonschmidt commented on GitHub (Apr 17, 2024):

I see this too. I ended up setting OLLAMA_BASE_URLS to something non-empty but invalid so it fails early; I used " ".

Instead of the drawn-out connection errors trying to reach whatever default Ollama URL is configured, it now fails almost immediately with:

GET /ollama/api/version HTTP/1.1" 500 Internal Server Error

Not sure why it helped, but now the LiteLLM models all display properly in the UI every time. They always returned correctly over the API, but the UI dropdown was intermittent.

Reference: github-starred/open-webui#625