Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-17 12:31:06 -05:00)
Can't select model without ollama server present #625
Originally created by @jmtatsch on GitHub (Apr 10, 2024).
Bug Report
Description
Bug Summary:
When no ollama server is present, get_all_models does not seem to return properly, and the models available from the OpenAI-compatible interface (/v1/models) are never queried.
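The failure mode described above can be sketched as follows. This is a minimal illustration with hypothetical helper names (get_ollama_models, get_openai_models stand in for Open WebUI's actual backend code): if the aggregator concatenates results directly, an exception from the unreachable Ollama backend prevents the OpenAI models from ever being returned; catching per-backend errors avoids that.

```python
# Hypothetical stand-ins for the per-backend model fetchers; these are
# NOT Open WebUI's real functions, only a sketch of the failure mode.

def get_ollama_models():
    # Simulates the Ollama server being down/unreachable.
    raise ConnectionError("ollama server not reachable")

def get_openai_models():
    # Simulates a healthy OpenAI-compatible /v1/models response.
    return [{"id": "gpt-4", "source": "openai"}]

def get_all_models_fragile():
    # If the Ollama call raises, the OpenAI models are never queried.
    return get_ollama_models() + get_openai_models()

def get_all_models_robust():
    # One unreachable backend should not hide the others' models.
    models = []
    for fetch in (get_ollama_models, get_openai_models):
        try:
            models.extend(fetch())
        except Exception:
            continue
    return models
```

With the robust variant, the OpenAI-compatible models still appear even while the Ollama fetch fails, which matches the expected behavior stated above.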
Steps to Reproduce:
Stop your ollama server and try to select a model.
Expected Behavior:
I expect the OpenAI-compatible models to be shown even without an ollama server present.
@jmtatsch commented on GitHub (Apr 10, 2024):
It was a case of the OpenAI base URL/key not being persisted properly. After putting them in the environment it works.
Nevertheless, the UI freezes or goes blank whenever models are being collected but there is nothing to collect.
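The environment fix mentioned above might look like the following. Note this is an assumption about the exact variable names (OPENAI_API_BASE_URL and OPENAI_API_KEY are the names commonly used for Open WebUI's OpenAI-compatible backend; verify against the version you run):

```shell
# Assumed Open WebUI env var names -- check your version's docs.
export OPENAI_API_BASE_URL="https://api.openai.com/v1"
export OPENAI_API_KEY="sk-placeholder"   # placeholder, use your real key
```

Setting these in the environment (e.g. in the docker run command or compose file) sidesteps any UI-side persistence problem.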
@simonschmidt commented on GitHub (Apr 17, 2024):
I see this too. I ended up setting OLLAMA_BASE_URLS to something non-empty but invalid to make it fail early; I used " ". Instead of the drawn-out connection errors from trying to reach whatever default ollama URL is configured, it now crashes almost right away with:
Not sure why it helped, but now the LiteLLM models all display properly in the UI all the time; they always returned correctly over the API, but the UI dropdown was intermittent.
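The workaround from the comment above, as a config fragment. The single-space value is deliberately invalid so the Ollama lookup fails fast rather than hanging on connection timeouts (OLLAMA_BASE_URLS is Open WebUI's env var for the Ollama URL list; the fail-fast effect is the commenter's observation, not documented behavior):

```shell
# Deliberately invalid, non-empty value so the Ollama backend errors out
# immediately instead of waiting on connection timeouts.
export OLLAMA_BASE_URLS=" "
```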