[GH-ISSUE #19188] Model drop-down fails to show models from remote hosts (Ollama, llama.cpp) #18801
Originally created by @d-shehu on GitHub (Nov 14, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/19188
Check Existing Issues
Installation Method
Docker
Open WebUI Version
v0.6.36
Ollama Version (if applicable)
0.12.10
Operating System
Ubuntu 24.04
Browser (if applicable)
Chrome v142.0.744.162
Confirmation
I have read and followed all instructions in the README.md.
Expected Behavior
After reconnecting to a remote Ollama host on another machine on the network, Open WebUI should load the models automatically, and they should appear in the drop-down when creating a new chat.
Actual Behavior
The drop-down does not populate unless I repeatedly verify the connection to the Ollama host and reload the app. The behavior is inconsistent: sometimes the models load automatically, and other times I have to retry by manually verifying the connection, reloading, starting a new thread, and selecting a model from the drop-down.
In all cases the connection is good (verified with the green checkmark) and the models appear in the list in the admin panel. Only the drop-down in the chat fails to populate.
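One way to rule out the UI is to query the remote Ollama host directly. Below is a minimal sketch in Python (the host address is hypothetical; `GET /api/tags` is Ollama's model-listing endpoint):

```python
# Query the remote Ollama host directly, bypassing Open WebUI, to confirm
# it is reachable and returning models. Substitute your own host address.
import json
import urllib.request

OLLAMA_URL = "http://192.168.1.50:11434"  # hypothetical remote Ollama host

with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5) as resp:
    data = json.load(resp)

# /api/tags returns {"models": [{"name": ..., ...}, ...]}
for model in data.get("models", []):
    print(model["name"])
```

If this succeeds while the chat drop-down stays empty, the problem is on the Open WebUI side rather than with the connection itself.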
Steps to Reproduce
Logs & Screenshots
N/A
Additional Information
No response
@tjbck commented on GitHub (Nov 16, 2025):
Unable to reproduce; the inference providers must be reachable for the models to be listed.
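If reachability is the variable, one way to test the intermittent behavior is to poll the host over several minutes and log any failures. A minimal sketch, again assuming a hypothetical host address:

```python
# Poll the remote Ollama host at fixed intervals and log whether each
# request succeeds, to see if the host drops out intermittently.
import time
import urllib.request
from datetime import datetime

OLLAMA_URL = "http://192.168.1.50:11434"  # hypothetical remote Ollama host

for _ in range(30):  # ~5 minutes at 10-second intervals
    stamp = datetime.now().isoformat(timespec="seconds")
    try:
        with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=5):
            print(f"{stamp} reachable")
    except OSError as exc:  # URLError and timeouts are OSError subclasses
        print(f"{stamp} UNREACHABLE: {exc}")
    time.sleep(10)
```

A run with no failures alongside an empty drop-down would point away from reachability as the cause.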