[GH-ISSUE #16358] issue: models from external server not showing in model list #17872
Originally created by @craigers521 on GitHub (Aug 7, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/16358
Check Existing Issues
Installation Method
Pip Install
Open WebUI Version
v0.6.18
Ollama Version (if applicable)
No response
Operating System
macOS
Browser (if applicable)
No response
Confirmation
I have read and followed all instructions in README.md.
Expected Behavior
Adding a verified external OpenAI-compatible server that returns models should allow me to select those models from the dropdown menu.
Actual Behavior
The model is not shown as available to use.
Steps to Reproduce
I added an external OpenAI-compatible server and was able to verify it; the verification response JSON is below in the Logs section.
Since it verified correctly, I would expect the llama3.3 model to be available in the model selection menu, but it is not there.
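For anyone triaging this, one way to rule out a server-side problem is to query the external server's model list directly. A minimal sketch; the base URL, API key, and host below are placeholders, not values from this issue:

```python
# Ask the OpenAI-compatible server which models it advertises.
# If llama3.3 is missing from this output, the problem is on the
# server side rather than in Open WebUI's model list.
import requests

BASE_URL = "http://my-llm-host:8000/v1"  # placeholder base URL
API_KEY = "sk-placeholder"               # placeholder key

resp = requests.get(
    f"{BASE_URL}/models",
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=10,
)
resp.raise_for_status()

# The OpenAI-style models endpoint returns {"object": "list", "data": [...]}.
for model in resp.json().get("data", []):
    print(model["id"])
```

If llama3.3 shows up here but not in the Open WebUI dropdown, the list is being lost somewhere between verification and the UI.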
Logs & Screenshots
Additional Information
No response
@SecureBot commented on GitHub (Aug 7, 2025):
I'm having a very similar issue. Models hosted locally with vLLM are suddenly not showing up. They are listed under the Models tab but won't show in the model config, so they keep dropping for users.
@craigers521 commented on GitHub (Aug 7, 2025):
Manually specifying the model ID in the connection setup ended up working for me. Rather than having it discover every model from the list-models endpoint, I add the specific model tags I want; now the API sends chat completions and I can select the remote model from the list (see the sketch below).
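A minimal sketch of that workaround check, calling the connection directly with an explicit model ID; the base URL, key, and model name are assumptions, not values confirmed in this thread:

```python
# Call the external connection directly with a manually specified model
# ID instead of relying on Open WebUI's model auto-discovery.
from openai import OpenAI

client = OpenAI(
    base_url="http://my-llm-host:8000/v1",  # placeholder external server
    api_key="sk-placeholder",               # placeholder key
)

completion = client.chat.completions.create(
    model="llama3.3",  # the explicitly specified model ID
    messages=[{"role": "user", "content": "ping"}],
)
print(completion.choices[0].message.content)
```

If this succeeds while the model is absent from the dropdown, pinning the model ID in the connection setup as described above is a reasonable stopgap.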
@Eisaichen commented on GitHub (Aug 8, 2025):
Try disabling Cache Base Model List under Admin Panel > Settings > Connections.
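To confirm the toggle took effect, one can re-fetch the model list the UI consumes. A sketch under the assumption that the deployment exposes GET /api/models behind a bearer token (an Open WebUI API key); adjust for your setup:

```python
# Re-fetch Open WebUI's model list after disabling the cache, to check
# whether the external models now appear. The endpoint and auth scheme
# are assumptions about a typical deployment, not confirmed in this thread.
import requests

WEBUI_URL = "http://localhost:8080"  # placeholder Open WebUI address
TOKEN = "owui-placeholder-api-key"   # placeholder API key

resp = requests.get(
    f"{WEBUI_URL}/api/models",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```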
@tjbck commented on GitHub (Aug 8, 2025):
We're unable to reproduce; keep us updated!