Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-06 19:08:59 -05:00)
[GH-ISSUE #8043] Open WebUI has problems with VLLM OpenAI compatible API #30504
Originally created by @JohnConnor123 on GitHub (Dec 24, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/8043
Bug Report
Installation Method
Docker
Environment
Open WebUI Version: latest (image ID: 795de2b7c3e0)
Operating System: Ubuntu 24.10
Confirmation:
Expected Behavior:
I expect Open WebUI to list the vLLM-served model so I can select it.
Actual Behavior:
Open WebUI does not detect the OpenAI-compatible API exposed by vLLM.
Description
I have vLLM running in Docker with a locally downloaded LLM at /mnt/weights/saiga_nemo_12b-Q6_K.gguf. I want Open WebUI to offer this LLM for selection. My command to run Open WebUI is the following:
start-webui.sh:
start-vllm.sh:
But I can't select any model at all in the WebUI. How can I fix this? I can provide any additional information.
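The reporter's actual start-vllm.sh and start-webui.sh were not included above, but a typical setup for this scenario looks roughly like the sketch below. Everything here is an assumption for illustration (ports, container names, the `--served-model-name` value, the `host.docker.internal` mapping), not the reporter's real configuration; `OPENAI_API_BASE_URL` and `OPENAI_API_KEY` are the standard Open WebUI environment variables for registering an OpenAI-compatible backend.

```shell
# Hypothetical sketch -- ports, names, and paths are assumptions.

# start-vllm.sh: serve the local GGUF via vLLM's OpenAI-compatible server.
docker run -d --name vllm \
  --gpus all \
  -v /mnt/weights:/mnt/weights \
  -p 8000:8000 \
  vllm/vllm-openai:latest \
  --model /mnt/weights/saiga_nemo_12b-Q6_K.gguf \
  --served-model-name saiga_nemo_12b

# start-webui.sh: point Open WebUI at vLLM's /v1 endpoint.
# host.docker.internal lets the Open WebUI container reach a port
# published on the Docker host.
docker run -d --name open-webui \
  -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:8000/v1 \
  -e OPENAI_API_KEY=dummy \
  ghcr.io/open-webui/open-webui:main
```

Note that the base URL must include the `/v1` suffix; Open WebUI queries `{OPENAI_API_BASE_URL}/models` to populate the model picker, so a missing `/v1` yields an empty model list.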
Open WebUI sends a POST request to this address:
The address is reachable:
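A quick way to confirm that vLLM is up and actually advertising the model is to query its OpenAI-compatible model list endpoint, `GET /v1/models`. The host and port below are assumptions; substitute the address Open WebUI is configured to use.

```shell
# Hypothetical check -- BASE_URL is an assumption, not the reporter's address.
BASE_URL="http://localhost:8000"
# vLLM's OpenAI-compatible server lists its models at /v1/models.
MODELS_URL="${BASE_URL%/}/v1/models"

# A healthy vLLM server returns JSON with a "data" array of model entries:
#   curl -s "$MODELS_URL"
echo "$MODELS_URL"
```

If this returns the model from the host but Open WebUI still shows nothing, the problem is almost certainly the URL as seen from *inside* the Open WebUI container.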
Open WebUI Docker logs:
And there are no requests to the vLLM API (vLLM Docker logs):
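When no requests ever reach vLLM, a common cause with two separate containers is Docker networking: inside the Open WebUI container, `localhost` refers to that container itself, so a base URL like `http://localhost:8000/v1` never leaves it. One way to fix this (a sketch; the container and network names are assumptions) is to put both containers on a shared user-defined network and address vLLM by container name:

```shell
# Hypothetical sketch -- "vllm", "open-webui", and "llm-net" are assumed names.
docker network create llm-net
docker network connect llm-net vllm
docker network connect llm-net open-webui

# Then configure Open WebUI to reach vLLM by container name:
#   OPENAI_API_BASE_URL=http://vllm:8000/v1
```

Alternatively, keep the published host port and use `http://host.docker.internal:8000/v1` together with `--add-host=host.docker.internal:host-gateway` on the Open WebUI container.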