How to load a default vLLM model? #2241

Closed
opened 2025-11-11 15:03:13 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @citizenofathens on GitHub (Sep 30, 2024).

I'm using Open WebUI with vLLM.
First I start vLLM with: python3 -m vllm.entrypoints.openai.api_server --model="/root/.cache/huggingface/facebook/opt-125m"
Then, in the Open WebUI administrator panel, I set the OpenAI API URL to the vLLM serving endpoint, set the API key to '' (an empty string), and save.

How can I avoid repeating this process?

As shown above, I currently have to go into the administrator panel, set the URL and API key, and save, every time I run the vLLM server.

Instead, I would like vLLM models to be selectable as soon as Open WebUI starts, the way Ollama models are, with no need to set and save the connection manually in the administrator panel each time.

I have tried this already: I can change the default OpenAI API address to the vLLM API address, but the API key is not saved and the vLLM model is not registered automatically.
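One way to preconfigure the connection, rather than entering it in the admin panel each time, is to pass the OpenAI-compatible endpoint to Open WebUI via environment variables at startup. This is a sketch under assumptions: OPENAI_API_BASE_URL and OPENAI_API_KEY are documented Open WebUI settings, but the host, port, and key value below are placeholders (vLLM serves on port 8000 by default), so adjust them for your setup.

```shell
# Start vLLM's OpenAI-compatible server (same model path as above).
python3 -m vllm.entrypoints.openai.api_server \
    --model="/root/.cache/huggingface/facebook/opt-125m" &

# Launch Open WebUI with the vLLM endpoint preconfigured. The
# OPENAI_API_BASE_URL / OPENAI_API_KEY variables seed the OpenAI
# connection settings, so the admin panel does not need to be
# filled in by hand after each restart.
# NOTE: host.docker.internal and the key value "vllm" are assumptions
# for a Docker-on-one-host setup; vLLM ignores the key by default.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL="http://host.docker.internal:8000/v1" \
  -e OPENAI_API_KEY="vllm" \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

With the endpoint set this way, Open WebUI queries the server's /v1/models list on startup, so the vLLM model should appear in the model selector without manual registration.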

Reference: github-starred/open-webui#2241