mirror of https://github.com/open-webui/open-webui.git — synced 2026-05-07 03:18:23 -05:00
How to load default vllm model? #2241
Originally created by @citizenofathens on GitHub (Sep 30, 2024).
I'm using Open WebUI and vLLM.

First I start the vLLM OpenAI-compatible server:

```
python3 -m vllm.entrypoints.openai.api_server --model="/root/.cache/huggingface/facebook/opt-125m"
```

Then, in the Open WebUI administrator panel, I set the OpenAI API URL to the vLLM serving endpoint and the API key to '' (empty string), and save.
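As a side note, the vLLM side can be checked independently of Open WebUI: the OpenAI-compatible server exposes a `/v1/models` endpoint. A quick sanity check, assuming the default host and port (adjust if you passed `--host`/`--port` when starting the server):

```shell
# List the models the vLLM server is currently serving.
# Default bind address assumed; change host/port if you overrode them.
curl http://localhost:8000/v1/models
```

If this returns a JSON model list, the serving side is fine and the remaining problem is only the Open WebUI connection settings.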
How can I get rid of this manual step?
As shown above, I can currently go into the administrator panel, set the URL and API key each time, save, and then use the vLLM server.

But I would like the vLLM models to be selectable as soon as Open WebUI starts, the way Ollama models are, with no need to manually set and save the connection in the administrator panel each time.

I'm trying this now: I can change the default OpenAI API address to the vLLM API address, but the API key is not saved and the vLLM model is not automatically registered.
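For what it's worth, one way to pre-configure the connection instead of entering it in the admin panel is through environment variables: Open WebUI reads `OPENAI_API_BASE_URL` and `OPENAI_API_KEY` at startup. A sketch assuming a Docker deployment and that vLLM listens on port 8000 of the host (the `host.docker.internal` hostname is an assumption that depends on your Docker setup; on Linux you may need the host's IP or `--add-host` instead):

```shell
# Start Open WebUI with the OpenAI-compatible connection pre-set,
# so the vLLM endpoint does not have to be saved in the admin panel each time.
# OPENAI_API_KEY is a dummy value, since vLLM does not verify the key
# unless it was started with --api-key.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL="http://host.docker.internal:8000/v1" \
  -e OPENAI_API_KEY="none" \
  -v open-webui:/app/backend/data \
  --name open-webui \
  ghcr.io/open-webui/open-webui:main
```

Whether this also makes the vLLM models appear automatically in the model selector is the part I'm unsure about; in my understanding Open WebUI queries `/v1/models` on the configured endpoint and lists whatever it returns.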