Mirror of https://github.com/open-webui/open-webui.git, synced 2026-03-17 20:43:32 -05:00
Additional external endpoint #182
Originally created by @davecrab on GitHub (Jan 14, 2024).
Is your feature request related to a problem? Please describe.
I'd like to be able to add both OpenAI and Mistral as endpoints so I can use either/both for prompts when needed.
Describe the solution you'd like
The option to add additional endpoints, e.g. OpenAI and Mistral.
Describe alternatives you've considered
I've tried using LiteLLM as the endpoint, but it brings with it all the Ollama models, so every model ends up being listed twice. Filtering out Ollama models from LiteLLM could also be a solution, I suppose.
Thanks for all the work on this project, the speed at which it's developing is amazing!
@justinh-rahb commented on GitHub (Jan 14, 2024):
@davecrab I am working on a PR with a thorough guide in `/docs` for setting up LiteLLM alongside Ollama WebUI, bringing together the various snippets I've dropped into issue reports like this. In the meantime, here's a sample `config.yaml` I use.

You can store this configuration in a file named `config.yaml`. The full documentation for setting up the LiteLLM proxy is here: https://docs.litellm.ai/docs/proxy/configs

Next, you'll need to run Ollama WebUI and LiteLLM using Docker Compose. Here's an example of what the `docker-compose.litellm.yaml` file should look like.

Make sure to replace the placeholders for API keys with your actual keys. Also, ensure that the `config.yaml` file is located in a directory named `litellm`, which should be inside the `ollama-webui` directory. Finally, run the Docker Compose file using the command `docker-compose -f docker-compose.litellm.yml up`.

By following these steps, you should end up with a cleaner and more organized model list in Ollama WebUI.
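The original `config.yaml` attachment did not survive in this copy of the thread. As a rough sketch of what a LiteLLM proxy config for this setup might look like (the specific model names and environment variable names below are illustrative assumptions, not the author's original file):

```yaml
# Illustrative LiteLLM proxy config -- model names and env vars are examples only
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: openai/gpt-3.5-turbo
      api_key: os.environ/OPENAI_API_KEY
  - model_name: mistral-medium
    litellm_params:
      model: mistral/mistral-medium
      api_key: os.environ/MISTRAL_API_KEY
```

Each `model_name` becomes a model entry the WebUI can list, and `os.environ/...` tells LiteLLM to read the key from an environment variable rather than hard-coding it; see the LiteLLM proxy config docs linked above for the authoritative schema.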

Let me know if you have any questions or need further clarification!
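(The `docker-compose.litellm.yaml` referenced above is also missing from this copy of the thread. A minimal sketch under assumed image tags, ports, and paths — none of these values are confirmed by the original comment — might look like:)

```yaml
# Illustrative sketch only -- image tags, ports, and volume paths are assumptions
version: '3.8'
services:
  ollama-webui:
    image: ghcr.io/ollama-webui/ollama-webui:main
    ports:
      - "3000:8080"
    environment:
      # Point the WebUI's OpenAI-compatible endpoint at the LiteLLM proxy
      - OPENAI_API_BASE_URL=http://litellm:8000
      - OPENAI_API_KEY=anything
    depends_on:
      - litellm

  litellm:
    image: ghcr.io/berriai/litellm:main
    volumes:
      # config.yaml lives in ./litellm/ next to this compose file, per the comment above
      - ./litellm/config.yaml:/app/config.yaml
    command: ["--config", "/app/config.yaml", "--port", "8000"]
    environment:
      # Replace these with your actual keys (or export them in your shell)
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - MISTRAL_API_KEY=${MISTRAL_API_KEY}
```

The key idea is that the WebUI talks only to the LiteLLM service over the Compose network, and LiteLLM fans out to OpenAI, Mistral, or any other provider defined in `config.yaml`.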
@davecrab commented on GitHub (Jan 14, 2024):
Oh this is great, thanks! Will give it a go tomorrow
@tjbck commented on GitHub (Jan 16, 2024):
I'll merge this issue with https://github.com/ollama-webui/ollama-webui/issues/432. Let's continue our discussion there!