mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-12 01:54:38 -05:00)
Adding support for other models #167
Originally created by @mafrasiabi on GitHub (Jan 10, 2024).
I was wondering whether adding support for other models, e.g. from different endpoints like HuggingFace or AWS SageMaker, is on the roadmap?
@justinh-rahb commented on GitHub (Jan 10, 2024):
At the moment, the only external API format supported is OpenAI's. One workaround is to use the LiteLLM proxy to connect to multiple different external APIs, and then connect ollama-webui to litellm. Here's their documentation: https://litellm.vercel.app/docs
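(Editor's note: the compose file referenced below is not preserved in this mirror. A minimal sketch of such a LiteLLM-proxy stack is shown here for illustration; the image tags, ports, and environment variable names are assumptions, not the commenter's actual file — check each project's current documentation before use.)

```yaml
# Hypothetical stack: ollama-webui talks to LiteLLM's OpenAI-compatible proxy,
# which fans out to external providers. Names/ports are placeholders.
version: "3.8"
services:
  ollama-webui:
    image: ghcr.io/ollama-webui/ollama-webui:main
    ports:
      - "3000:8080"
    environment:
      # Point the webui's OpenAI-compatible client at the litellm service
      - OPENAI_API_BASE_URL=http://litellm:8000
      - OPENAI_API_KEY=anything  # litellm handles the real provider keys
    depends_on:
      - litellm

  litellm:
    image: ghcr.io/berriai/litellm:main
    command: ["--config", "/app/config.yaml", "--port", "8000"]
    volumes:
      # Model routing is defined in a litellm config file (see LiteLLM docs)
      - ./litellm-config.yml:/app/config.yaml
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
```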
And here's an example `docker-compose.yaml` I use to deploy both ollama-webui and litellm together in a stack:

@tjbck commented on GitHub (Jan 13, 2024):
@justinh-rahb We should add this info to `/docs` as well, I feel like a lot of people will find this useful!

@justinh-rahb commented on GitHub (Jan 13, 2024):
Can I include a `litellm-config.yml.example` file in the repository that provides a basic configuration for the top APIs and models? This would serve as a template for users to copy and modify according to their specific needs. Additionally, would it be possible to create another Docker Compose file that references this example configuration file?

It's worth noting that LiteLLM is a highly versatile Python library, and its server includes a configuration API that allows dynamic configuration of models. Depending on the use case, this API can be integrated directly into the backend or be part of the deployed stack and configured via API.
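(Editor's note: as an illustration of what such a `litellm-config.yml.example` might contain, here is a sketch based on LiteLLM's `model_list` schema. The model names, endpoint names, and environment-variable references are placeholders; consult the LiteLLM docs for the current provider syntax.)

```yaml
# Sketch of a LiteLLM proxy config covering the providers discussed in this
# thread. Each entry maps a name exposed by the proxy to a backend model.
model_list:
  # OpenAI, with the key read from the environment
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo
      api_key: os.environ/OPENAI_API_KEY

  # HuggingFace-hosted model (placeholder model path)
  - model_name: mistral-7b
    litellm_params:
      model: huggingface/mistralai/Mistral-7B-Instruct-v0.2
      api_key: os.environ/HUGGINGFACE_API_KEY

  # AWS SageMaker endpoint (placeholder endpoint name; AWS credentials
  # are picked up from the environment)
  - model_name: sagemaker-llama2
    litellm_params:
      model: sagemaker/my-endpoint-name
```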
@turnercore commented on GitHub (Jan 15, 2024):
This is great info, thanks! It should for sure be in the docs
@tjbck commented on GitHub (Jan 16, 2024):
I'll merge this issue with open-webui/docs#44. Let's continue our discussion there!
@mafrasiabi commented on GitHub (Jan 24, 2024):
@justinh-rahb could you please share your config.yaml file? I am struggling to get litellm working; the models don't show up in the model list.
Thanks
my email is m.afrasiabi@gmail.com