Adding support for other models #167

Closed
opened 2025-11-11 14:09:16 -06:00 by GiteaMirror · 6 comments
Owner

Originally created by @mafrasiabi on GitHub (Jan 10, 2024).

I was wondering whether adding support for other models, e.g. via different endpoints such as HuggingFace or AWS SageMaker, is on the roadmap?


@justinh-rahb commented on GitHub (Jan 10, 2024):

At the moment, the only external API format supported is OpenAI's. One workaround is to use the [LiteLLM](https://github.com/BerriAI/litellm) proxy to connect to multiple different external APIs, and then point ollama-webui at litellm. Here's their documentation: https://litellm.vercel.app/docs

And here's an example docker-compose.yaml I use to deploy both ollama-webui and litellm together in a stack:

```
version: '3.9'

services:
  webui:
    image: ghcr.io/ollama-webui/ollama-webui:main
    environment:
      - "OLLAMA_API_BASE_URL=${OLLAMA_API_BASE_URL}"
      - "OPENAI_API_BASE_URL=http://openai-proxy:8000/v1"
      - "OPENAI_API_KEY=${LITELLM_API_KEY}"
    ports:
      - 3000:8080
    volumes:
      - ./ollama-webui/data:/app/backend/data
    restart: unless-stopped

  openai-proxy:
    image: ghcr.io/berriai/litellm:main-latest
    environment:
      - "MASTER_KEY=${LITELLM_API_KEY}"
      - "OPENAI_API_KEY=${OPENAI_API_KEY}"
      - "MISTRAL_API_KEY=${MISTRAL_API_KEY}"
      - "ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}"
    ports:
      - 4000:8000
    volumes:
      - ./litellm/config.yaml:/app/config.yaml
    command: [ "--config", "/app/config.yaml", "--port", "8000" ]
    restart: unless-stopped
```
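The compose file above mounts `./litellm/config.yaml` into the proxy container but doesn't show its contents. A minimal sketch of what that file might look like, using LiteLLM's `model_list` convention (the model names and provider prefixes here are illustrative; check the LiteLLM docs for the providers you actually use):

```yaml
# Example ./litellm/config.yaml for the openai-proxy service above.
# Each entry exposes one model through the OpenAI-compatible endpoint;
# "os.environ/..." tells LiteLLM to read the key from the environment.
model_list:
  - model_name: gpt-4
    litellm_params:
      model: gpt-4
      api_key: os.environ/OPENAI_API_KEY
  - model_name: mistral-medium
    litellm_params:
      model: mistral/mistral-medium
      api_key: os.environ/MISTRAL_API_KEY
  - model_name: claude-2
    litellm_params:
      model: claude-2
      api_key: os.environ/ANTHROPIC_API_KEY
```

With a config like this in place, the models listed under `model_name` should appear in ollama-webui's model selector once it is pointed at the proxy's `/v1` endpoint.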

@tjbck commented on GitHub (Jan 13, 2024):

@justinh-rahb We should add this info to `/docs` as well; I feel like a lot of people will find this useful!


@justinh-rahb commented on GitHub (Jan 13, 2024):

> @justinh-rahb We should add this info to `/docs` as well, I feel like a lot of people will find this useful!

Can I include a `litellm-config.yml.example` file in the repository that provides a basic configuration for the top APIs and models? This would serve as a template for users to copy and modify according to their specific needs. Additionally, would it be possible to create another Docker Compose file that references this example configuration file?

It's worth noting that LiteLLM is a highly versatile Python library, and its server includes a configuration API that allows dynamic configuration of models. Depending on the use case, this API can be integrated directly into the backend or be part of the deployed stack and configured via API.
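As a rough illustration of the dynamic-configuration idea described above, here is a hedged sketch of the payload a backend might send to the proxy to register a model at runtime. The endpoint path, field names, and model alias are assumptions based on LiteLLM's proxy documentation, not something confirmed in this thread; verify against https://litellm.vercel.app/docs before relying on it:

```python
import json

# Hypothetical payload for LiteLLM's dynamic model-registration API.
# "gpt-4-proxy" is an example alias; "os.environ/OPENAI_API_KEY" tells
# the proxy to resolve the key from its own environment.
payload = {
    "model_name": "gpt-4-proxy",
    "litellm_params": {
        "model": "gpt-4",
        "api_key": "os.environ/OPENAI_API_KEY",
    },
}

# Registering it would then be a POST to the proxy, roughly:
#   curl -X POST http://openai-proxy:8000/model/new \
#        -H "Authorization: Bearer $LITELLM_API_KEY" \
#        -H "Content-Type: application/json" \
#        -d "$(python -c 'print(open("payload.json").read())')"
print(json.dumps(payload, indent=2))
```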


@turnercore commented on GitHub (Jan 15, 2024):

This is great info, thanks! It should definitely be in the docs.


@tjbck commented on GitHub (Jan 16, 2024):

I'll merge this issue with open-webui/docs#44; let's continue our discussion there!


@mafrasiabi commented on GitHub (Jan 24, 2024):

@justinh-rahb could you please share your `config.yaml` file? I am struggling to get litellm working; I can't see the models in the list of models.
Thanks,
my email is m.afrasiabi@gmail.com


Reference: github-starred/open-webui#167