Listing Models from Ooba OpenAI API #1023

Closed
opened 2025-11-11 14:35:48 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @murtaza-nasir on GitHub (May 24, 2024).

Discussed in https://github.com/open-webui/open-webui/discussions/2509

Originally posted by murtaza-nasir May 23, 2024
Hello, open-webui community!

I am currently integrating open-webui with ooba's text-generation-webui:

```bash
sudo docker run -d --network=host -p 3000:8080 -e OPENAI_API_KEY="dummy" -e OPENAI_API_BASE_URL="http://192.168.68.85:5001/v1" -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:dev
```

Initially, the ooba OpenAI API, accessed via `/v1/models`, listed only gpt-3.5 and an embedding model as dummy models. I applied a fix from [this GitHub comment](https://github.com/oobabooga/text-generation-webui/issues/5675#issuecomment-2002696615) hoping to resolve the issue.

After applying the fix, accessing `http://192.168.68.85:5001/v1/models` correctly displays a new list of available models:

```json
{
  "model_names": [
    "LoneStriker_Llama3-ChatQA-1.5-70B-4.0bpw-h6-exl2",
    "LoneStriker_Phi-3-medium-128k-instruct-5.0bpw-h6-exl2",
    "LoneStriker_wolfram_miqu-1-120b-5.0bpw-h6-exl2",
    "LoneStriker_Yi-34B-200K-RPMerge-GPTQ",
    "MaziyarPanahi_Llama-3-70B-Instruct-32k-v0.1",
    "MaziyarPanahi_miqu-1-70b-sf-GPTQ",
    "NEURALDEEPTECH_command-r-gptq",
    "TheBloke_orca_mini_v3_70B-GPTQ",
    "TroyDoesAI_Phi-3-Context-Obedient-RAG",
    "turboderp_command-r-plus-103B-exl2_4.25bpw",
    "turboderp_Mixtral-8x22B-Instruct-v0.1-exl2_4.0bpw"
  ]
}
```
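One possible cause worth checking (an assumption on my part, not confirmed in this thread): the OpenAI-standard `/v1/models` response nests models under a `data` array of objects with `id` fields, whereas the payload above is a bare `model_names` list, so an OpenAI-compatible client may parse it as empty. A minimal Python sketch of the difference, converting one shape into the other (the `to_openai_models` helper and the `owned_by` value are hypothetical, for illustration only):

```python
import json

def to_openai_models(resp: dict) -> dict:
    """Wrap a bare {"model_names": [...]} payload in the OpenAI list schema,
    i.e. {"object": "list", "data": [{"id": ..., "object": "model"}, ...]}."""
    return {
        "object": "list",
        "data": [
            # "owned_by" is an assumed placeholder value, not from the source.
            {"id": name, "object": "model", "owned_by": "local"}
            for name in resp.get("model_names", [])
        ],
    }

# The shape returned by the patched ooba endpoint (abridged from above):
ooba_response = {
    "model_names": [
        "LoneStriker_Llama3-ChatQA-1.5-70B-4.0bpw-h6-exl2",
        "turboderp_Mixtral-8x22B-Instruct-v0.1-exl2_4.0bpw",
    ]
}

print(json.dumps(to_openai_models(ooba_response), indent=2))
```

If this is indeed the mismatch, the dummy list would have displayed because it used the standard schema, while the patched endpoint's custom shape would not.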

However, while the dummy model list displayed correctly in open-webui, the updated list now shows up as empty. Could anyone provide insight into why the actual model list isn't populating in the UI, or suggest further troubleshooting steps? I have tried this with both the main and the dev branches.

Thank you in advance for your help!

