Allow anonymous authentication for OpenAI-like API #1331

Closed
opened 2025-11-11 14:43:09 -06:00 by GiteaMirror · 1 comment

Originally created by @JeremyEastham on GitHub (Jun 20, 2024).

Bug Report

Description

Bug Summary:
If the OpenAI API endpoint is configured to point to a local OpenAI-like server, the model list is not populated if an API key is not configured.

Steps to Reproduce:

  1. Set up open-webui normally
  2. Configure an OpenAI-like server (ex: LM Studio)
  3. Configure the OpenAI API endpoint to point to the OpenAI-like server using either OPENAI_API_BASE_URL or Admin Panel -> Connections -> OpenAI API
  4. Do not configure an API key
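Step 3 can also be done via the environment variable; a minimal sketch of the reproduction, assuming LM Studio's default local server address (typically http://localhost:1234 with OpenAI-compatible routes under /v1 — verify against your own setup):

```shell
# Hypothetical values: point Open WebUI at the local OpenAI-like server.
export OPENAI_API_BASE_URL="http://localhost:1234/v1"

# Deliberately leave OPENAI_API_KEY unset to reproduce the report,
# then start the pip-installed server:
open-webui serve
```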

Expected Behavior:
The models list should be populated if a request can be made to the models endpoint, regardless of whether or not an API key is provided. At the very least, an error should be displayed if an API key is not configured.

Actual Behavior:
Fetching the OpenAI models list silently fails if an API key is not configured. Observe that sending an authenticated request to /api/models results in an empty list. The models are also unavailable in the UI.
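The unauthenticated case can be checked against the OpenAI-compatible server directly; a minimal sketch using only the standard library (the base URL and helper name are illustrative, not Open WebUI code). Many local servers accept `/models` requests with no `Authorization` header at all, which is why the list is reachable without an API key:

```python
import urllib.request

# Hypothetical local OpenAI-compatible server (LM Studio's default port).
BASE_URL = "http://localhost:1234/v1"

def build_models_request(base_url: str, api_key: str = "") -> urllib.request.Request:
    """Build a GET request for the OpenAI-compatible /models endpoint.

    The Authorization header is attached only when a key is actually
    configured; local servers commonly accept the request without it.
    """
    headers = {}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return urllib.request.Request(f"{base_url}/models", headers=headers)

# Unauthenticated request -- no Authorization header is sent:
req = build_models_request(BASE_URL)
```

Opening `req` with `urllib.request.urlopen` against a running LM Studio instance returns the model list even with no key configured, which is the behavior this report expects Open WebUI to support.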

Environment

  • Open WebUI Version: v0.3.5
  • Ollama (if applicable): (not applicable)
  • Operating System: Windows 10
  • Browser (if applicable): (not applicable)

Reproduction Details

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [ ] I have included the browser console logs.
  • [x] I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:
(not applicable)

Docker Container Logs:
INFO:apps.openai.main:get_all_models() is printed in the console, indicating that the models-list fetch is attempted, but the request is never actually sent when the API key is missing (verified via the LM Studio logs). The request does succeed when the Check Connection button is pressed in the Admin Panel, but the resulting list of models is not retained.
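The log behavior is consistent with a guard that skips connections lacking a key before any HTTP call is made. This is a hedged sketch of the *suspected* pattern, not Open WebUI's actual code; `fetch_models` is a hypothetical stand-in for the real HTTP call:

```python
def fetch_models(conn):
    # Hypothetical stand-in for the real HTTP GET on {base_url}/models.
    return [{"id": "local-model", "source": conn["base_url"]}]

def get_all_models(connections):
    """Suspected current behavior: any connection without an API key is
    skipped outright, so its models silently never appear."""
    models = []
    for conn in connections:
        if not conn.get("api_key"):
            continue  # request never sent -> silent empty model list
        models.extend(fetch_models(conn))
    return models

# A keyless connection yields no models, matching the report:
conns = [{"base_url": "http://localhost:1234/v1", "api_key": ""}]
```

Attempting the request regardless of the key (and logging the failure if the server rejects it) would produce either the populated list or a visible error, matching the expected behavior above.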

Screenshots (if applicable):

(not applicable)

Installation Method

Manual installation (pip install)

Additional Information

(follow up if needed)

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!


@justinh-rahb commented on GitHub (Jun 20, 2024):

Not a bug, expected behaviour.


Reference: github-starred/open-webui#1331