issue: Adding OpenAI endpoint without /v1 path silently fails #6753

Closed
opened 2025-11-11 17:05:06 -06:00 by GiteaMirror · 1 comment
Owner

Originally created by @a-nldisr on GitHub (Oct 24, 2025).

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Pip Install

Open WebUI Version

v0.6.34

Ollama Version (if applicable)

not applicable

Operating System

OSX/Arch Linux

Browser (if applicable)

Chrome Version 141.0.7390.123 (Official Build) (arm64)

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

When adding an external OpenAI connection without the /v1 path, I expect an error message instead of the connection silently not appearing.

Actual Behavior

Context:
LM Studio runs on an Arch Linux box.
Open WebUI runs on my local MacBook.

In the Admin panel, under Connections, I added the LM Studio server as an OpenAI connection, configured as:
http://192.168.1.150

I did not include the /v1 path.

The connection test reports success.

After searching for models, no external connections appear in the interface, as if the connection does not exist.

I expect an error to be shown when a wrong connection URL or path is provided.

Steps to Reproduce

  1. Have an open-webui install
  2. Go to admin panel, connections
  3. Add your remote server, and forget to add the /v1 path: http://192.168.1.150
  4. Test connection, it will pass
  5. Save
  6. Search for remote models, no external model tab will appear.

Correct way of setting the connection:

  1. Open-webui installed
  2. Go to admin panel, connections
  3. Add your remote server, this time including the /v1 path: http://192.168.1.150/v1
  4. Test connection, it will pass
  5. Save
  6. Search for remote models, you now see external model tab
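The difference between the two sequences above is consistent with how OpenAI-compatible clients typically build request URLs: the endpoint path is appended verbatim to the configured base URL, so a base without /v1 quietly targets a route LM Studio does not serve. A minimal illustrative sketch (`models_url` is a hypothetical helper, not Open WebUI's actual code):

```python
# Hypothetical sketch: OpenAI-compatible clients generally append endpoint
# paths directly to the configured base URL.
def models_url(base_url: str) -> str:
    """Join the configured base URL with the /models endpoint path."""
    return base_url.rstrip("/") + "/models"

# Without /v1, the client requests a route the server does not expose:
print(models_url("http://192.168.1.150"))     # http://192.168.1.150/models
print(models_url("http://192.168.1.150/v1"))  # http://192.168.1.150/v1/models
```

Under this assumption, the misconfigured base produces a request to a nonexistent path, which would explain why no external models are discovered even though the connection test passes.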

Logs & Screenshots

Logs after adding the OpenAI connection:

2025-10-24 11:21:55.992 | INFO     | open_webui.config:save:212 - Saving 'ENABLE_OPENAI_API' to the database
2025-10-24 11:21:55.995 | INFO     | open_webui.config:save:212 - Saving 'OPENAI_API_BASE_URLS' to the database
2025-10-24 11:21:55.997 | INFO     | open_webui.config:save:212 - Saving 'OPENAI_API_KEYS' to the database
2025-10-24 11:21:55.998 | INFO     | open_webui.config:save:212 - Saving 'OPENAI_API_CONFIGS' to the database
2025-10-24 11:21:56.000 | INFO     | open_webui.config:save:212 - Saving 'OPENAI_API_CONFIGS' to the database
2025-10-24 11:21:56.001 | INFO     | open_webui.config:save:212 - Saving 'ENABLE_OLLAMA_API' to the database
2025-10-24 11:21:56.004 | INFO     | open_webui.config:save:212 - Saving 'OLLAMA_BASE_URLS' to the database
2025-10-24 11:21:56.006 | INFO     | open_webui.config:save:212 - Saving 'OLLAMA_API_CONFIGS' to the database
2025-10-24 11:21:56.008 | INFO     | open_webui.config:save:212 - Saving 'OLLAMA_API_CONFIGS' to the database
2025-10-24 11:21:56.011 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 127.0.0.1:52044 - "POST /openai/config/update HTTP/1.1" 200
2025-10-24 11:21:56.011 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 127.0.0.1:52050 - "POST /ollama/config/update HTTP/1.1" 200
2025-10-24 11:21:56.018 | INFO     | open_webui.routers.openai:get_all_models:490 - get_all_models()
2025-10-24 11:21:56.033 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 127.0.0.1:52050 - "GET /api/models?refresh=true HTTP/1.1" 200
2025-10-24 11:21:56.040 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 127.0.0.1:52044 - "GET /api/models?refresh=true HTTP/1.1" 200

No call to remote server in the logs.
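The warning this report asks for could be driven by a simple pre-save check on the configured base URL. This is an illustrative sketch only; `missing_api_path` is a hypothetical helper, not part of Open WebUI:

```python
# Hypothetical validation sketch: flag an OpenAI-compatible base URL that has
# no path component, since servers such as LM Studio expose the API under /v1.
from urllib.parse import urlparse

def missing_api_path(base_url: str) -> bool:
    """Return True if the base URL has no path, a common misconfiguration."""
    path = urlparse(base_url).path.rstrip("/")
    return path == ""

print(missing_api_path("http://192.168.1.150"))     # True  -> warn the user
print(missing_api_path("http://192.168.1.150/v1"))  # False
```

A check like this would not block saving (some servers do serve the API at the root), but surfacing a warning would avoid the silent failure described above.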

Screenshot of the verified connection:

Image

Additional Information

No response

GiteaMirror added the bug label 2025-11-11 17:05:06 -06:00
Author
Owner

@tjbck commented on GitHub (Oct 26, 2025):

Intended behaviour; the user must make sure the correct URL has been added.


Reference: github-starred/open-webui#6753