Bug: /openai/v1 endpoint is accessible even though the environment variable for endpoint restriction is set, and filters don't work #3482

Closed
opened 2025-11-11 15:32:37 -06:00 by GiteaMirror · 4 comments

Originally created by @Classic298 on GitHub (Jan 29, 2025).

Bug Report


Installation Method

pip in a venv, on Debian

Environment

  • Open WebUI Version: v0.5.4 (also reproduced on v0.5.7)

  • Operating System: Debian

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [ ] I have included the browser console logs.
  • [ ] I have included the Docker container logs.
  • [x] I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

Core statement: When setting the environment variables accordingly, the endpoint /openai/v1 should not be accessible.

To double-check, I went through the hassle of upgrading from 0.5.4 to 0.5.7 and confirmed that the endpoint remains accessible, despite the most recent release notes mentioning changes to endpoint accessibility for users.

ENABLE_API_KEY_ENDPOINT_RESTRICTION=True
API_KEY_ALLOWED_ENDPOINTS=/api/chat/completions
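
For clarity, the restriction these two variables are meant to express reduces to a simple membership test on the request path (a sketch under my reading of the docs; the helper name is hypothetical, not Open WebUI code):

```python
# Sketch of the restriction semantics implied by the variables above:
# API_KEY_ALLOWED_ENDPOINTS is treated as a comma-separated allow-list
# of request paths. The helper name is hypothetical, not Open WebUI code.
def is_endpoint_allowed(path: str, allowed_endpoints_csv: str) -> bool:
    allowed = [p.strip() for p in allowed_endpoints_csv.split(",")]
    return path in allowed

# With the settings above, only /api/chat/completions should pass:
print(is_endpoint_allowed("/api/chat/completions", "/api/chat/completions"))        # True
print(is_endpoint_allowed("/openai/v1/chat/completions", "/api/chat/completions"))  # False
```

Under this reading, any request to a path under /openai/v1 should be rejected.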

I used a JWT in my tests, but it should not make a difference compared to an API key.
API key access is not enabled, as I do not want a permanently valid API key; JWTs expire after a set time, so I prefer using them.

Furthermore, unlike with /api/chat/completions, the Open WebUI system prompt is not passed to the model and none of the filters are invoked (neither inlet nor outlet; even filters without event emitters are not used).

Actual Behavior:

When the environment variables are set like this, one would assume that only /api/chat/completions is accessible. However, /openai/v1 remains accessible as well.

A second issue I found during testing: filters are not invoked at all, and the system prompt is not passed to the model when going via the /openai/v1 endpoint.

Description

Bug Summary:

  • Issue 1: The endpoint remains accessible even though access to it has been restricted via environment variable.
  • Issue 2: The model's system prompt is not applied and filters are not invoked when the model is called via the /openai/v1 endpoint.

Reproduction Details

Steps to Reproduce:

(The same behavior can be reproduced by sending the requests manually; this is simply how I discovered the issue.)
Download the continue.dev plugin for Visual Studio Code.

Change the config.json to point at your Open WebUI installation, for example:

{
  "models": [
    {
      "title": "Model name",
      "model": "model.string",
      "provider": "openai",
      "apiKey": "jwt key here",
      "apiBase": "http://localhost/openai/v1",
      "useLegacyCompletionsEndpoint": false
    }
  ],
  "tabAutocompleteModel": {
    "title": "Model name",
    "model": "model.string",
    "provider": "openai",
    "apiKey": "jwt here",
    "apiBase": "http://localhost/openai/v1",
    "useLegacyCompletionsEndpoint": false
  },
  "contextProviders": [
    {
      "name": "code",
      "params": {}
    },
    {
      "name": "docs",
      "params": {}
    },
    {
      "name": "diff",
      "params": {}
    },
    {
      "name": "terminal",
      "params": {}
    },
    {
      "name": "problems",
      "params": {}
    },
    {
      "name": "folder",
      "params": {}
    },
    {
      "name": "codebase",
      "params": {}
    }
  ],
  "slashCommands": [
    {
      "name": "share",
      "description": "Export the current chat session to markdown"
    },
    {
      "name": "cmd",
      "description": "Generate a shell command"
    },
    {
      "name": "commit",
      "description": "Generate a git commit message"
    }
  ]
}

Furthermore, I have used system prompts with variables such as the current date and time, as defined in the docs, and the model returns random dates from 2023 instead.

As for the filters: I tested with simple filters that work on the /api/chat/completions endpoint and in the normal WebUI (no event emitters, inlet method only), but those filters are not called when using the /openai/v1 endpoint.
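
For reference, the filters I tested were of this minimal inlet-only shape (a sketch following Open WebUI's filter-function convention; exact signatures vary between versions, so treat the names as assumptions):

```python
# Minimal inlet-only test filter (no event emitters), sketched after
# Open WebUI's filter-function convention; signatures vary by version.
class Filter:
    def inlet(self, body: dict) -> dict:
        # Prepend a marker message so it is obvious whether the filter ran.
        messages = list(body.get("messages", []))
        messages.insert(0, {"role": "system", "content": "inlet-filter-ran"})
        body["messages"] = messages
        return body

    def outlet(self, body: dict) -> dict:
        # Never observed to run in my tests, even on /api/chat/completions.
        return body
```

With such a filter attached to a model, a request through /api/chat/completions shows the marker; the same request through /openai/v1 does not.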

Additional Information

Filters DO work when going via /api/chat/completions, but only if they do not contain event emitters, and only inlet() gets called; outlet() does not get called.

On the /openai/v1 endpoint, however, filters are not called and the system prompt is not included when sending a request to a model.

The same access issue might affect the /ollama/v1 endpoint as well; I did not test it because I have no local models installed. Either way, endpoints should not work unless they are explicitly enabled via the designated environment variable.

Here is a related discussion about this (that outlets do not work): https://github.com/open-webui/open-webui/discussions/8722


@tjbck commented on GitHub (Jan 29, 2025):

API_KEY_ALLOWED_ENDPOINTS is a persistent config; once it's initialised, you can only configure it via the Settings UI.


@Classic298 commented on GitHub (Jan 29, 2025):

> API_KEY_ALLOWED_ENDPOINTS is a persistent config; once it's initialised, you can only configure it via the Settings UI.

Please don't close it already.
I did configure them via the UI as well!

However even with that method, /openai/v1 remains accessible!

And what about the second problem mentioned in this issue?


@Classic298 commented on GitHub (Jan 30, 2025):

> API_KEY_ALLOWED_ENDPOINTS is a persistent config; once it's initialised, you can only configure it via the Settings UI.

@tjbck

[Image] https://github.com/user-attachments/assets/8d324e10-83d7-44c5-9bcc-dcaca9451b5f

Even when set like this in the Settings UI, calls to /openai/v1 still succeed and filters are still not invoked.


@Classic298 commented on GitHub (Jan 30, 2025):

I suspect I have found the cause of one of the two problems:

    # auth by api key
    if token.startswith("sk-"):
        if not request.state.enable_api_key:
            raise HTTPException(
                status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.API_KEY_NOT_ALLOWED
            )

        if request.app.state.config.ENABLE_API_KEY_ENDPOINT_RESTRICTIONS:
            allowed_paths = [
                path.strip()
                for path in str(
                    request.app.state.config.API_KEY_ALLOWED_ENDPOINTS
                ).split(",")
            ]

            if request.url.path not in allowed_paths:
                raise HTTPException(
                    status.HTTP_403_FORBIDDEN, detail=ERROR_MESSAGES.API_KEY_NOT_ALLOWED
                )

        return get_current_user_by_api_key(token)

This is a code section from auth.py.

The API_KEY_ALLOWED_ENDPOINTS variable is only checked for API keys, but not for JWTs.
This could mean a much bigger issue is present, whereby users can bypass all endpoint restrictions using their JWT.

The second if block should be indented one level less so the endpoint restrictions are properly checked for all key types.

I know that the user interface only lets you change the endpoint restrictions when API key access is also enabled, but I checked the config, and the endpoint restrictions remain stored even when API key access itself is turned off. The endpoint restriction should apply to both API keys and JWTs regardless; otherwise users can trivially bypass the restrictions with their JWT. I see no case where I would want to restrict the endpoints ONLY for API keys and not for JWTs.

I think I will make a quick PR for this, should be an easy fix.
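
A sketch of what the fix could look like (hypothetical, simplified from the auth.py excerpt above; the real code raises FastAPI's HTTPException and reads the values from request.app.state.config):

```python
# Hypothetical restructuring of the auth.py excerpt: hoist the
# endpoint-restriction check out of the "sk-" branch so it applies to
# JWT-authenticated requests as well. Simplified stand-ins throughout.
class EndpointNotAllowed(Exception):
    """Stand-in for the HTTPException(403) raised in auth.py."""

def enforce_endpoint_restriction(path: str, restrictions_enabled: bool,
                                 allowed_endpoints_csv: str) -> None:
    if not restrictions_enabled:
        return
    allowed_paths = [p.strip() for p in str(allowed_endpoints_csv).split(",")]
    if path not in allowed_paths:
        raise EndpointNotAllowed(path)

def authenticate(token: str, path: str, restrictions_enabled: bool,
                 allowed_endpoints_csv: str) -> str:
    # The restriction check now runs BEFORE branching on token type,
    # so JWTs can no longer bypass it.
    enforce_endpoint_restriction(path, restrictions_enabled, allowed_endpoints_csv)
    if token.startswith("sk-"):
        return "user-from-api-key"  # placeholder for get_current_user_by_api_key(token)
    return "user-from-jwt"          # placeholder for the JWT validation path
```

With this shape, a JWT-authenticated call to /openai/v1/chat/completions fails the same check an API-key call does.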


Reference: github-starred/open-webui#3482