[GH-ISSUE #1379] Open WebUI seems not to be able to access Litellm when protected by master key #12469

Closed
opened 2026-04-19 19:24:24 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @flefevre on GitHub (Apr 1, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/1379

Bug Report

Description

Bug Summary:
When going to the Open WebUI models panel and clicking on the LiteLLM config panel, no models are listed.
Moreover, the log shows:

openwebui | INFO: 172.22.0.1:0 - "GET /litellm/api/model/info HTTP/1.1" 401 Unauthorized

Expected Behavior:

Open WebUI should be able to connect to LiteLLM configured as described in https://docs.litellm.ai/docs/proxy/virtual_keys#custom-auth
and list all models from the config file.
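
For reference, a minimal sketch (in Python, using only the standard library) of how a client would attach the master key when listing models. The endpoint URL and port are assumptions for illustration; adjust them to your deployment.

```python
import urllib.request

# Hypothetical values for illustration; adjust to your deployment.
PROXY_URL = "http://localhost:4000/v1/models"  # assumed LiteLLM proxy endpoint
MASTER_KEY = "sk-1234"  # matches general_settings.master_key in the config

# The LiteLLM proxy expects the key as a Bearer token in the
# Authorization header; a missing or malformed key is rejected with 401.
request = urllib.request.Request(
    PROXY_URL,
    headers={"Authorization": f"Bearer {MASTER_KEY}"},
)

print(request.get_header("Authorization"))  # Bearer sk-1234
```

The request is only constructed here, not sent; the point is the header shape the proxy expects.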

Actual Behavior:

It loads the litellm-config.yaml file:

openwebui  | 15:08:20 - LiteLLM Proxy:DEBUG: Loaded config YAML (api_key and environment_variables are not shown):
openwebui  | {
openwebui  |   "model_list": [
openwebui  |     {
openwebui  |       "model_name": "dolphin-phi",
openwebui  |       "litellm_params": {
openwebui  |         "model": "ollama/dolphin-phi",
openwebui  |         "api_base": "http://ollama:11434",
openwebui  |         "stream": true
openwebui  |       }
openwebui  |     },
openwebui  |     {
openwebui  |       "model_name": "mistral",
openwebui  |       "litellm_params": {
openwebui  |         "model": "vllm/mistralai/Mistral-7B-v0.1",
openwebui  |         "api_base": "http://vllm:5001/v1",
openwebui  |         "api_key": "yourapikey",
openwebui  |         "stream": true
openwebui  |       }
openwebui  |     }
openwebui  |   ],
openwebui  |   "litellm_settings": {
openwebui  |     "drop_params": true,
openwebui  |     "max_budget": 100,
openwebui  |     "budget_duration": "30d",
openwebui  |     "cache": true,
openwebui  |     "cache_params": {
openwebui  |       "type": "redis"
openwebui  |     }
openwebui  |   },
openwebui  |   "general_settings": {
openwebui  |     "master_key": "sk-1234"
openwebui  |   }
openwebui  | }

but it seems to have a problem accessing the master key:

openwebui  |   File "/usr/local/lib/python3.11/site-packages/litellm/proxy/proxy_server.py", line 372, in user_api_key_auth
openwebui  |     assert api_key.startswith("sk-")  # prevent token hashes from being used
openwebui  |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
openwebui  | AssertionError
openwebui  | INFO:     172.22.0.1:0 - "GET /litellm/api/v1/models HTTP/1.1" 200 OK
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."name", "t1"."email", "t1"."role", "t1"."profile_image_url", "t1"."timestamp" FROM "user" AS "t1" WHERE ("t1"."id" = ?) LIMIT ? OFFSET ?', ['dd9eff39-5e17-4ed1-a88f-d514d41d1d22', 1, 0])
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."tag_name", "t1"."user_id", "t1"."modelfile", "t1"."timestamp" FROM "modelfile" AS "t1"', [])
openwebui  | INFO:     172.22.0.1:0 - "GET /api/v1/modelfiles/ HTTP/1.1" 200 OK
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."name", "t1"."email", "t1"."role", "t1"."profile_image_url", "t1"."timestamp" FROM "user" AS "t1" WHERE ("t1"."id" = ?) LIMIT ? OFFSET ?', ['dd9eff39-5e17-4ed1-a88f-d514d41d1d22', 1, 0])
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."command", "t1"."user_id", "t1"."title", "t1"."content", "t1"."timestamp" FROM "prompt" AS "t1"', [])
openwebui  | INFO:     172.22.0.1:0 - "GET /api/v1/prompts/ HTTP/1.1" 200 OK
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."name", "t1"."email", "t1"."role", "t1"."profile_image_url", "t1"."timestamp" FROM "user" AS "t1" WHERE ("t1"."id" = ?) LIMIT ? OFFSET ?', ['dd9eff39-5e17-4ed1-a88f-d514d41d1d22', 1, 0])
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."collection_name", "t1"."name", "t1"."title", "t1"."filename", "t1"."content", "t1"."user_id", "t1"."timestamp" FROM "document" AS "t1"', [])
openwebui  | INFO:     172.22.0.1:0 - "GET /api/v1/documents/ HTTP/1.1" 200 OK
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."name", "t1"."email", "t1"."role", "t1"."profile_image_url", "t1"."timestamp" FROM "user" AS "t1" WHERE ("t1"."id" = ?) LIMIT ? OFFSET ?', ['dd9eff39-5e17-4ed1-a88f-d514d41d1d22', 1, 0])
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."tag_name", "t1"."chat_id", "t1"."user_id", "t1"."timestamp" FROM "chatidtag" AS "t1" WHERE ("t1"."user_id" = ?) ORDER BY "t1"."timestamp" DESC', ['dd9eff39-5e17-4ed1-a88f-d514d41d1d22'])
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."name", "t1"."user_id", "t1"."data" FROM "tag" AS "t1" WHERE (0 = 1)', [])
openwebui  | INFO:     172.22.0.1:0 - "GET /api/v1/chats/tags/all HTTP/1.1" 200 OK
openwebui  | INFO:apps.ollama.main:get_all_models()
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."name", "t1"."email", "t1"."role", "t1"."profile_image_url", "t1"."timestamp" FROM "user" AS "t1" WHERE ("t1"."id" = ?) LIMIT ? OFFSET ?', ['dd9eff39-5e17-4ed1-a88f-d514d41d1d22', 1, 0])
openwebui  | INFO:apps.ollama.main:get_all_models()
openwebui  | INFO:     172.22.0.1:0 - "GET /ollama/api/tags HTTP/1.1" 200 OK
openwebui  | INFO:apps.ollama.main:get_all_models()
openwebui  | INFO:     172.22.0.1:0 - "GET /api/changelog HTTP/1.1" 200 OK
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."name", "t1"."email", "t1"."role", "t1"."profile_image_url", "t1"."timestamp" FROM "user" AS "t1" WHERE ("t1"."id" = ?) LIMIT ? OFFSET ?', ['dd9eff39-5e17-4ed1-a88f-d514d41d1d22', 1, 0])
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."user_id", "t1"."title", "t1"."chat", "t1"."timestamp" FROM "chat" AS "t1" WHERE ("t1"."user_id" = ?) ORDER BY "t1"."timestamp" DESC', ['dd9eff39-5e17-4ed1-a88f-d514d41d1d22'])
openwebui  | INFO:     172.22.0.1:0 - "GET /api/v1/chats/ HTTP/1.1" 200 OK
openwebui  | INFO:     172.22.0.1:0 - "GET /user.png HTTP/1.1" 200 OK
openwebui  | INFO:     172.22.0.1:0 - "GET /ollama/api/version HTTP/1.1" 200 OK
openwebui  | INFO:apps.openai.main:get_all_models()
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."name", "t1"."email", "t1"."role", "t1"."profile_image_url", "t1"."timestamp" FROM "user" AS "t1" WHERE ("t1"."id" = ?) LIMIT ? OFFSET ?', ['dd9eff39-5e17-4ed1-a88f-d514d41d1d22', 1, 0])
openwebui  | INFO:apps.openai.main:get_all_models()
openwebui  | INFO:     172.22.0.1:0 - "GET /openai/api/models HTTP/1.1" 200 OK
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."name", "t1"."email", "t1"."role", "t1"."profile_image_url", "t1"."timestamp" FROM "user" AS "t1" WHERE ("t1"."id" = ?) LIMIT ? OFFSET ?', ['dd9eff39-5e17-4ed1-a88f-d514d41d1d22', 1, 0])
openwebui  | DEBUG:apps.litellm.main:user: id='dd9eff39-5e17-4ed1-a88f-d514d41d1d22' name='Iagen' email='iagen.innov@cea.fr' role='admin' profile_image_url='/user.png' timestamp=1711984127
openwebui  | Traceback (most recent call last):
openwebui  |   File "/usr/local/lib/python3.11/site-packages/litellm/proxy/proxy_server.py", line 372, in user_api_key_auth
openwebui  |     assert api_key.startswith("sk-")  # prevent token hashes from being used
openwebui  |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
openwebui  | AssertionError
openwebui  | INFO:     172.22.0.1:0 - "GET /litellm/api/v1/models HTTP/1.1" 200 OK
openwebui  | INFO:     172.22.0.1:0 - "GET /_app/immutable/chunks/languages.2f0a333b.js HTTP/1.1" 200 OK
openwebui  | INFO:apps.ollama.main:get_all_models()
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."name", "t1"."email", "t1"."role", "t1"."profile_image_url", "t1"."timestamp" FROM "user" AS "t1" WHERE ("t1"."id" = ?) LIMIT ? OFFSET ?', ['dd9eff39-5e17-4ed1-a88f-d514d41d1d22', 1, 0])
openwebui  | INFO:     172.22.0.1:0 - "GET /ollama/urls HTTP/1.1" 200 OK
openwebui  | INFO:apps.ollama.main:get_all_models()
openwebui  | INFO:     172.22.0.1:0 - "GET /ollama/api/version HTTP/1.1" 200 OK
openwebui  | DEBUG:peewee:('SELECT "t1"."id", "t1"."name", "t1"."email", "t1"."role", "t1"."profile_image_url", "t1"."timestamp" FROM "user" AS "t1" WHERE ("t1"."id" = ?) LIMIT ? OFFSET ?', ['dd9eff39-5e17-4ed1-a88f-d514d41d1d22', 1, 0])
openwebui  | DEBUG:apps.litellm.main:user: id='dd9eff39-5e17-4ed1-a88f-d514d41d1d22' name='Iagen' email='iagen.innov@cea.fr' role='admin' profile_image_url='/user.png' timestamp=1711984127
openwebui  | Traceback (most recent call last):
openwebui  |   File "/usr/local/lib/python3.11/site-packages/litellm/proxy/proxy_server.py", line 372, in user_api_key_auth
openwebui  |     assert api_key.startswith("sk-")  # prevent token hashes from being used
openwebui  |     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
openwebui  | AssertionError
openwebui  | INFO:     172.22.0.1:0 - "GET /litellm/api/model/info HTTP/1.1" 401 Unauthorized
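
The 401 is consistent with the assertion in the traceback: `user_api_key_auth` rejects any bearer token that does not start with `sk-`, so the key Open WebUI forwards to the proxy apparently lacks that prefix. A minimal sketch reproducing that guard (simplified from the traceback line, not LiteLLM's actual implementation):

```python
def check_api_key(api_key: str) -> bool:
    """Mimic the guard from user_api_key_auth in the traceback:
    only keys with the 'sk-' prefix are accepted, so that token
    hashes cannot be replayed as keys."""
    return isinstance(api_key, str) and api_key.startswith("sk-")

# The configured master key passes the check ...
print(check_api_key("sk-1234"))        # True
# ... but a hashed or otherwise malformed token fails it,
# which surfaces as the 401 Unauthorized seen in the log.
print(check_api_key("hashed-token"))   # False
```
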

Environment

  • Operating System: [e.g., Windows 10, macOS Big Sur, Ubuntu 20.04]
  • Browser (if applicable): [e.g., Chrome 100.0, Firefox 98.0]

Reproduction Details

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [ ] I have included the browser console logs.
  • [x] I have included the Docker container logs.

@justinh-rahb commented on GitHub (Apr 1, 2024):

It appears there has been a confusion or incorrect usage of our internal LiteLLM connection, which might be due to the mix-up of outdated information from past discussions with the present situation. While your custom configuration file should function correctly in an external LiteLLM proxy server container, it is essential to note that our system is not intended for this kind of setup.


@flefevre commented on GitHub (Apr 1, 2024):

OK, thank you for your answer.
I am closing the issue.


Reference: github-starred/open-webui#12469