[GH-ISSUE #6139] Internal Server Error after adding a pipeline connection #52921

Closed
opened 2026-05-05 14:08:05 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @voka12345 on GitHub (Oct 12, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/6139

Bug Report

Installation Method

docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama --restart always ollama/ollama
docker run -d -p 80:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend/data --name ollama-webui --restart always ghcr.io/open-webui/open-webui:0.3.32
docker run -d -p 9099:9099 --add-host=host.docker.internal:host-gateway -v pipelines:/app/pipelines --name pipelines --restart always ghcr.io/open-webui/pipelines:main

Environment

  • Open WebUI Version: 0.3.32

  • Operating System: Ubuntu 22.04.4 LTS

  • Browser (if applicable): Chrome Version 122.0.6261.111

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [x] I have included the browser console logs.
  • [x] I have included the Docker container logs.
  • [x] I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

The pipelines connection is added successfully and the WebUI keeps working normally.

Actual Behavior:

Internal Server Error after adding the new connection.

Description

Bug Summary:
Adding a pipelines connection (http://host.docker.internal:9099) under Admin > Connections causes the backend to crash with KeyError: 'models', after which all requests return Internal Server Error.

Reproduction Details

Steps to Reproduce:

  1. Start the ollama server, open-webui, and pipelines containers in Docker (see Installation Method above)
  2. Go to Admin > Connections, add http://host.docker.internal:9099
  3. Click Save
  4. Click through UI, navigate to Pipelines
  5. The UI first shows a loading state on all pages; after switching between tabs for a short time, it returns InternalServerError on all subsequent requests

Logs and Screenshots

Browser Console Logs:
(Not provided.)

Docker Container Logs:
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    response = await self.dispatch_func(request, call_next)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/backend/open_webui/main.py", line 884, in inspect_websocket
    return await call_next(request)
           ^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 165, in call_next
    raise app_exc
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    with collapse_excgroups():
File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    raise exc
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    response = await self.dispatch_func(request, call_next)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/backend/open_webui/main.py", line 863, in update_embedding_function
    response = await call_next(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 165, in call_next
    raise app_exc
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    with collapse_excgroups():
File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    raise exc
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    response = await self.dispatch_func(request, call_next)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/backend/open_webui/main.py", line 849, in check_url
    await get_all_models()
File "/app/backend/open_webui/main.py", line 914, in get_all_models
    ollama_models = await get_ollama_models()
                    ^^^^^^^^^^^^^^^^^^^^^^^^^
File "/app/backend/open_webui/apps/ollama/main.py", line 217, in get_all_models
    "models": merge_models_lists(
              ^^^^^^^^^^^^^^^^^^^
File "/app/backend/open_webui/apps/ollama/main.py", line 194, in merge_models_lists
    for idx, model_list in enumerate(model_lists):
File "/app/backend/open_webui/apps/ollama/main.py", line 219, in <lambda>
    lambda response: response["models"] if response else None, responses
                     ~~~~~~~~^^^^^^^^^^
KeyError: 'models'
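The final frame points at the likely failure mode: when an endpoint answers but its body has no "models" key (the pipelines root returns `{"status": true}`), the response dict is truthy, so the `if response` guard passes and the subscript raises KeyError. A minimal sketch of that pattern and a defensive variant (function name is illustrative, not Open WebUI's actual code or fix):

```python
def extract_models(response):
    """Illustrative stand-in for the lambda shown in the traceback."""
    # Original pattern: crashes on bodies like {"status": True}
    #   return response["models"] if response else None   # KeyError: 'models'
    # Defensive variant: a missing "models" key is treated like no response.
    return response.get("models") if isinstance(response, dict) else None

ollama_style = {"models": [{"name": "llama3"}]}  # shape Ollama's tag listing uses
pipelines_root = {"status": True}                # body the pipelines root returned

print(extract_models(ollama_style))    # [{'name': 'llama3'}]
print(extract_models(pipelines_root))  # None
```

This suggests the connection was added as an Ollama/OpenAI endpoint, so the backend tried to parse the pipelines status body as a model list.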
Screenshots/Screen Recordings (if applicable):

![image](https://github.com/user-attachments/assets/28f9599a-2a4a-4bb8-870e-8db3a1e61c04)

Additional Information

When testing the connection, a 'Server Connection Error' pop-up appeared. From within the container, pipelines are reachable: `curl http://host.docker.internal:9099` returns `{"status":true}`.

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!


Reference: github-starred/open-webui#52921