[GH-ISSUE #6139] Internal Server Error after adding a pipeline connection #14255
Originally created by @voka12345 on GitHub (Oct 12, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/6139
Bug Report
Installation Method
docker run -d --gpus=all -v ollama:/root/.ollama -p 11434:11434 --name ollama --restart always ollama/ollama
docker run -d -p 80:8080 --add-host=host.docker.internal:host-gateway -v ollama-webui:/app/backend/data --name ollama-webui --restart always ghcr.io/open-webui/open-webui:0.3.32
docker run -d -p 9099:9099 --add-host=host.docker.internal:host-gateway -v pipelines:/app/pipelines --name pipelines --restart always ghcr.io/open-webui/pipelines:main
Environment
Open WebUI Version: 0.3.32
Operating System: Ubuntu 22.04.4 LTS
Browser (if applicable): Chrome Version 122.0.6261.111
Expected Behavior:
I am trying to connect Pipelines to my WebUI; adding the connection should succeed.
Actual Behavior:
Internal Server Error after adding the new connection.
Description
Bug Summary:
After adding a Pipelines connection, Open WebUI 0.3.32 returns an Internal Server Error; the container logs show a KeyError: 'models' raised while merging model lists from the configured endpoints.
Reproduction Details
Steps to Reproduce:
1. Start the Ollama, Open WebUI (0.3.32), and Pipelines containers with the docker run commands above.
2. In Open WebUI, add a new connection pointing at http://host.docker.internal:9099.
3. Verify the connection: a 'Server Connection Error' pop-up appears, and the backend logs the Internal Server Error below.
Logs and Screenshots
Browser Console Logs:
Not provided.
Docker Container Logs:
File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__ response = await self.dispatch_func(request, call_next) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/backend/open_webui/main.py", line 884, in inspect_websocket return await call_next(request) ^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 165, in call_next raise app_exc File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro await self.app(scope, receive_or_disconnect, send_no_error) File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__ with collapse_excgroups(): File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__ self.gen.throw(typ, value, traceback) File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups raise exc File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__ response = await self.dispatch_func(request, call_next) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/backend/open_webui/main.py", line 863, in update_embedding_function response = await call_next(request) ^^^^^^^^^^^^^^^^^^^^^^^^ File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 165, in call_next raise app_exc File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro await self.app(scope, receive_or_disconnect, send_no_error) File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__ with collapse_excgroups(): File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__ self.gen.throw(typ, value, traceback) File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups raise exc File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__ response = await self.dispatch_func(request, call_next) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/backend/open_webui/main.py", line 849, in check_url await get_all_models() File "/app/backend/open_webui/main.py", line 914, in get_all_models ollama_models = await get_ollama_models() ^^^^^^^^^^^^^^^^^^^^^^^^^ File "/app/backend/open_webui/apps/ollama/main.py", line 217, in get_all_models "models": merge_models_lists( ^^^^^^^^^^^^^^^^^^^ File "/app/backend/open_webui/apps/ollama/main.py", line 194, in merge_models_lists for idx, model_list in enumerate(model_lists): File "/app/backend/open_webui/apps/ollama/main.py", line 219, in <lambda> lambda response: response["models"] if response else None, responses ~~~~~~~~^^^^^^^^^^ KeyError: 'models'Screenshots/Screen Recordings (if applicable):
Screenshots/Screen Recordings (if applicable):
Not provided.
Additional Information
When checking the connection, a 'Server Connection Error' pop-up appeared. From within the container, Pipelines is reachable: curl http://host.docker.internal:9099 returns {"status":true}.
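For triage, the same reachability check as a Python sketch; the URL mirrors the one configured in this report and is otherwise an assumption about your setup:

import json
import urllib.request

# Probe the Pipelines root endpoint, mirroring the curl check above.
PIPELINES_URL = "http://host.docker.internal:9099"

try:
    with urllib.request.urlopen(PIPELINES_URL, timeout=5) as resp:
        print(json.load(resp))  # the report shows {"status": true} on success
except OSError as exc:
    print(f"Pipelines unreachable: {exc}")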
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!