Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-06 19:08:59 -05:00)
[PR #22589] [CLOSED] fix: propagate Ollama embedding HTTP errors instead of silently returning None #49820
📋 Pull Request Information
Original PR: https://github.com/open-webui/open-webui/pull/22589
Author: @NIK-TIGER-BILL
Created: 3/11/2026
Status: ❌ Closed
Base: main ← Head: fix/embedding-503-propagate-error

📝 Commits (1)
- d1a75d5 fix: propagate Ollama embedding HTTP errors instead of silently returning None

📊 Changes
1 file changed (+10 additions, -2 deletions)
- backend/open_webui/retrieval/utils.py (+10 -2)

📄 Description
Problem
When Ollama returns a non-2xx response (e.g. 503 while the embedding model is reloading after its TTL expires), `generate_ollama_batch_embeddings()` catches the exception and silently returns `None`. The caller then calls `len(None)` or `None[idx]`, raising a confusing `TypeError`/`IndexError` that only surfaces in the server logs. The end user sees the file silently disappear from the Knowledge Collection with no error message.

Fix
`raise_for_status()` is replaced with an explicit check that includes the HTTP status and a truncated response body in the exception message, making the error actionable (`503 Service Unavailable` rather than a generic crash). The exception is no longer swallowed into `None`, so it propagates to `process_file()`, which marks the file as `failed` and returns an informative HTTP 400 to the caller.

Fixes #22571
🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.