[PR #22589] [CLOSED] fix: propagate Ollama embedding HTTP errors instead of silently returning None #42402

Closed
opened 2026-04-25 14:18:41 -05:00 by GiteaMirror · 0 comments
Owner

📋 Pull Request Information

Original PR: https://github.com/open-webui/open-webui/pull/22589
Author: @NIK-TIGER-BILL
Created: 3/11/2026
Status: Closed

Base: main ← Head: fix/embedding-503-propagate-error


📝 Commits (1)

  • d1a75d5 fix: propagate Ollama embedding HTTP errors instead of silently returning None

📊 Changes

1 file changed (+10 additions, -2 deletions)


📝 backend/open_webui/retrieval/utils.py (+10 -2)

📄 Description

Problem

When Ollama returns a non-2xx response (e.g. 503 while the embedding model is reloading after its TTL expires), agenerate_ollama_batch_embeddings() catches the exception and silently returns None (#22571).

The caller then tries len(None) or None[idx], raising a confusing TypeError / IndexError that only appears in server logs. The end user sees the file silently disappear from the Knowledge Collection with no error message.
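The failure mode above can be reduced to a minimal sketch (hypothetical function names, not the actual utils.py code): a helper that swallows its error and returns None makes the caller crash far from the real cause, with a message that says nothing about the original HTTP failure.

```python
def generate_batch_embeddings(texts):
    """Stand-in for the buggy helper: any error is swallowed."""
    try:
        # Simulate the Ollama 503 while the model reloads.
        raise RuntimeError("HTTP 503: model is reloading")
    except Exception:
        return None  # ← the bug: the error disappears here

def process_file(texts):
    """Stand-in for the caller."""
    embeddings = generate_batch_embeddings(texts)
    # Crashes with "object of type 'NoneType' has no len()" — a
    # TypeError that gives no hint the real problem was a 503.
    return len(embeddings)
```

Running process_file(["a", "b"]) raises the confusing TypeError described above, which only surfaces in server logs.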

Root cause

```python
# Before
except Exception as e:
    log.exception(f"Error generating ollama batch embeddings: {e}")
    return None   # ← swallows the error; caller crashes on None[idx]
```

Fix

  1. Replace the implicit raise_for_status() with an explicit check that includes the HTTP status and a truncated response body in the exception message — making the error actionable (503 Service Unavailable vs a generic crash).
  2. Re-raise the exception instead of returning None, so it propagates to process_file(), which marks the file as failed and returns an informative HTTP 400 to the caller.
```python
# After
if not r.ok:
    body = await r.text()
    raise Exception(f"Ollama embedding API returned HTTP {r.status}: {body[:200]}")
...
except Exception as e:
    log.exception(...)
    raise   # ← propagate so process_file() can mark status=failed
```
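The fixed pattern can be exercised end to end with a small self-contained sketch. FakeResponse is a hypothetical stand-in for an aiohttp response (exposing status, ok, and an async text(), as aiohttp's ClientResponse does), so the status check and re-raise can run without a live Ollama server:

```python
import logging

log = logging.getLogger(__name__)

class FakeResponse:
    """Hypothetical stand-in for an aiohttp ClientResponse."""
    def __init__(self, status, body):
        self.status = status
        self._body = body

    @property
    def ok(self):
        return self.status < 400

    async def text(self):
        return self._body

async def embed_batch(r):
    """Sketch of the fixed flow: explicit check, then re-raise."""
    try:
        if not r.ok:
            body = await r.text()
            # Status and truncated body make the error actionable.
            raise Exception(
                f"Ollama embedding API returned HTTP {r.status}: {body[:200]}"
            )
        return await r.text()
    except Exception:
        log.exception("Error generating ollama batch embeddings")
        raise  # propagate so the caller can mark the file as failed
```

With a 503 response, embed_batch raises an exception whose message carries the status code and body, instead of returning None and deferring the crash to the caller.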

Fixes #22571


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

Reference: github-starred/open-webui#42402