mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-06 10:58:17 -05:00)
[PR #23483] [MERGED] feat: add /v1/responses proxy endpoint for Ollama #66072
📋 Pull Request Information
Original PR: https://github.com/open-webui/open-webui/pull/23483
Author: @Classic298
Created: 4/7/2026
Status: ✅ Merged
Merged: 4/8/2026
Merged by: @tjbck
Base: dev ← Head: feat/ollama-responses-api-support

📝 Commits (1)
e5ba6c6 feat: add /v1/responses proxy endpoint for Ollama

📊 Changes
1 file changed (+78 additions, -0 deletions)
📝 backend/open_webui/routers/ollama.py (+78 -0)

📄 Description
Ollama recently added Responses API support via its OpenAI-compatible endpoint (/v1/responses). This adds a proxy endpoint to the Ollama router that forwards requests to Ollama's /v1/responses, applying the same model resolution, access control, and prefix_id handling used by the existing /v1/chat/completions and /v1/messages proxies.
This allows API consumers (Codex, Claude Code, etc.) to use the Responses API directly with Ollama-hosted models without requiring a separate OpenAI-compatible connection.
🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.