[PR #23483] [MERGED] feat: add /v1/responses proxy endpoint for Ollama #66072

Closed
opened 2026-05-06 12:11:16 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/open-webui/open-webui/pull/23483
Author: @Classic298
Created: 4/7/2026
Status: Merged
Merged: 4/8/2026
Merged by: @tjbck

Base: `dev` ← Head: `feat/ollama-responses-api-support`


📝 Commits (1)

  • e5ba6c6 feat: add /v1/responses proxy endpoint for Ollama

📊 Changes

1 file changed (+78 additions, -0 deletions)


📝 backend/open_webui/routers/ollama.py (+78 -0)

📄 Description

Ollama recently added Responses API support via its OpenAI-compatible endpoint (/v1/responses). This adds a proxy endpoint to the Ollama router that forwards requests to Ollama's /v1/responses, applying the same model resolution, access control, and prefix_id handling used by the existing /v1/chat/completions and /v1/messages proxies.
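For orientation, here is a minimal sketch of what such a proxy endpoint can look like. This is not the merged code: the upstream URL, the `MODEL_PREFIX_ID` value, and the httpx-based forwarding are illustrative assumptions, and the real router shares helpers with the existing `/v1/chat/completions` and `/v1/messages` proxies and additionally handles streaming, access control, and multiple upstream connections.

```python
# Hypothetical sketch of a /v1/responses proxy; not the merged implementation.
# Assumptions: a single Ollama upstream, a fixed prefix_id, no streaming.

import httpx
from fastapi import APIRouter, HTTPException, Request
from fastapi.responses import JSONResponse

router = APIRouter()

OLLAMA_BASE_URL = "http://localhost:11434"  # assumption: single upstream
MODEL_PREFIX_ID = "ollama."  # assumption: illustrative prefix_id value


@router.post("/v1/responses")
async def proxy_responses(request: Request):
    payload = await request.json()

    # Strip the connection's prefix_id before forwarding, mirroring the
    # model-ID handling the description attributes to the existing proxies.
    model = payload.get("model", "")
    if MODEL_PREFIX_ID and model.startswith(MODEL_PREFIX_ID):
        payload["model"] = model[len(MODEL_PREFIX_ID):]

    # The real router also applies model resolution and per-user access
    # control here (e.g. via a verified-user dependency); omitted in sketch.

    async with httpx.AsyncClient(timeout=None) as client:
        upstream = await client.post(
            f"{OLLAMA_BASE_URL}/v1/responses", json=payload
        )

    if upstream.status_code != 200:
        raise HTTPException(upstream.status_code, upstream.text)
    return JSONResponse(upstream.json())
```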

This allows API consumers (Codex, Claude Code, etc.) to use the Responses API directly with Ollama-hosted models without requiring a separate OpenAI-compatible connection.
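As a usage illustration only: assuming Open WebUI mounts the Ollama router under `/ollama` (the exact path and model ID below are assumptions, not taken from the PR), a consumer using the official OpenAI SDK could point its base URL at the proxy:

```python
# Hypothetical usage sketch; base_url path and model name are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:3000/ollama/v1",  # Open WebUI's Ollama proxy
    api_key="sk-...",  # an Open WebUI API key
)

response = client.responses.create(
    model="llama3.2",  # any Ollama-hosted model exposed by the connection
    input="Write a haiku about proxy endpoints.",
)
print(response.output_text)
```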

Contributor License Agreement

- [x] By submitting this pull request, I confirm that I have read and fully agree to the Contributor License Agreement (CLA), and I am providing my contributions under its terms.

Note

Deleting the CLA section will lead to immediate closure of your PR and it will not be merged in.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.

GiteaMirror added the pull-request label 2026-05-06 12:11:16 -05:00
Reference: github-starred/open-webui#66072