[GH-ISSUE #10309] Support for OpenAI Responses API (for Codex CLI compatibility) #68826

Closed
opened 2026-05-04 15:22:23 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @vmotta8 on GitHub (Apr 16, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10309

Originally assigned to: @drifkin, @ParthSareen on GitHub.

The new [OpenAI Codex CLI](https://github.com/openai/codex) uses OpenAI’s [/v1/responses endpoint](https://platform.openai.com/docs/guides/responses-vs-chat-completions), enabling agent-like functionality directly from the terminal. Currently, Ollama’s OpenAI compatibility layer supports only the /chat/completions endpoint. Adding /v1/responses support would allow developers to point Codex CLI directly at Ollama.
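For context, a minimal sketch of how the two request shapes differ, with field names taken from OpenAI's published API docs (the model name is a placeholder, not anything Ollama-specific):

```python
import json

def chat_completions_payload(model: str, prompt: str) -> dict:
    # POST /v1/chat/completions -- the shape Ollama's compat layer accepts today:
    # conversation state is carried client-side in the `messages` array.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def responses_payload(model: str, prompt: str) -> dict:
    # POST /v1/responses -- the shape Codex CLI sends: a single `input` field
    # replaces `messages` (it also accepts a structured list of input items).
    return {
        "model": model,
        "input": prompt,
    }

if __name__ == "__main__":
    print(json.dumps(chat_completions_payload("llama3", "hello"), indent=2))
    print(json.dumps(responses_payload("llama3", "hello"), indent=2))
```

A compatibility layer would essentially translate between these shapes (and the corresponding response/streaming formats) on top of Ollama's existing chat handling.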

GiteaMirror added the feature request label 2026-05-04 15:22:23 -05:00

@xyb commented on GitHub (Apr 18, 2025):

duplicate of #9659

<!-- gh-comment-id:2815213781 -->

@pdevine commented on GitHub (Sep 3, 2025):

Let's track this in #9659. It is something we're looking at.

<!-- gh-comment-id:3247205690 -->

Reference: github-starred/ollama#68826