[GH-ISSUE #15921] OpenAI Compatibility: Support namespace field in tool calls for Responses API parity #72200

Open
opened 2026-05-05 03:37:28 -05:00 by GiteaMirror · 2 comments

Originally created by @rnett on GitHub (May 1, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/15921

What is the issue?

The OpenAI "Responses API" (used by tools like Codex CLI) has introduced namespaced tool calling. This allows tools to be grouped (e.g., mcp__tilth__.search) and expects a response format that includes a top-level namespace field alongside the tool name.

Currently, Ollama flattens these into a single name string in the /v1/chat/completions output. This prevents clients that strictly follow the Responses API (like Codex) from resolving the tool call correctly, as they expect the namespace field to be preserved if it was provided in the initial tool spec.

Reproduction Details

  1. Provide a tool definition to Ollama that includes a namespace field (or is grouped under a namespace type as seen in the newer Responses API).
  2. Have the model call that tool.
  3. Observe the output JSON.
  4. Actual: {"tool_calls": [{"function": {"name": "mcp__server__tool_name", ...}}]}
  5. Expected (for Responses API parity): {"tool_calls": [{"function": {"name": "tool_name", "namespace": "mcp__server__", ...}}]}
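A minimal sketch of the translation the reporter expects, splitting the flattened name back into its parts. The `mcp__<server>__` prefix convention and the helper name are assumptions for illustration, not Ollama code:

```python
import re

# Hypothetical helper: split a flattened tool-call name into the
# Responses-API shape. Assumes namespaces follow the "mcp__<server>__"
# prefix convention seen in this issue; anything else passes through
# unchanged with no namespace field.
_NS_PATTERN = re.compile(r"^(mcp__[A-Za-z0-9-]+?__)(.+)$")

def split_namespaced(flat_name: str) -> dict:
    """Return {"name": ..., "namespace": ...} if a namespace prefix is found."""
    m = _NS_PATTERN.match(flat_name)
    if m:
        return {"name": m.group(2), "namespace": m.group(1)}
    return {"name": flat_name}

print(split_namespaced("mcp__server__tool_name"))
# → {'name': 'tool_name', 'namespace': 'mcp__server__'}
```

Note the split is ambiguous if tool names may themselves contain `__`; a real implementation would presumably look the namespace up from the tool spec the client sent, rather than parse it out of the string.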

Relevant log output

N/A (This is a protocol compatibility issue in the OpenAI-compatible translation layer).

Environment

  • OS: Windows 11 / Linux
  • Ollama version: 0.4.7+
@MukundaKatta commented on GitHub (May 3, 2026):

The Responses API and chat-completions API differ on this exact point, and Ollama emulates the older chat-completions shape, so technically this isn't a bug against the API it claims to implement. The cleaner solution is probably an opt-in flag like OLLAMA_OPENAI_RESPONSES_PARITY=1 that switches output to the Responses-shaped envelope, leaving the default unchanged for the many existing clients that depend on the current flat name. Codex CLI and similar Responses-native clients can flip the flag.
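The gating proposed above could be sketched as follows. The `OLLAMA_OPENAI_RESPONSES_PARITY` flag is only this comment's proposal, and the `mcp__<server>__` split convention is an assumption; neither exists in Ollama today:

```python
import os

def shape_tool_call(flat_name: str) -> dict:
    """Shape the tool-call "function" object per the proposed opt-in flag.

    OLLAMA_OPENAI_RESPONSES_PARITY=1 (hypothetical) switches to the
    Responses-shaped envelope; the default keeps the flat name so
    existing chat-completions clients are unaffected.
    """
    parity = os.environ.get("OLLAMA_OPENAI_RESPONSES_PARITY") == "1"
    prefix, _, rest = flat_name.partition("__")
    server, sep, tool = rest.partition("__")
    if parity and prefix == "mcp" and sep and tool:
        return {"name": tool, "namespace": f"mcp__{server}__"}
    return {"name": flat_name}  # default: unchanged flat shape
```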

@rnett commented on GitHub (May 3, 2026):

🤦 I put the wrong one in the issue - I am using the /v1/responses API and seeing this. I do not think that the completions API needs to be updated, but the responses API should handle this.

Reference: github-starred/ollama#72200