[GH-ISSUE #15912] Codex fails to call tools #72194

Open
opened 2026-05-05 03:37:07 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @rnett on GitHub (May 1, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/15912

What is the issue?

See https://github.com/openai/codex/issues/19871#issuecomment-4357732299

Relevant log output

Nothing but a bunch of

[GIN] 2026/04/30 - 20:45:35 | 404 |    182.8993ms |       127.0.0.1 | POST     "/v1/responses"
[GIN] 2026/04/30 - 20:45:36 | 404 |    390.9967ms |       127.0.0.1 | POST     "/v1/responses"
[GIN] 2026/04/30 - 20:45:51 | 404 |      1.5486ms |       127.0.0.1 | POST     "/v1/responses"
[GIN] 2026/04/30 - 20:45:51 | 404 |     48.4534ms |       127.0.0.1 | POST     "/v1/responses"
[GIN] 2026/04/30 - 20:45:56 | 200 |    5.2775931s |       127.0.0.1 | POST     "/v1/responses"
[GIN] 2026/04/30 - 20:47:49 | 404 |      1.0368ms |       127.0.0.1 | POST     "/v1/responses"
[GIN] 2026/04/30 - 20:47:49 | 404 |      1.0368ms |       127.0.0.1 | POST     "/v1/responses"

OS

Windows

GPU

No response

CPU

No response

Ollama version

0.22.0

GiteaMirror added the bug label 2026-05-05 03:37:07 -05:00
Author
Owner

@rick-github commented on GitHub (May 3, 2026):

What [context length](https://docs.ollama.com/integrations/codex#usage-with-ollama:~:text=Codex%20requires%20a%20larger%20context%20window.%20It%20is%20recommended%20to%20use%20a%20context%20window%20of%20at%20least%2064k%20tokens.) has been set in the ollama server?
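(Not part of the original comment; a sketch of one way to raise the context window, assuming a local `ollama serve` setup. `OLLAMA_CONTEXT_LENGTH` is Ollama's documented environment variable for the default context length; 65536 matches the "at least 64k tokens" the linked Codex docs recommend.)

```shell
# Hedged sketch: start the Ollama server with a 64k-token context window.
# Assumes the env var is read at server startup; adjust for your shell/OS.
OLLAMA_CONTEXT_LENGTH=65536 ollama serve
```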

Author
Owner

@rnett commented on GitHub (May 3, 2026):

It was 4k, but does that apply to cloud models? That's all I'm using (and I've tried both going directly to the cloud endpoints and via the local server).

Reference: github-starred/ollama#72194