[GH-ISSUE #5370] OpenAI Chat Compatibility Incorrect Prompt Eval #65399

Closed
opened 2026-05-03 21:11:30 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @royjhan on GitHub (Jun 29, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5370

Originally assigned to: @royjhan on GitHub.

What is the issue?

Ollama's OpenAI-compatible endpoint returns 0 for the prompt eval count when the prompt was served from cache, whereas the OpenAI API returns the actual count. As a result, `prompt_tokens` is 0 in the `usage` struct of the response.
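A minimal sketch of the reported behavior, using a simulated chat-completion response body (the JSON values below are illustrative, not taken from a real run): on a repeated request whose prompt hits Ollama's cache, `usage.prompt_tokens` comes back as 0 instead of the real prompt length.

```python
import json

def prompt_tokens(response_body: str) -> int:
    """Extract usage.prompt_tokens from an OpenAI-style chat completion response."""
    return json.loads(response_body)["usage"]["prompt_tokens"]

# Simulated response for a cache-hit request, matching this issue's report:
# prompt_tokens should be the actual prompt token count, but is 0.
cached = '{"usage": {"prompt_tokens": 0, "completion_tokens": 12, "total_tokens": 12}}'
print(prompt_tokens(cached))  # 0
```

An OpenAI-compatible client (or any billing/telemetry code summing `prompt_tokens`) will therefore undercount token usage whenever the prompt cache is hit.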

OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-05-03 21:11:30 -05:00

Reference: github-starred/ollama#65399