[GH-ISSUE #9100] return final prompt for generating / chat completion #5923

Closed
opened 2026-04-12 17:15:44 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @Robstei on GitHub (Feb 14, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9100

It is currently possible to check the final prompt by setting OLLAMA_DEBUG="1" (see https://github.com/ollama/ollama/issues/6560).

It would be helpful to also get the final prompt back when calling /api/chat.

The response could look like this, although it might be even nicer to return the prompt in the first chunk.

```json
{
  "model": "llama3.2",
  "created_at": "2023-08-04T19:22:45.499127Z",
  "done": true,
  "final_prompt": "...",
  "total_duration": 4883583458,
  "load_duration": 1334875,
  "prompt_eval_count": 26,
  "prompt_eval_duration": 342546000,
  "eval_count": 282,
  "eval_duration": 4535599000
}
```
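To illustrate how a client might consume the proposed field, here is a minimal sketch. Note that `final_prompt` is only the field name suggested in this issue and does not exist in the current /api/chat response; the chunk contents below are fabricated to match the schema above.

```python
import json

def extract_final_prompt(chunks):
    """Scan streamed /api/chat response lines (one JSON object per line)
    and return the proposed "final_prompt" field from the final chunk."""
    for line in chunks:
        msg = json.loads(line)
        if msg.get("done"):
            return msg.get("final_prompt")
    return None

# Fabricated stream: one intermediate chunk, then a final chunk carrying
# the hypothetical field as proposed above.
sample = [
    '{"model": "llama3.2", "done": false, "message": {"role": "assistant", "content": "Hi"}}',
    '{"model": "llama3.2", "done": true, "final_prompt": "<|user|>Hello<|assistant|>"}',
]
print(extract_final_prompt(sample))
```

Returning the prompt only in the `done` chunk (rather than the first) would keep intermediate chunks small, which is why the scan above stops at the final message.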
GiteaMirror added the feature request label 2026-04-12 17:15:44 -05:00

@rick-github commented on GitHub (Feb 14, 2025):

https://github.com/ollama/ollama/issues/6565


Reference: github-starred/ollama#5923