[GH-ISSUE #9900] chat complete documentation disagree with implementation #32241

Open
opened 2026-04-22 13:19:21 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @humblemat810 on GitHub (Mar 20, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9900

What is the issue?

For the streaming chat API, clients that use `ChatResponse` define it as:

```python
class ChatResponse(BaseGenerateResponse):
    """
    Response returned by chat requests.
    """

    message: Message
    'Response message.'
```

Here `message` is a required field.

However, the documentation at https://github.com/ollama/ollama/blob/main/docs/api.md shows this example final response:

```json
{
  "model": "llama3.2",
  "created_at": "2023-08-04T19:22:45.499127Z",
  "done": true,
  "total_duration": 4883583458,
  "load_duration": 1334875,
  "prompt_eval_count": 26,
  "prompt_eval_duration": 342546000,
  "eval_count": 282,
  "eval_duration": 4535599000
}
```

which does not include a `message` field at all.
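For comparison, a typical non-final streamed chunk in the same api.md page does carry a `message` (the field values below are illustrative, not copied from the docs):

```json
{
  "model": "llama3.2",
  "created_at": "2023-08-04T08:52:19.385406Z",
  "message": {
    "role": "assistant",
    "content": "The"
  },
  "done": false
}
```

So only the final `"done": true` chunk disagrees with the `ChatResponse` model's required `message`.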

Resolution: either:

  1. Change the documentation so the example final response includes `message`, or
  2. Create a pydantic `FinalResponse` class that has `done = True` and does not require `message`. When streaming, if validation against `ChatResponse` fails, check whether the chunk matches the `FinalResponse` type; accept it if it matches, otherwise raise the validation error.
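Option 2 can be sketched in plain Python (pydantic is left out so the example is self-contained; the `ChatChunk` and `validate_chunk` names, and the fallback rule itself, are hypothetical illustrations of the proposal, not the client's actual API):

```python
from dataclasses import dataclass
from typing import Optional


# Hypothetical stand-ins for the client's pydantic models.
@dataclass
class Message:
    role: str
    content: str


@dataclass
class ChatChunk:
    # Parsed form of one streamed JSON object.
    model: str
    done: bool
    message: Optional[Message] = None


def validate_chunk(chunk: ChatChunk) -> ChatChunk:
    """Require a message on mid-stream chunks, but tolerate a missing
    message on the final chunk (done=True) instead of raising."""
    if chunk.message is None and not chunk.done:
        raise ValueError("message is required for non-final chunks")
    return chunk


# A mid-stream chunk must carry a message...
ok = validate_chunk(
    ChatChunk(model="llama3.2", done=False, message=Message("assistant", "Hi"))
)
# ...while the final stats-only chunk is accepted without one.
final = validate_chunk(ChatChunk(model="llama3.2", done=True))
```

With pydantic this would be the proposed `FinalResponse` fallback: try `ChatResponse` first, and on a `ValidationError` re-validate against a model where `message` is optional and `done` is `True`.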

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the documentation label 2026-04-22 13:19:21 -05:00

Reference: github-starred/ollama#32241