[GH-ISSUE #14583] API chat stopped working #71515

Closed
opened 2026-05-05 02:00:49 -05:00 by GiteaMirror · 0 comments

Originally created by @niksedk on GitHub (Mar 3, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14583

What is the issue?

OCR via /api/chat/ has stopped working (it worked a week ago).

URL:
http://localhost:11434/api/chat/

BODY:

```json
{
  "model": "glm-ocr:latest",
  "messages": [
    {
      "role": "user",
      "content": "Act as a precise OCR engine. Transcribe every line of text from this image exactly as it appears. The language is English. Maintain the vertical order. Use a single '\n' to separate each line. Do not skip any text. Output only the transcribed text",
      "images": ["iVB..."]
    }
  ],
  "stream": false
}
```

RESULT:

```json
{
  "model": "glm-ocr:latest",
  "created_at": "2026-03-03T09:05:59.1653998Z",
  "message": {
    "role": "assistant",
    "content": "Act as a precise OCR engine. Transcribe every line of text exactly as it appears. The language is English. Maintain the vertical order. Use a single '\n' to separate each line. Do not skip any text. Output only the transcribed text."
  },
  "done": true,
  "done_reason": "stop",
  "total_duration": 625415000,
  "load_duration": 37456800,
  "prompt_eval_count": 61,
  "prompt_eval_duration": 160191500,
  "eval_count": 54,
  "eval_duration": 417013900
}
```


It used to return the actual OCR result, but now it just echoes the prompt back.
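For reference, the failing request can be reproduced with a short script. This is a sketch, not part of the original report: the endpoint URL, model name, and prompt come from the issue, but the base64 image data is a placeholder, since the real data is truncated to "iVB..." in the report.

```python
import json
import urllib.request

# Endpoint and model taken from the issue report.
OLLAMA_CHAT_URL = "http://localhost:11434/api/chat"

def build_payload(image_b64: str) -> dict:
    """Build the same /api/chat request body shown in the issue."""
    return {
        "model": "glm-ocr:latest",
        "messages": [
            {
                "role": "user",
                "content": (
                    "Act as a precise OCR engine. Transcribe every line of "
                    "text from this image exactly as it appears. The language "
                    "is English. Maintain the vertical order. Use a single "
                    "'\n' to separate each line. Do not skip any text. "
                    "Output only the transcribed text"
                ),
                # Ollama expects raw base64 image data here (no data: prefix).
                "images": [image_b64],
            }
        ],
        "stream": False,
    }

if __name__ == "__main__":
    # Placeholder: substitute the real base64-encoded PNG ("iVB...").
    payload = build_payload("iVB...")
    body = json.dumps(payload).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_CHAT_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    # Requires a running Ollama server; prints the JSON response.
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode("utf-8"))
```

If the response's `message.content` repeats the prompt rather than containing transcribed text, the regression described above is reproduced.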

Any ideas would be appreciated.

Relevant log output


OS

ALL

GPU

Yes

CPU

Yes

Ollama version

Latest

GiteaMirror added the bug label 2026-05-05 02:00:49 -05:00