[GH-ISSUE #10894] [bug] ollama executing "" at <.Thinking>: can't evaluate field Thinking in type *api.Message #53674

Closed
opened 2026-04-29 04:27:17 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @whlook on GitHub (May 29, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10894

What is the issue?

Everything was fine a few days ago, but today I got this error: `executing "" at <.Thinking>: can't evaluate field Thinking in type *api.Message`.

  • model: qwen3
  • version: latest (just updated)

Relevant log output

[ 2025-05-29 15:07:23,723 - INFO ] Retrying request to /chat/completions in 0.771083 seconds
[ 2025-05-29 15:07:24,514 - INFO ] HTTP Request: POST http://localhost:11434/v1/chat/completions "HTTP/1.1 500 Internal Server Error"
[error] Error code: 500 - {'error': {'message': 'template: :42:11: executing "" at <.Thinking>: can\'t evaluate field Thinking in type *api.Message', 'type': 'api_error', 'param': None, 'code': None}}

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.8.0

GiteaMirror added the bug label 2026-04-29 04:27:17 -05:00

@jmorganca commented on GitHub (May 29, 2025):

@whlook would it be possible to re-pull `qwen3` with `ollama pull qwen3`? There was a short period while the model was being updated where this error might occur. Let me know if that doesn't solve it and sorry about the error.


@xulisha123 commented on GitHub (May 29, 2025):

+1


@xulisha123 commented on GitHub (May 29, 2025):

at `<.Thinking>`: eval_duration is null


@whlook commented on GitHub (Jun 6, 2025):

> @whlook would it be possible to re-pull `qwen3` with `ollama pull qwen3`? There was a short period while the model was being updated where this error might occur. Let me know if that doesn't solve it and sorry about the error.

Thanks! It's going well after re-pulling `qwen3`.


Reference: github-starred/ollama#53674