[GH-ISSUE #13030] unmarshal: invalid character #8632

Closed
opened 2026-04-12 21:22:23 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @xNeo-git on GitHub (Nov 9, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13030

What is the issue?

I see the following issue in https://github.com/ollama/ollama-python/issues/600
I'm facing something similar: same model, same error:
{"error":"unmarshal: invalid character 'I' looking for beginning of value"} {"context":"llm_stream_chat","model":"kimi-k2:1t-cloud","provider":"ollama","useOpenAIAdapter":false,"streamEnabled":true,"templateMessages":false}

Running the ollama server on WSL + kimi-k2:1t-cloud + Continue on VS Code.
Ollama version is 0.12.10.
All other apps are up to date.
This happens every time I send a message in Continue with the model kimi-k2:1t-cloud selected.

Thanks!

Relevant log output

ERR [Extension Host] Error handling webview message: {
  "msg": {
    "messageId": "d9cce790-cc50-4b46-8aed-960af78b6b11",
    "messageType": "llm/streamChat",
    "data": {
      "completionOptions": {
        "reasoning": false
      },
      "title": "kimi-k2:1t-cloud",
      "messages": [
        {
          "role": "system",
          "content": "<A LOT OF CONTENT>"
        },
        {
          "role": "user",
          "content": "Hello"
        }
      ],
      "messageOptions": {
        "precompiled": true
      }
    }
  }
}

Error: HTTP 500 Internal Server Error from http://127.0.0.1:11434/api/chat

{"error":"unmarshal: invalid character 'I' looking for beginning of value"}

OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-04-12 21:22:23 -05:00

Reference: github-starred/ollama#8632