[GH-ISSUE #13017] Bug: 500 Internal Server Error: unmarshal: invalid character 'I' with 'kimi-k2-thinking:cloud' model #70678

Closed
opened 2026-05-04 22:31:21 -05:00 by GiteaMirror · 12 comments
Owner

Originally created by @TJbarros on GitHub (Nov 8, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13017

What is the issue?

Summary
This is a bug report for the kimi-k2-thinking:cloud model endpoint. The ollama.com server is consistently returning a 500 Internal Server Error.

This error response is not valid JSON, which causes the ollama client to fail with an unmarshal error.

🚶 Steps to Reproduce
Run the kimi-k2-thinking:cloud model:

```bash
ollama run kimi-k2-thinking:cloud
```

After connecting, send any prompt. (I used `qual a sua capacidade de programar em python, seja cincero e objetivo ?`, roughly "what is your ability to program in Python, be sincere and objective?", but any prompt triggers the error.)

💥 Actual Behavior (The Error)
The client immediately prints the following error, indicating a non-JSON response was received from the server.

```console
Connecting to 'kimi-k2-thinking' on 'ollama.com'

>>> qual a sua capacidade de programar em python, seja cincero e objetivo ?
Error: 500 Internal Server Error: unmarshal: invalid character 'I' looking for beginning of value
```
🎯 Error Analysis
The key part of the error is: `unmarshal: invalid character 'I' looking for beginning of value`.

This confirms that the ollama client received a 500 Internal Server Error and the body of that error response was not JSON. The client expected a JSON value (starting with `{` or `[`) but instead received a response body starting with the character `I`.

This is almost certainly a plain-text or HTML error page from the server (e.g., "Internal Server Error" or "Invalid...").

This suggests the kimi-k2-thinking:cloud endpoint is failing and not correctly formatting its error response as a JSON object that the client can understand.

Environment
Ollama Version: 0.12.10
OS: Windows / PowerShell

Relevant log output

No response
OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-05-04 22:31:21 -05:00
Author
Owner

@benhot commented on GitHub (Nov 8, 2025):

same issues

Image: https://github.com/user-attachments/assets/1ed2ea6d-30eb-4308-9a22-75e854c764f7
Author
Owner

@josephlugo commented on GitHub (Nov 8, 2025):

Yep

Author
Owner

@rick-github commented on GitHub (Nov 8, 2025):

This should be fixed now.

```console
$ ollama run kimi-k2-thinking:cloud hello
...
Hello! How can I help you today?
```
Author
Owner

@benhot commented on GitHub (Nov 9, 2025):

fixed

Author
Owner

@rick-github commented on GitHub (Nov 9, 2025):

It appears this is an intermittent ongoing issue. The `invalid character 'I'` is a backend error that's causing `Internal Server Error` to be returned, and because it's not properly formed JSON, the local ollama server is throwing a 500.

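To illustrate the failure mode described above, here is a hypothetical defensive decode (a sketch, not Ollama's actual code) in which a client tries the expected JSON error shape first and falls back to reporting the raw body, so a plain-text upstream page surfaces directly instead of as a confusing unmarshal error. The `decodeOrRaw` helper and the `{"error": ...}` shape are assumptions for illustration:

```go
package main

import (
	"encoding/json"
	"fmt"
	"io"
	"net/http"
	"net/http/httptest"
	"strings"
)

// decodeOrRaw is a hypothetical helper: try to decode a JSON error body,
// and fall back to the raw text when the upstream returned non-JSON.
func decodeOrRaw(status int, body []byte) error {
	var e struct {
		Error string `json:"error"`
	}
	if json.Unmarshal(body, &e) == nil && e.Error != "" {
		return fmt.Errorf("%d: %s", status, e.Error)
	}
	// Not JSON (e.g. a plain-text "Internal Server Error" page):
	// report the raw body instead of an unmarshal error.
	return fmt.Errorf("%d: %s", status, strings.TrimSpace(string(body)))
}

func main() {
	// Simulate a backend that fails with a plain-text 500, as described above.
	srv := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		http.Error(w, "Internal Server Error", http.StatusInternalServerError)
	}))
	defer srv.Close()

	resp, err := http.Get(srv.URL)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Println(decodeOrRaw(resp.StatusCode, body))
	// prints: 500: Internal Server Error
}
```

With this shape, the user-visible message names the upstream backend failure rather than the client-side JSON parse.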
Author
Owner

@jmorganca commented on GitHub (Nov 10, 2025):

It should be fixed now, thanks @rick-github for all the help and everyone who reported! Let me know if you're still seeing the issue

Author
Owner

@kiril1976 commented on GitHub (Nov 11, 2025):

Error: `500 Internal Server Error: unmarshal: invalid character 'I' looking for beginning of value`. I still get this error, and I don't know whether something is wrong with the GUI or with the macOS Tahoe my MacBook runs.

Author
Owner

@simmorsal commented on GitHub (Nov 11, 2025):

Getting this error when trying to upload an image using Open WebUI. Text works ok.

Author
Owner

@rouk750 commented on GitHub (Nov 11, 2025):

I'm facing the same error with several cloud models; the latest failure occurred with `gpt-oss:120b-cloud`.
The same tests worked correctly with models running locally.

Author
Owner

@kiril1976 commented on GitHub (Nov 11, 2025):

The problem is with the Ollama GUI. I tried all the models with Docker and Open WebUI and they all work fine, so I will use them with Open WebUI. But the Ollama GUI needs to be fixed.

Author
Owner

@rouk750 commented on GitHub (Nov 11, 2025):

> The problem is with the Ollama GUI. I tried all the models with Docker and Open WebUI and they all work fine, so I will use them with Open WebUI. But the Ollama GUI needs to be fixed.

Not using the GUI on my side. I am using LangChain to consume the model, exposing Ollama locally through `ollama serve`.

Author
Owner

@xNeo-git commented on GitHub (Nov 11, 2025):

In my case, it was solved for `kimi-k2-thinking:cloud`, but I get the same error with `kimi-k2:1t-cloud`.

Thanks!

Reference: github-starred/ollama#70678