[GH-ISSUE #14542] Error 500 on calling ollama cloud models with tools #9434

Closed
opened 2026-04-12 22:21:31 -05:00 by GiteaMirror · 2 comments

Originally created by @Syntaxrabbit on GitHub (Mar 2, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14542

What is the issue?

Using the library (e.g. reproducible with the sample application) causes an error 500 if at least one tool is provided.

  • Local models work without any issues.
  • Cloud models run without any issues if there is no tool involved.
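For reference, the failing call can also be reproduced directly against the REST API, independent of Extensions.AI. The sketch below builds an `/api/chat` request with a single tool in Ollama's tool-calling format; the `GetWeather` parameter schema is an assumption (the sample app's exact schema isn't shown), and only the presence of a `tools` entry should matter for triggering the 500.

```python
import json

# Minimal /api/chat payload with one tool, in Ollama's tool-calling
# format. The GetWeather parameter schema is a guess at what the
# sample app registers; the "tools" entry is what triggers the error.
payload = {
    "model": "qwen3-vl:235b-cloud",
    "messages": [{"role": "user", "content": "How is the weather today?"}],
    "stream": False,
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "GetWeather",
                "description": "Gets the current weather for a given location.",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "location": {
                            "type": "string",
                            "description": "City name",
                        }
                    },
                    "required": ["location"],
                },
            },
        }
    ],
}

body = json.dumps(payload)
# POST this body to http://localhost:11434/api/chat (e.g. with curl or
# urllib.request); with a cloud model and this payload, the server
# currently responds with 500 instead of a tool call.
print(body[:40])
```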

── Extensions.AI ───────────────────────────────────────────────────────────────────────────────────────────────────────
Define a system prompt (optional)
>

You are talking to qwen3-vl:235b-cloud now.
When asked for the weather, population or the GPS coordinates for a city, it will try to use a predefined tool.
If any tool is used, the intended usage information is printed.
Enter /new to start over or /exit to leave.
Begin with [ to start multiline input. Submit it by ending with ].
Think mode is (null). Type /togglethink to change.
Enter /tools to list all available tools.

> /togglethink
Think mode is false.

> /togglethink
Think mode is true.

> /tools

Available tools:
GetWeather              Gets the current weather for a given location.
GetLatLon               Gets the latitude and longitude for a given location.
GetPopulation           Gets the amount of people living in a given city

You are talking to qwen3-vl:235b-cloud now.
When asked for the weather, population or the GPS coordinates for a city, it will try to use a predefined tool.
If any tool is used, the intended usage information is printed.
Enter /new to start over or /exit to leave.
Begin with [ to start multiline input. Submit it by ending with ].
Think mode is true. Type /togglethink to change.
Enter /tools to list all available tools.

> How is the weather today?
Response status code does not indicate success: 500 (Internal Server Error).

Relevant log output

[GIN] 2026/03/02 - 06:10:48 | 500 |    248.3771ms |             ::1 | POST     "/api/chat"

OS

Windows

GPU

No response

CPU

No response

Ollama version

0.17.4

GiteaMirror added the bug label 2026-04-12 22:21:31 -05:00

@cheryei commented on GitHub (Mar 2, 2026):

Where do I add skills? (original: 从哪里添加技能?)

Image: https://github.com/user-attachments/assets/56923bd0-a0a4-4132-8f1f-f1a24dd4653c

@Syntaxrabbit commented on GitHub (Mar 2, 2026):

Sorry, should be duplicate.

Reference: github-starred/ollama#9434