[GH-ISSUE #14395] Make timeout configurable for local Ollama with Cline #71411

Open
opened 2026-05-05 01:33:26 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @cutlery on GitHub (Feb 24, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14395

What is the issue?

Hi,

The default timeout in Cline (CLI) is 30,000 milliseconds, i.e. 30 seconds, for any response from Ollama.
When running local models, especially with CPU-only inference, this is far too short.
I therefore suggest two modifications when running "ollama > launch > Cline":

  1. Add a configuration option for Cline, and perhaps other launched integrations, to set the timeout for any response (in seconds rather than milliseconds).
  2. Always add the line `"requestTimeoutMs": <timeout in milliseconds>,` to `~/.cline/data/globalState.json` (or perhaps a local configuration file).
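As an interim workaround, the key can be added to the file by hand. A minimal sketch of the resulting file, assuming a 10-minute timeout (the value 600000 is an illustrative choice, not a recommended default, and any other keys already in `globalState.json` must be preserved):

```json
{
  "requestTimeoutMs": 600000
}
```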

Thanks and regards,

Danai

OS

Debian Linux (testing)

GPU

None

CPU

Lots

Ollama version

0.16.3

GiteaMirror added the launch, bug labels 2026-05-05 01:33:26 -05:00

Reference: github-starred/ollama#71411