[GH-ISSUE #14742] lfm2.5-thinking cannot run tools #35291

Closed
opened 2026-04-22 19:41:38 -05:00 by GiteaMirror · 4 comments

Originally created by @yajo on GitHub (Mar 9, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/14742

What is the issue?

https://ollama.com/library/lfm2.5-thinking is tagged with "tools", but it can't run any tools when using `ollama launch opencode --model lfm2.5-thinking:latest`.

Try asking it to list available tools or to use them, and you'll see it never uses any tool.

Example session: https://opncd.ai/share/HqkDcBYa

Relevant log output

mar 09 16:14:49 iceland ollama[2732]: [GIN] 2026/03/09 - 16:14:49 | 200 |      18.756µs |       127.0.0.1 | HEAD     "/"
mar 09 16:14:49 iceland ollama[2732]: [GIN] 2026/03/09 - 16:14:49 | 200 |   33.444216ms |       127.0.0.1 | POST     "/api/show"
mar 09 16:14:50 iceland ollama[2732]: [GIN] 2026/03/09 - 16:14:50 | 200 |    40.21652ms |       127.0.0.1 | POST     "/api/show"
mar 09 16:14:50 iceland ollama[2732]: [GIN] 2026/03/09 - 16:14:50 | 404 |     105.404µs |       127.0.0.1 | POST     "/api/show"
mar 09 16:14:50 iceland ollama[2732]: [GIN] 2026/03/09 - 16:14:50 | 200 |   28.305417ms |       127.0.0.1 | POST     "/api/show"
mar 09 16:14:50 iceland ollama[2732]: [GIN] 2026/03/09 - 16:14:50 | 200 |   31.074005ms |       127.0.0.1 | POST     "/api/show"
mar 09 16:14:50 iceland ollama[2732]: [GIN] 2026/03/09 - 16:14:50 | 404 |     152.929µs |       127.0.0.1 | POST     "/api/show"
mar 09 16:14:50 iceland ollama[2732]: [GIN] 2026/03/09 - 16:14:50 | 200 |   33.954934ms |       127.0.0.1 | POST     "/api/show"
mar 09 16:15:24 iceland ollama[2732]: time=2026-03-09T16:15:24.397Z level=WARN source=vocabulary.go:49 msg="adding bos token to prompt which already has it" id=[1]
mar 09 16:15:35 iceland ollama[2732]: [GIN] 2026/03/09 - 16:15:35 | 200 | 11.156455353s |       127.0.0.1 | POST     "/v1/chat/completions"

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.17.4

GiteaMirror added the bug label 2026-04-22 19:41:38 -05:00

@rick-github commented on GitHub (Mar 9, 2026):

Did you increase the [context length](https://docs.ollama.com/integrations/opencode#:~:text=OpenCode%20requires%20a%20larger%20context%20window.%20It%20is%20recommended%20to%20use%20a%20context%20window%20of%20at%20least%2064k%20tokens.%20See%20Context%20length%20for%20more%20information.) of the model? OpenCode sends instructions and tools that account for about 11k tokens, so if your context buffer is too small, the tools will be truncated out of the prompt.

$ ollama run lfm2.5-thinking 
>>> /set parameter num_ctx 32768
Set parameter 'num_ctx' to '32768'
>>> /save lfm2.5-thinking:c32k
Created new model 'lfm2.5-thinking:c32k'
>>> 
$ ollama run lfm2.5-thinking:c32k '' ; ollama ps
NAME                    ID              SIZE      PROCESSOR    CONTEXT    UNTIL   
lfm2.5-thinking:c32k    a394506c5b82    1.4 GB    100% GPU     32768      Forever    
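The same override can also be applied per request through the `options` field of an Ollama `/api/chat` body, without saving a new model tag. A minimal sketch (it only builds the JSON payload; the model tag is the one from this thread, and a local server is assumed):

```python
import json

def chat_body(model: str, prompt: str, num_ctx: int = 32768) -> str:
    """Build an /api/chat request body with a per-request context window.

    "num_ctx" inside "options" overrides the model's default context
    length for this call only.
    """
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "options": {"num_ctx": num_ctx},
        "stream": False,
    }
    return json.dumps(body)

# Example payload, round-tripped through JSON for inspection:
payload = json.loads(chat_body("lfm2.5-thinking", "hello"))
```

This is equivalent in effect to the `/set parameter num_ctx` + `/save` approach above, but leaves the library tag untouched.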

@yajo commented on GitHub (Mar 9, 2026):

Yes:

ollama run lfm2.5-thinking:latest '' ; ollama ps

NAME                      ID              SIZE      PROCESSOR    CONTEXT    UNTIL              
lfm2.5-thinking:latest    95bd9d45385f    1.9 GB    100% GPU     64000      4 minutes from now

I did it with the `OLLAMA_CONTEXT_LENGTH` env variable.


@rick-github commented on GitHub (Mar 9, 2026):

Logs show that the model is getting the tool list, it's just very bad at tool use. This is not uncommon with small models. It uses tools if you give explicit directions ("write 'hello world' into a file called x.txt") but it just can't generalize. I suggest looking at larger models for more accurate tool use.
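One quick way to confirm this from the client side is to check the `tool_calls` field on the message returned by `/v1/chat/completions`: it is populated only when the model actually emitted a tool call. A minimal offline sketch (the response dicts are hypothetical stand-ins shaped like the OpenAI chat schema, not captured model output):

```python
def used_tools(response: dict) -> bool:
    """Return True if a chat-completion response contains any tool calls."""
    choices = response.get("choices") or [{}]
    message = choices[0].get("message", {})
    return bool(message.get("tool_calls"))

# Hypothetical stand-in responses for illustration:
plain = {"choices": [{"message": {"content": "Here are the tools I know..."}}]}
tooled = {"choices": [{"message": {"tool_calls": [
    {"type": "function",
     "function": {"name": "write",
                  "arguments": '{"path": "x.txt", "content": "hello world"}'}}]}}]}
```

With a small model like this, `plain`-shaped answers for open-ended prompts and `tooled`-shaped answers only for very explicit instructions are exactly the pattern described above.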


@yajo commented on GitHub (Mar 9, 2026):

Oh, correct. That prompt actually uses a tool. Then indeed it's too dumb... Thanks!

Reference: github-starred/ollama#35291