[GH-ISSUE #8344] Function/Tool Call doesn't always work, is it memory dependent? #51860

Closed
opened 2026-04-28 21:04:44 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @samcov on GitHub (Jan 8, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/8344

What is the issue?

This may not be a bug in the software but rather due to low memory on my machine, so I'm reporting it in the hope that someone knows about or has experienced the same thing.

  1. My machine is an i7 with 16 GB RAM, and I have Ollama running on top of Windows.
  2. I have a lot of other things going, like Visual Studio, two browsers & other things.
  3. I did just order a 64 GB machine with an NVIDIA graphics card, due to arrive this week, so if it's memory, that will solve it.
  4. I'm using Microsoft's new AI Extensions, which are not yet released, so there's that.

The question is this: Will low memory cause functions to not be called? They appear to get called when there is enough memory.

OS

Windows

GPU

No response

CPU

Intel

Ollama version

0.5.4

GiteaMirror added the bug label 2026-04-28 21:04:44 -05:00
Author
Owner

@rick-github commented on GitHub (Jan 8, 2025):

Will low memory cause functions to not be called?

Unlikely. If the model has been loaded, inference and tool generation should both perform the same. If there's a lack of RAM and a runner dies from OOM, the client will get an error, not a failed tool call. Server logs (https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) might help with debugging.
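One way to see this distinction from the client side is to inspect the `/api/chat` response: a successful tool call shows up in `message.tool_calls`, a plain reply in `message.content`, and a dead runner as an error body. A minimal sketch (the `get_weather` tool name and the sample payloads are hypothetical, assuming Ollama's documented chat response shape):

```python
import json

# Sample of what /api/chat returns when the model decides to call a tool
# (shape follows Ollama's chat API docs; the tool itself is made up).
sample_tool_call = json.loads("""
{
  "model": "llama3.1",
  "message": {
    "role": "assistant",
    "content": "",
    "tool_calls": [
      {"function": {"name": "get_weather", "arguments": {"city": "Paris"}}}
    ]
  },
  "done": true
}
""")

def classify(resp: dict) -> str:
    """Label a chat response as an error, a tool call, or plain text."""
    if "error" in resp:
        return "error"  # e.g. runner killed by OOM -> error body, not a bad tool call
    calls = resp.get("message", {}).get("tool_calls")
    if calls:
        return "tool_call: " + calls[0]["function"]["name"]
    return "text"

print(classify(sample_tool_call))
print(classify({"error": "model runner has unexpectedly stopped"}))
print(classify({"message": {"role": "assistant", "content": "Hi!"}}))
```

If low memory were the culprit, you'd expect the third case ("error") rather than the model silently answering in text instead of calling the tool.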

Author
Owner

@samcov commented on GitHub (Jan 8, 2025):

Fantastic!

BTW, thanks for the server logs link, I'll use that ASAP!


Reference: github-starred/ollama#51860