[GH-ISSUE #7754] 300+mb of ram while idle #67007

Closed
opened 2026-05-04 09:13:45 -05:00 by GiteaMirror · 2 comments

Originally created by @Omar-000 on GitHub (Nov 20, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7754

What is the issue?

![Screenshot_2024-11-20_05-47-48](https://github.com/user-attachments/assets/8e1cdf33-37f3-4d6e-b512-839336408283)

I made sure to stop all running models, and I also restarted my system.
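(Not part of the original report.) One way to verify idle memory usage like this is to check the server process's resident set size (RSS) with `ps`. The `rss_mb` helper below is a hypothetical sketch, assuming the server binary is named `ollama` and a Linux `ps` that reports RSS in kilobytes:

```shell
# Hypothetical helper (not from the issue): print a process's resident
# memory (RSS) in MB. On Linux, ps reports RSS in kilobytes.
rss_mb() {
  local kb
  kb=$(ps -o rss= -p "$1") || return 1
  echo $(( kb / 1024 ))
}

# e.g. measure the idle ollama server, if one is running:
# pid=$(pgrep -xo ollama) && echo "ollama RSS: $(rss_mb "$pid") MB"
```

A reading well above ~100 MB with no models loaded would match the behavior described above.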

OS

Linux

GPU

AMD, Intel

CPU

Intel

Ollama version

0.3.12

GiteaMirror added the bug label 2026-05-04 09:13:45 -05:00

@Coecoenut commented on GitHub (Nov 21, 2024):

Hi,
please try the current 0.4.2 release. On my machine, it only consumes 50 MB.


@dhiltgen commented on GitHub (Nov 21, 2024):

I'm anticipating RAM usage of the main ollama server to come down once #7499 merges and we're no longer carrying the runners as payloads inside the main binary.


Reference: github-starred/ollama#67007