[GH-ISSUE #6723] How to change the system memory folder ? #4235

Closed
opened 2026-04-12 15:10:13 -05:00 by GiteaMirror · 2 comments

Originally created by @mdabir1203 on GitHub (Sep 10, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6723

What is the issue?

I tried to run llama 3.1 with Ollama and I am getting this:
![image](https://github.com/user-attachments/assets/1c5903d2-fd77-434e-b00b-f39ab349aede)

I have enough disk space, so what are the reasons behind this?

OS

Windows

GPU

No response

CPU

Intel

Ollama version

0.3.9

GiteaMirror added the bug label 2026-04-12 15:10:13 -05:00

@rick-github commented on GitHub (Sep 10, 2024):

You don't have enough RAM to load the model. If you increase swap or terminate other programs you may have enough; you need 500M.
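A minimal sketch of the check suggested above: verify that roughly 500 MB of memory is actually free before loading the model. The 500M figure comes from the comment; the helper names are illustrative and not part of Ollama. This reads the Linux `/proc/meminfo` interface; on Windows (the reporter's OS) you would instead check free memory in Task Manager or adjust the page file size in System settings.

```python
REQUIRED_KB = 500 * 1024  # ~500 MB, in kB to match /proc/meminfo units

def available_kb(meminfo_text: str) -> int:
    """Parse the MemAvailable value (in kB) from /proc/meminfo content."""
    for line in meminfo_text.splitlines():
        if line.startswith("MemAvailable:"):
            return int(line.split()[1])
    raise ValueError("MemAvailable not found")

def enough_memory(meminfo_text: str, required_kb: int = REQUIRED_KB) -> bool:
    """True if at least required_kb of memory is currently available."""
    return available_kb(meminfo_text) >= required_kb

if __name__ == "__main__":
    with open("/proc/meminfo") as f:
        print("enough free memory:", enough_memory(f.read()))
```

If the check fails, freeing memory by closing other programs or enlarging swap (the page file on Windows) is what the comment recommends.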


@jmorganca commented on GitHub (Sep 12, 2024):

As @rick-github mentioned, there isn't enough memory or page file size for the model to run; it would most likely crash.


Reference: github-starred/ollama#4235