[GH-ISSUE #689] no space left on device Error #314

Closed
opened 2026-04-12 09:51:47 -05:00 by GiteaMirror · 16 comments

Originally created by @daaniyaan on GitHub (Oct 3, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/689

I'm getting a "no space left on device" error even though I have enough space on my MacBook.
![CleanShot 2023-10-03 at 14 27 13](https://github.com/jmorganca/ollama/assets/31348710/e361dd12-db9b-4f38-ae02-32b98e2190fb)
![CleanShot 2023-10-03 at 14 27 54](https://github.com/jmorganca/ollama/assets/31348710/4f5638cb-e101-4162-a3ba-e446f841144b)
![CleanShot 2023-10-03 at 14 44 46@2x](https://github.com/jmorganca/ollama/assets/31348710/21d3432a-de08-4f18-b7e0-92900d3f25d5)


@daaniyaan commented on GitHub (Oct 3, 2023):

Shouldn't it download the models into `~/.ollama`?
Why does the first screenshot mention "/usr/share/ollama/.ollama/models/blobs"?
![CleanShot 2023-10-03 at 16 03 00](https://github.com/jmorganca/ollama/assets/31348710/1b294ab5-5b4a-40a3-8f79-7af57dc0550d)

![CleanShot 2023-10-03 at 15 28 06@2x](https://github.com/jmorganca/ollama/assets/31348710/c40cebce-dafa-49af-b2bc-d382dd5c3259)


@technovangelist commented on GitHub (Oct 3, 2023):

That's an error from your Mac. These models can be big. I often have to delete models I don't want in order to make room for more, and I have 4 TB.

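As a quick way to act on this advice, a minimal shell sketch for checking free disk space and how much the local model store is using. The path `~/.ollama/models` is an assumption (the usual default on macOS), not something confirmed in this thread:

```shell
# Check free space on the volume that holds your home directory.
df -h "$HOME"

# Show how much the local Ollama model store is using, if it exists.
# ~/.ollama/models is assumed to be the default macOS model path.
du -sh "$HOME/.ollama/models" 2>/dev/null || echo "no local model store found"
```

If the store is large, removing unused models (`ollama rm <model>`) frees the space back to the volume.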

@daaniyaan commented on GitHub (Oct 3, 2023):

> That's an error from your Mac. These models can be big. I often have to delete models I don't want to make room for more and I have 4 TB.

Weird, there is nothing to delete, and when I try to pull again the download resumes and then says there is not enough space.

![CleanShot 2023-10-03 at 17 26 15](https://github.com/jmorganca/ollama/assets/31348710/4fa320ad-eccb-4981-8eae-a2af9e9cc2b8)


@mxyng commented on GitHub (Oct 3, 2023):

Are you running ollama in a container? The model path `/usr/share/ollama` is a giveaway because it's used exclusively for Linux installs. If that's the case, Docker Desktop allocates a subset of total system disk space to the Linux VM hosting the container runtime. You can increase this in Docker Desktop settings.

If this isn't the case, please describe your installation environment and process.


@daaniyaan commented on GitHub (Oct 3, 2023):

> Are you running ollama in a container? The model path `/usr/share/ollama` is a giveaway because it's used exclusively for Linux installs. If that's the case, Docker Desktop allocates a subset of total system disk space to the Linux VM hosting the container runtime. You can increase this in Docker Desktop settings.
>
> If this isn't the case, please describe your installation environment and process.

I have Docker Desktop installed, but I didn't do anything related to Docker here, and it isn't even running.
My installation process was exactly as it should be: I downloaded the macOS version from the website, unzipped it, dragged the application into the Applications folder, opened it, granted permission for the install by entering my password, and then pulled the model via the curl command as I wrote in the previous comments.


@daaniyaan commented on GitHub (Oct 3, 2023):

Maybe the problem is related to curl: even though you're trying to pull the model on macOS, when you go through curl it tries to download into /usr/share?

Update: I freed up more space and re-installed the app.
I still get the same error!


@nimkar commented on GitHub (Oct 3, 2023):

Can you check what is listening on port 11434?

`lsof -i:11434`


@daaniyaan commented on GitHub (Oct 3, 2023):

> Can you check what is listening on port 11434?
>
> `lsof -i:11434`

![CleanShot 2023-10-04 at 00 19 23](https://github.com/jmorganca/ollama/assets/31348710/d7da1bff-2f52-4799-a79f-0d36d33df839)


@mxyng commented on GitHub (Oct 4, 2023):

Is it possible your Ollama is configured to use another instance? Can you run OLLAMA_HOST and paste the outputs?


@daaniyaan commented on GitHub (Oct 4, 2023):

I managed to fix this by cloning the source code and building from source,
and then using the proxy to download the model via `ollama pull mistral` instead of curl.
So the problem could have been either:

  • the ollama command wasn't respecting my proxy
  • there was a problem with downloading via curl
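For anyone hitting the same symptom, a hedged sketch of pointing the server at a proxy. The proxy URL is a placeholder, and exact behavior may vary by Ollama version and platform:

```shell
# On Linux (systemd service): add the proxy to the service environment
# via a drop-in override, then restart.
sudo systemctl edit ollama
#   [Service]
#   Environment="HTTPS_PROXY=http://proxy.example.com:3128"
sudo systemctl daemon-reload
sudo systemctl restart ollama

# On macOS (the desktop app): set the variable for launchd-started apps,
# then quit and relaunch the Ollama app.
launchctl setenv HTTPS_PROXY "http://proxy.example.com:3128"
```

Setting `HTTPS_PROXY` only in an interactive shell affects the CLI process, not the background server, which is one way a pull can ignore a proxy that curl happily uses.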

@mxyng commented on GitHub (Oct 11, 2023):

It was the first one, not using configured proxies, which is fixed in #743


@daaniyaan commented on GitHub (Oct 28, 2023):

> It was the first one, not using configured proxies, which is fixed in #743

Looks like it's still there.
![image](https://github.com/jmorganca/ollama/assets/31348710/6e4998e8-b8d3-48d7-bf72-b71ab37024c0)


@r0mer0 commented on GitHub (Mar 19, 2024):

I managed this issue by editing the `/etc/systemd/system/ollama.service` file and adding the line:

`Environment="OLLAMA_MODELS=/any-path-with-space/"`

Then I restarted the service:

```
sudo systemctl daemon-reload
sudo systemctl restart ollama.service
```
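A small follow-up sketch for verifying that the override actually reached the service. The model path is an example; `systemctl edit`, drop-in overrides, and `systemctl show` are standard systemd mechanisms:

```shell
# Prefer a drop-in override to editing the unit file directly; a drop-in
# survives package upgrades. `systemctl edit` creates it for you.
sudo systemctl edit ollama
#   [Service]
#   Environment="OLLAMA_MODELS=/mnt/bigdisk/ollama-models"   # example path

sudo systemctl daemon-reload
sudo systemctl restart ollama

# Confirm the variable is actually in the service's environment:
systemctl show ollama --property=Environment
```

If `OLLAMA_MODELS` does not appear in the `systemctl show` output, the service never saw the setting, which would explain the error persisting at the old path.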

@mingones commented on GitHub (Apr 23, 2024):

> I managed this issue by editing the `/etc/systemd/system/ollama.service` file and adding the line:
>
> `Environment="OLLAMA_MODELS=/any-path-with-space/"`
>
> Then I restarted the service:
>
> ```
> sudo systemctl daemon-reload
> sudo systemctl restart ollama.service
> ```

Hi, where does the new model storage path end up with this setup?


@potCookie commented on GitHub (Aug 2, 2024):

> > I managed this issue by editing the `/etc/systemd/system/ollama.service` file and adding the line:
> > `Environment="OLLAMA_MODELS=/any-path-with-space/"`
> > Then I restarted the service:
> >
> > ```
> > sudo systemctl daemon-reload
> > sudo systemctl restart ollama.service
> > ```
>
> Hi, where does the new model storage path end up with this setup?

I configured it this way, but on startup it still fails with the same error and the same "no space" path. Are there any additional settings needed?


@potCookie commented on GitHub (Aug 2, 2024):

> > > I managed this issue by editing the `/etc/systemd/system/ollama.service` file and adding the line:
> > > `Environment="OLLAMA_MODELS=/any-path-with-space/"`
> > > Then I restarted the service:
> > >
> > > ```
> > > sudo systemctl daemon-reload
> > > sudo systemctl restart ollama.service
> > > ```
> >
> > Hi, where does the new model storage path end up with this setup?
>
> I configured it this way, but on startup it still fails with the same error and the same "no space" path. Are there any additional settings needed?

In the end I solved it by expanding the /tmp space…

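If downloads fail even though the model path has room, it's worth checking the temp filesystem too, as this report suggests. A minimal sketch using plain `df`, nothing ollama-specific; the model path below is the Linux default mentioned earlier in the thread:

```shell
# A small tmpfs-backed /tmp can fill up during large downloads and produce
# "no space left on device" even when the model directory has plenty of room.
df -h /tmp

# Compare against the filesystem holding the model store (Linux default path;
# falls back to the root filesystem if that path doesn't exist).
df -h /usr/share/ollama 2>/dev/null || df -h /
```

If `/tmp` is a small tmpfs, enlarging it (or remounting it with a bigger size) matches the fix described above.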
Reference: github-starred/ollama#314