[GH-ISSUE #3740] ollama serve yields "'model' not found" error because of incorrect default OLLAMA_MODELS value on Linux #64341

Closed
opened 2026-05-03 17:11:08 -05:00 by GiteaMirror · 7 comments

Originally created by @cedricvidal on GitHub (Apr 19, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3740

What is the issue?

On Linux, the pull and run commands use the /usr/share/ollama/.ollama/models folder, but serve uses the OLLAMA_MODELS env var, which defaults to ~/.ollama/models per the help:

$ ollama serve --help
    OLLAMA_MODELS       The path to the models directory (default is "~/.ollama/models")

By default, the serve command will therefore not see any models and will yield an error such as the following:

'llama3:latest' not found, try pulling it first

A workaround is to set OLLAMA_MODELS to point to the /usr/share/ollama/.ollama/models directory like so:

export OLLAMA_MODELS=/usr/share/ollama/.ollama/models
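The mismatch can be sketched as a one-line resolution rule. This is an assumption based only on the help text above; resolve_models_dir is a hypothetical helper for illustration, not part of ollama:

```shell
# Hypothetical helper mirroring the default described in the help text:
# OLLAMA_MODELS wins if set, otherwise the per-user ~/.ollama/models.
resolve_models_dir() {
    printf '%s\n' "${OLLAMA_MODELS:-$HOME/.ollama/models}"
}

# The Linux installer stores models under the ollama system user instead,
# so pointing the variable there lines the server up with pull/run:
OLLAMA_MODELS=/usr/share/ollama/.ollama/models
resolve_models_dir    # prints /usr/share/ollama/.ollama/models
```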

OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.1.32

GiteaMirror added the bug label 2026-05-03 17:11:08 -05:00

@woshilaowanga commented on GitHub (May 14, 2024):

Same issue on Windows.


@pdevine commented on GitHub (May 14, 2024):

Hey @cedricvidal , the ollama pull and ollama run commands talk directly to the ollama server using the REST API and do not look for models on disk at all.

Did you by chance change the OLLAMA_MODELS environment variable after using pull or run? Are you running ollama through systemd or in some other way?

I'll go ahead and close the issue since it's running as intended, but please feel free to keep commenting.
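If the server is running under systemd (as with the standard Linux install), the environment variable has to be set on the service itself rather than in an interactive shell. A minimal sketch of making the workaround persistent, assuming the installer's standard ollama.service unit:

```shell
# Open a drop-in override for the service (assumes the standard
# ollama.service unit created by the Linux installer):
sudo systemctl edit ollama.service
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_MODELS=/usr/share/ollama/.ollama/models"
# Then reload unit files and restart so the server picks up the new path:
sudo systemctl daemon-reload
sudo systemctl restart ollama.service
```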


@woshilaowanga commented on GitHub (May 16, 2024):

I then tried pulling from the command line; the issue only occurs when pulling through the webui.


@pdevine commented on GitHub (May 16, 2024):

@woshilaowanga I'd suggest checking with the webui project; maybe they are doing things incorrectly.


@PranayShah commented on GitHub (Jun 1, 2024):

I ran into exactly the same issue on Linux, as you can see below:
<img width="1303" alt="Screenshot 2024-06-01 at 12 29 04 PM" src="https://github.com/ollama/ollama/assets/2013275/c18d2005-c281-4ce2-894e-cf20ccfae65a">

And setting OLLAMA_MODELS fixed it for me.


@EzeLLM commented on GitHub (Nov 22, 2024):

Setting OLLAMA_MODELS fixed it for me too.
Thanks.


@sourcesync commented on GitHub (Mar 19, 2026):

OLLAMA_MODELS works on Jetson Orin.
I run the following:

OLLAMA_HOST=0.0.0.0:1234 OLLAMA_MODELS=/usr/share/ollama/.ollama/models ollama serve

Reference: github-starred/ollama#64341