[GH-ISSUE #1144] not found, try pulling it first #577

Closed
opened 2026-04-12 10:16:35 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @KadirErturk4r on GitHub (Nov 15, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1144

I am on Ubuntu 22.04 with ollama 0.1.9. If I run the app with

`ollama run mistral`

all works great.

However, when I run the following:

`OLLAMA_HOST=0.0.0.0:1234 ollama serve`

then `curl -X POST http://ipaddress:1234/api/generate -d '{"model": "mistral", "prompt": "Why is the sky blue?" }'`

returns:

{"error":"model 'mistral' not found, try pulling it first"}

Restarting the server, re-pulling the model, etc. did not help.

Any suggestions?
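For anyone debugging the same error, a quick diagnostic sketch (assuming the standard Linux package install, where the bundled service runs as a dedicated `ollama` user — both paths below are assumptions based on that layout):

```shell
# Which ollama processes are running, and as which user?
ps -eo user,pid,cmd | grep '[o]llama' || echo "no ollama process running"

# Which model directory actually holds the pulled models?
ls /usr/share/ollama/.ollama/models 2>/dev/null || echo "no service-user models"
ls "$HOME/.ollama/models" 2>/dev/null || echo "no per-user models"
```

If the first command shows a server owned by `ollama` while your `ollama serve` runs as your login user, the two processes are looking at different model directories.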

Author
Owner

@technovangelist commented on GitHub (Nov 15, 2023):

hi @KadirErturk4r

It looks like you are running as two different users. For the first command, `ollama run mistral`, `ollama serve` is already running as the `ollama` user. But then you launch `ollama serve` again as the user you logged in as.

The models have been installed for the server running as `ollama`, but when you run the server as yourself, it's looking at the `.ollama` directory in your home directory. And there isn't anything there.

So you have two options. You could copy the files from `/usr/share/ollama/.ollama/models` to the corresponding directory in your home directory, but this is going to get confusing pretty quickly.
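A sketch of the copy option, using the source path from the comment above; the per-user target path and the ownership fix-up are assumptions based on the default layout:

```shell
# Copy the service user's models into your own user's model directory.
mkdir -p "$HOME/.ollama/models"
sudo cp -r /usr/share/ollama/.ollama/models/. "$HOME/.ollama/models/"

# Make sure your login user owns the copied files.
sudo chown -R "$USER:$USER" "$HOME/.ollama/models"
```

Note this leaves two copies of every model on disk, which is part of why the FAQ approach below is preferable.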

Your better option is to follow the instructions for Linux here: https://github.com/jmorganca/ollama/blob/main/docs/faq.md. This will ensure you have a single server running, that it's using the correct directory, and that you don't have duplicate models installed.
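If the install used the systemd service (as the Linux install script sets up), the FAQ's approach amounts to pointing the existing service at the desired address instead of starting a second server. A minimal sketch; the unit name `ollama.service` and the `Environment` override are assumptions based on the standard Linux install:

```shell
# Override the existing service's environment instead of running a second server.
sudo systemctl edit ollama.service
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0:1234"

# Then reload and restart the single, shared server:
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

The existing `ollama` service user keeps serving its already-pulled models, now reachable on the network port.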

Make sense??

Author
Owner

@KadirErturk4r commented on GitHub (Nov 16, 2023):

Thanks @technovangelist. It is a bit confusing, but I am able to manage. I will close the issue.


Reference: github-starred/ollama#577