[GH-ISSUE #876] Linux: In console session ollama can't answer /show requests #62459

Closed
opened 2026-05-03 09:02:39 -05:00 by GiteaMirror · 1 comment

Originally created by @byteconcepts on GitHub (Oct 22, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/876

On Linux, ollama runs as a system service, and its home directory is defined in its systemd service file.

Am I wrong? I thought that when I start an ollama console session on a Linux box, the console client is just a client that talks to the ollama system service via the API. Is this wrong?

It seems pretty strange to me, and makes absolutely no sense, that when I request information about the currently used model via...

/show modelfile

... ollama answers with...

error: couldn't get model
Error: stat /root/.ollama/models/manifests/registry.ollama.ai/library/llama2-uncensored/latest

...and then the client crashes.

(In this case, just for this demonstration, I started the client as user root, which no one would do normally.
If I start the client as a normal user, it's the same: ollama wants to look in the home directory of that user, which is also completely wrong.)
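The stat path in the error above suggests the client resolves its model store from the invoking user's home directory rather than asking the running service. A minimal sketch of that resolution, assuming the documented `OLLAMA_MODELS` environment variable as the override and `$HOME/.ollama/models` as the fallback:

```shell
# Sketch (assumption based on the error message above): the CLI looks for
# manifests under the invoking user's home unless OLLAMA_MODELS is set.
models_dir="${OLLAMA_MODELS:-$HOME/.ollama/models}"
echo "$models_dir"
# Run as root, this resolves to /root/.ollama/models -- exactly the path
# in the reported stat error, not the service user's model store.
```

This would explain why running the client as root or as a normal user fails the same way: each user gets a different, empty `~/.ollama`.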

If it is not the ollama system service but the client program itself that answers the /show commands, it should at least use the home directory of the ollama user. In that case, the installation should hint that anyone who wants to run the ollama console app must add their user to the ollama group.

Am I the only one who gets this error?


@BruceMacD commented on GitHub (Oct 23, 2023):

Thanks for opening this issue, this should be fixed in the next release via #778

In the meantime, adding the current user to the ollama group should serve as a workaround:

`usermod -aG ollama $USER`
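One caveat worth noting with the workaround above: supplementary group changes only take effect in a new login session. A hedged sketch for checking whether the change is active (`in_group` is a helper name of my own, not an ollama or system tool):

```shell
# Hypothetical helper: succeeds if the given user belongs to the given group.
in_group() { id -nG "$1" | tr ' ' '\n' | grep -qx "$2"; }

# After `sudo usermod -aG ollama $USER`, this reports whether the new
# membership is visible to the current shell yet.
if in_group "$(id -un)" ollama; then
  echo "ollama group active"
else
  echo "not yet active: log out and back in (or use newgrp ollama)"
fi
```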


Reference: github-starred/ollama#62459