[GH-ISSUE #717] Change system model when running as a service #46841

Closed
opened 2026-04-28 00:46:26 -05:00 by GiteaMirror · 4 comments

Originally created by @wifiuk on GitHub (Oct 6, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/717

If I was originally messing around with Llama 7B and got it running as a background service, how do I change the model it uses?


@BruceMacD commented on GitHub (Oct 6, 2023):

Thanks for the question.

Ollama orchestrates which model is loaded based on the model requested in your prompt.

In other words, all you have to do is send a new request to Ollama; if the requested model has changed, the currently running model is unloaded and the new one is loaded in.

Ex:

```
$ ollama run llama2 hello
# llama2 is loaded and used to generate a response

$ ollama run mistral hello
# llama2 is unloaded, mistral is loaded and used to generate a response
```
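The same holds when talking to the service's HTTP API directly instead of through the CLI. A minimal sketch using the `/api/generate` endpoint; port 11434 is Ollama's default, while the unit file later in this thread uses 11435, so adjust to your `OLLAMA_HOST`:

```shell
# The request body is the only place a model is named; nothing in the
# service's configuration has to change to switch models.
body='{"model": "mistral", "prompt": "hello", "stream": false}'

# Sending this to a running service would unload the previous model and
# load mistral (commented out so the sketch runs without a live server):
#   curl -s http://127.0.0.1:11434/api/generate -d "$body"
echo "$body"
```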

@wifiuk commented on GitHub (Oct 6, 2023):

I mean as the system service, not just in the terminal.


@BruceMacD commented on GitHub (Oct 6, 2023):

No problem, the system service is what actually manages the running model rather than the CLI tool. The CLI tool is just an interface for the system service.
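One way to see this relationship: the CLI honors the `OLLAMA_HOST` environment variable, so pointing it at the service's address makes every `ollama run` a request to that service. A sketch, with the address taken from the unit file later in this thread (substitute your own):

```shell
# The CLI is only a client for the service; OLLAMA_HOST tells it which
# service to talk to.
export OLLAMA_HOST=0.0.0.0:11435

# Any model named on the command line is then loaded by the service at
# that address (commented out so the sketch runs without ollama installed):
#   ollama run mistral hello
echo "OLLAMA_HOST=$OLLAMA_HOST"
```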


@wifiuk commented on GitHub (Oct 6, 2023):

I don't see where to mention the model in the ollama.service file:

`/etc/systemd/system/ollama.service`:

```
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="HOME=/usr/share/ollama"
Environment="PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/snap/bin"
Environment="OLLAMA_HOST=0.0.0.0:11435"
Environment="OLLAMA_ORIGINS=http://x.x.x.x:*"

[Install]
WantedBy=default.target
```
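There is indeed no model setting in `ollama.service`: per the earlier comments, the service loads whichever model each request names. If the goal is to have a particular model warmed up at boot, one possible approach (not from this thread; the unit name, curl path, port, and model below are all illustrative) is a companion oneshot unit that sends a request once the service is up:

```ini
# Hypothetical /etc/systemd/system/ollama-preload.service
[Unit]
Description=Preload a model into Ollama
After=ollama.service
Requires=ollama.service

[Service]
Type=oneshot
# Port 11435 matches the OLLAMA_HOST in the unit above; "llama2" is an
# example model name. The service may need a moment to start listening,
# so a retry loop or Restart=on-failure may be needed in practice.
ExecStart=/usr/bin/curl -s http://127.0.0.1:11435/api/generate -d '{"model": "llama2"}'

[Install]
WantedBy=default.target
```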


Reference: github-starred/ollama#46841