[GH-ISSUE #885] Add Parameter Environment="OLLAMA_HOST=127.0.0.1:11434" to the ollama.service file #427

Closed
opened 2026-04-12 10:05:05 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @byteconcepts on GitHub (Oct 23, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/885

For those who prefer to use ollama primarily via its API, it would be nice if the ollama.service file already contained the line...
Environment="OLLAMA_HOST=127.0.0.1:11434"

Additionally, it would be nice if the Readme mentioned that the service's interface IP and port can be changed in this file.

For access on all interfaces on port 4711, change it to:
Environment="OLLAMA_HOST=0.0.0.0:4711"
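Rather than editing the installed unit file directly (which package updates may overwrite), the same Environment line can go into a systemd drop-in. A minimal sketch, assuming ollama runs as a systemd service named "ollama"; the override path follows the standard systemd drop-in convention:

```shell
# Create a drop-in that overrides only the service environment
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/override.conf <<'EOF'
[Service]
Environment="OLLAMA_HOST=0.0.0.0:4711"
EOF

# Reload unit definitions and restart the service to apply
sudo systemctl daemon-reload
sudo systemctl restart ollama
```

`sudo systemctl edit ollama` achieves the same thing interactively, opening an editor on that override file.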

If you then want to use the console command, you need to invoke it like this:
OLLAMA_HOST="[REAL-INTERFACE-IP-ADDRESS]:4711" ollama show llama2-uncensored --modelfile

As shortcuts, you may then add aliases to your ~/.bash_aliases, for example:

alias ollama-run='OLLAMA_HOST="[REAL-INTERFACE-IP-ADDRESS or localhost]:4711" ollama run'
alias ollama-list='OLLAMA_HOST="[REAL-INTERFACE-IP-ADDRESS or localhost]:4711" ollama list'
alias ollama-show='OLLAMA_HOST="[REAL-INTERFACE-IP-ADDRESS or localhost]:4711" ollama show'
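Instead of one alias per subcommand, a single wrapper function covers them all. A minimal sketch; the function name `o` and the variable `OHOST` are illustrative choices, not part of ollama:

```shell
# Wrapper: run any ollama subcommand against a non-default host/port.
OHOST="127.0.0.1:4711"
o() {
    # OLLAMA_HOST is set only for this one invocation
    OLLAMA_HOST="$OHOST" ollama "$@"
}

# usage:
#   o run llama2-uncensored
#   o list
#   o show llama2-uncensored --modelfile
```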

After adding the aliases, log out and back in, or run...
source ~/.bash_aliases

Additionally, if you want to run, for example, chatbot-ollama on the same machine as ollama, but make chatbot-ollama available to the whole network, you can start it with this command:
OLLAMA_HOST="http://127.0.0.1:4711" npm run dev -- -H [REAL-INTERFACE-IP-ADDRESS-OF-CHATBOT-OLLAMA]

If you do that on a publicly accessible machine, be sure to block the chatbot-ollama service/website from the public, or use e.g. Apache or nginx as a reverse proxy to protect it with username/password authentication.
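One way to do the latter with nginx and basic auth, sketched under assumptions: the site name, credential file path, and the backend port 3000 (a common Next.js dev default for chatbot-ollama) are illustrative, not confirmed by the original post:

```shell
# Create a password file (htpasswd comes from the apache2-utils package)
sudo htpasswd -c /etc/nginx/.htpasswd myuser

# Proxy site: nginx answers publicly, chatbot-ollama stays on localhost
sudo tee /etc/nginx/sites-available/chatbot <<'EOF'
server {
    listen 80;
    server_name _;
    location / {
        auth_basic           "Restricted";
        auth_basic_user_file /etc/nginx/.htpasswd;
        proxy_pass           http://127.0.0.1:3000;
        proxy_set_header     Host $host;
    }
}
EOF

sudo ln -s /etc/nginx/sites-available/chatbot /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx
```

With this in place, a host firewall can additionally block the backend port so only nginx is reachable from outside.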

Reference: github-starred/ollama#427