[GH-ISSUE #11613] OLLAMA_HOST stopped working after the update to version 0.10.1. #7669

Open
opened 2026-04-12 19:45:49 -05:00 by GiteaMirror · 5 comments

Originally created by @bobyka on GitHub (Jul 31, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11613

What is the issue?

After updating to version 0.10.1 (Ubuntu 24.04, Ollama running as a systemd service), the setting Environment="OLLAMA_HOST=0.0.0.0:11434" no longer works as expected.
If OLLAMA_HOST is not set, the Ollama service listens on both 127.0.0.1:11434 and [::]:11434.
However, if OLLAMA_HOST is set to 0.0.0.0:11434, the service only listens on [::]:11434.
The previous version of Ollama handled this correctly.
Environment="OLLAMA_HOST=192.168.1.10:11434" works as expected.

[Unit]
Description=Ollama Service
After=network-online.target

[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
ExecStart=/usr/local/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=/home/milos/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin"

[Install]
WantedBy=default.target

Relevant log output

time=2025-07-31T22:13:32.350+02:00 level=INFO source=routes.go:1238 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:INFO OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/home/milos/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
time=2025-07-31T22:13:32.351+02:00 level=INFO source=images.go:476 msg="total blobs: 0"
time=2025-07-31T22:13:32.351+02:00 level=INFO source=images.go:483 msg="total unused blobs removed: 0"
time=2025-07-31T22:13:32.351+02:00 level=INFO source=routes.go:1291 msg="Listening on [::]:11434 (version 0.10.1)"
time=2025-07-31T22:13:32.351+02:00 level=INFO source=gpu.go:217 msg="looking for compatible GPUs"
time=2025-07-31T22:13:32.363+02:00 level=INFO source=amd_linux.go:386 msg="amdgpu is supported" gpu=GPU-503df7320caffe23 gpu_type=gfx1100
time=2025-07-31T22:13:32.363+02:00 level=INFO source=amd_linux.go:386 msg="amdgpu is supported" gpu=GPU-13aa0e314faa27ce gpu_type=gfx1100
time=2025-07-31T22:13:32.366+02:00 level=INFO source=types.go:130 msg="inference compute" id=GPU-503df7320caffe23 library=rocm variant="" compute=gfx1100 driver=6.12 name=1002:744c total="24.0 GiB" available="23.7 GiB"
time=2025-07-31T22:13:32.366+02:00 level=INFO source=types.go:130 msg="inference compute" id=GPU-13aa0e314faa27ce library=rocm variant="" compute=gfx1100 driver=6.12 name=1002:744c total="20.0 GiB" available="20.0 GiB"

OS

Linux

GPU

AMD

CPU

AMD

Ollama version

0.10.1

GiteaMirror added the bug label 2026-04-12 19:45:49 -05:00

@aaronpliu commented on GitHub (Aug 1, 2025):

downgrade to v0.10.0 first
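
For installs done with the official Linux install script, a specific version can usually be pinned by re-running the installer with OLLAMA_VERSION set (this is the approach described in the Ollama Linux docs; adjust the version string as needed):

# reinstall a pinned release, e.g. to roll back to 0.10.0
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.10.0 sh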


@SuperUserNameMan commented on GitHub (Aug 1, 2025):

Can't find the post, but I think I read someone official saying that this method of setting OLLAMA_HOST was deprecated in favor of the method described here: https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-linux
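
For reference, what that FAQ describes boils down to a systemd drop-in override rather than editing the shipped unit file (which can be replaced on upgrade). Roughly:

# open (or create) an override for the service
sudo systemctl edit ollama.service
# in the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0:11434"
# then reload and restart
sudo systemctl daemon-reload
sudo systemctl restart ollama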


@technovangelist commented on GitHub (Aug 1, 2025):

What 'new' way are you referring to? The link provided points to the standard way of setting env vars for ollama on linux since it was first available.


@SuperUserNameMan commented on GitHub (Aug 1, 2025):

> What 'new' way are you referring to? The link provided points to the standard way of setting env vars for ollama on linux since it was first available.

My memory is bad, sorry. I don't recall where, but when I started using Ollama (probably around version 0.5), I stumbled upon instructions somewhere in the Ollama GitHub repo that said to add Environment="OLLAMA_HOST=0.0.0.0" (or other env variables) to some Ollama config file, a file that was overwritten each time I updated Ollama to the latest version. I kept re-doing sudo nano /etc/whatever/dunowhat.file after every update, for several versions, until I recently stumbled upon another Ollama issue where someone was doing the same thing with OLLAMA_HOST, and another user pointed to the documentation I linked above, stating that it was the correct way to do this.

So, since the OP pasted the contents of the config file I kept sudo nano-ing ad nauseam, I guessed they must have stumbled upon the same misleading instructions as I did and missed the correct instructions the link points to.

EDIT: it is also possible that, back then, I only read https://github.com/ollama/ollama/blob/main/docs/linux.md and assumed that, since it was named "linux.md", it contained all the essential information about where to set env variables.

And regarding the issue reported by the OP: I don't have this problem with 0.10.1 on my local server.


@eranif commented on GitHub (Sep 9, 2025):

This issue still persists on Ubuntu. I tried downgrading across multiple versions, and it does not help. The issue I am facing:

OLLAMA_HOST=0.0.0.0:12345 ollama serve

only binds IPv6.

To work around it, I wrote a tiny proxy (https://github.com/eranif/tinyproxy) that binds on the given IP/port and forwards traffic to the local ollama service:

sudo service ollama start
RUST_LOG=warn ./tinyproxy --source 0.0.0.0:80 --destination 127.0.0.1:11434
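
One thing that may be worth checking on affected machines: on Linux, a [::] listener normally accepts IPv4 connections too as long as net.ipv6.bindv6only is 0 (the default), so the "Listening on [::]:11434" log line does not necessarily mean the server is unreachable over IPv4. A quick check (the address below is the LAN IP from the original report; substitute your own):

# 0 means IPv6 wildcard sockets are dual-stack and also accept IPv4
sysctl net.ipv6.bindv6only
# then try the plain-IPv4 address from another machine
curl -s http://192.168.1.10:11434/api/version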