[GH-ISSUE #13795] ZImagePipeline ignores OLLAMA_MODELS ? #34797

Closed
opened 2026-04-22 18:39:33 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @kamenik on GitHub (Jan 20, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/13795

What is the issue?

Ollama version
0.14.3-rc2

OS / Distro
Ubuntu 24.04 LTS (kernel 6.14.0-37-generic #37~24.04.1-Ubuntu)

GPU / Driver / CUDA
NVIDIA RTX PRO 6000 Blackwell
Driver Version: 580.95.05
CUDA Version: 13.0

I have the models path overridden in /etc/systemd/system/ollama.service.d/override.conf

Environment="OLLAMA_MODELS=/data/ollama-models"

Classic models work well, but x/z-image-turbo crashes because it tries to use the default path. The model itself appears to be pulled correctly into my overridden path.
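For reference, a drop-in like the one above can be created and applied as follows. This is a sketch, not taken from the original post: the `[Service]` section header (required for a valid systemd unit fragment) is assumed, and the reload/restart steps are the standard systemd sequence.

```shell
# Create the drop-in directory and the override (assumes the service is named "ollama")
sudo mkdir -p /etc/systemd/system/ollama.service.d
printf '[Service]\nEnvironment="OLLAMA_MODELS=/data/ollama-models"\n' \
  | sudo tee /etc/systemd/system/ollama.service.d/override.conf

# Pick up the new environment and restart the daemon
sudo systemctl daemon-reload
sudo systemctl restart ollama
```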

Relevant log output

ollama run x/z-image-turbo "cat"
Error: 500 Internal Server Error: image runner failed: Error: failed to load model: load manifest: read manifest: open /usr/share/ollama/.ollama/models/manifests/registry.ollama.ai/x/z-image-turbo/latest: no such file or directory (exit: exit status 1)

OS

Linux

GPU

Nvidia

CPU

AMD

Ollama version

0.14.3-rc2

GiteaMirror added the bug label 2026-04-22 18:39:33 -05:00

@next-n commented on GitHub (Jan 20, 2026):

I’m working on this — if no one else is already on it.

<!-- gh-comment-id:3773825330 -->

@next-n commented on GitHub (Jan 20, 2026):

PR opened: #13797

<!-- gh-comment-id:3774336583 -->
Reference: github-starred/ollama#34797