[GH-ISSUE #13727] Release v0.14.1 for Linux (amd64) contains binary version 0.13.5 #71059

Closed
opened 2026-05-04 23:52:33 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @enprstroman-del on GitHub (Jan 15, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/13727

What is the issue?

pto04@EPS-PTO-04:/tmp/test_ollama$ curl -L -H 'Cache-Control: no-cache' -o ollama-linux-amd64.tar.zst https://github.com/ollama/ollama/releases/download/v0.14.1/ollama-linux-amd64.tar.zst
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100 1696M  100 1696M    0     0  5474k      0  0:05:17  0:05:17 --:--:-- 6807k
pto04@EPS-PTO-04:/tmp/test_ollama$ mkdir -p /tmp/verify && cd /tmp/verify
pto04@EPS-PTO-04:/tmp/verify$ tar -I zstd -xvf ../ollama-linux-amd64.tar.zst
bin/ollama
bin/ollama-mlx
lib/ollama/cuda_v12/
lib/ollama/cuda_v12/libcudart.so.12.8.90
lib/ollama/cuda_v12/libcublasLt.so.12
lib/ollama/cuda_v12/libcublas.so.12.8.4.1
lib/ollama/cuda_v12/libcublas.so.12
lib/ollama/cuda_v12/libcublasLt.so.12.8.4.1
lib/ollama/cuda_v12/libggml-cuda.so
lib/ollama/cuda_v12/libcudart.so.12
lib/ollama/cuda_v13/
lib/ollama/cuda_v13/libcublas.so.13
lib/ollama/cuda_v13/libcublas.so.13.1.0.3
lib/ollama/cuda_v13/libcublasLt.so.13.1.0.3
lib/ollama/cuda_v13/libcublasLt.so.13
lib/ollama/cuda_v13/libcudart.so.13
lib/ollama/cuda_v13/libggml-cuda.so
lib/ollama/cuda_v13/libcudart.so.13.0.96
lib/ollama/libggml-base.so
lib/ollama/libggml-base.so.0
lib/ollama/libggml-base.so.0.0.0
lib/ollama/libggml-cpu-alderlake.so
lib/ollama/libggml-cpu-haswell.so
lib/ollama/libggml-cpu-icelake.so
lib/ollama/libggml-cpu-sandybridge.so
lib/ollama/libggml-cpu-skylakex.so
lib/ollama/libggml-cpu-sse42.so
lib/ollama/libggml-cpu-x64.so
lib/ollama/mlx_cuda_v13/
lib/ollama/mlx_cuda_v13/libcublas.so.13
lib/ollama/mlx_cuda_v13/libnvrtc.so.13.0.88
lib/ollama/mlx_cuda_v13/libcudnn_ops.so.9.17.1
lib/ollama/mlx_cuda_v13/libcudart.so
lib/ollama/mlx_cuda_v13/libcublas.so.13.1.0.3
lib/ollama/mlx_cuda_v13/libcudnn_engines_runtime_compiled.so.9.17.1
lib/ollama/mlx_cuda_v13/libcublasLt.so.13.1.0.3
lib/ollama/mlx_cuda_v13/libnccl.so.2.29.2
lib/ollama/mlx_cuda_v13/libmlxc.so
lib/ollama/mlx_cuda_v13/libmlx.so
lib/ollama/mlx_cuda_v13/libcudnn_engines_precompiled.so.9.17.1
lib/ollama/mlx_cuda_v13/libcublas.so
lib/ollama/mlx_cuda_v13/libcudnn_heuristic.so.9.17.1
lib/ollama/mlx_cuda_v13/libgfortran.so.5
lib/ollama/mlx_cuda_v13/libcudnn_heuristic.so.9
lib/ollama/mlx_cuda_v13/libnvrtc.so.13
lib/ollama/mlx_cuda_v13/libcublasLt.so.13
lib/ollama/mlx_cuda_v13/libcudnn_graph.so.9.17.1
lib/ollama/mlx_cuda_v13/libopenblas-r0.3.15.so
lib/ollama/mlx_cuda_v13/libcudnn_engines_runtime_compiled.so.9
lib/ollama/mlx_cuda_v13/libcudnn_adv.so.9.17.1
lib/ollama/mlx_cuda_v13/libcudnn_cnn.so.9
lib/ollama/mlx_cuda_v13/libcudart.so.13
lib/ollama/mlx_cuda_v13/libcudnn.so.9.17.1
lib/ollama/mlx_cuda_v13/libcudnn_graph.so.9
lib/ollama/mlx_cuda_v13/libnccl.so.2
lib/ollama/mlx_cuda_v13/libcudnn.so.9
lib/ollama/mlx_cuda_v13/libgfortran.so.5.0.0
lib/ollama/mlx_cuda_v13/libcudnn_engines_precompiled.so.9
lib/ollama/mlx_cuda_v13/libcudnn_cnn.so.9.17.1
lib/ollama/mlx_cuda_v13/libcudart.so.13.0.96
lib/ollama/mlx_cuda_v13/libcudnn_adv.so.9
lib/ollama/mlx_cuda_v13/libcudnn_ops.so.9
lib/ollama/mlx_cuda_v13/libopenblas.so.0
lib/ollama/vulkan/
lib/ollama/vulkan/libvulkan.so.1
lib/ollama/vulkan/libvulkan.so.1.4.321
lib/ollama/vulkan/libggml-vulkan.so
pto04@EPS-PTO-04:/tmp/verify$ ./bin/ollama --version
ollama version is 0.13.5
Warning: client version is 0.14.1
pto04@EPS-PTO-04:/tmp/verify$ sha256sum /tmp/verify/../ollama-linux-amd64.tar.zst
93729e94dd4bb60889a1a4987fc09f932a2bdadc6ecc30a6194afb384e48bd0b /tmp/verify/../ollama-linux-amd64.tar.zst
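
As a side note, a downloaded archive can be checked against the SHA-256 published alongside the release assets before unpacking. A minimal sketch (the `verify_sha256` helper and the demo file are illustrative, not part of the release tooling):

```shell
# Compare a file's SHA-256 against an expected value, e.g. one taken
# from the checksums published with the release.
verify_sha256() {
    file="$1"; expected="$2"
    actual="$(sha256sum "$file" | awk '{print $1}')"
    if [ "$actual" = "$expected" ]; then
        echo "OK: $file"
    else
        echo "MISMATCH: $file (got $actual)" >&2
        return 1
    fi
}

# Throwaway file so the helper can be exercised anywhere:
printf 'hello\n' > /tmp/demo.txt
verify_sha256 /tmp/demo.txt "$(sha256sum /tmp/demo.txt | awk '{print $1}')"
```

Note that a matching checksum only proves the download is intact; it cannot catch the case reported here, where the published archive itself contains the wrong binary.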

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-05-04 23:52:33 -05:00

@rick-github commented on GitHub (Jan 15, 2026):

The server you are running is 0.13.5. The bundle you just downloaded (the client) is 0.14.1.
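
For context, the version line the CLI prints comes from the server it can reach at OLLAMA_HOST (default 127.0.0.1:11434); the "client version" warning appears when the binary's own build differs from that server. The server can also be queried directly over HTTP via the `/api/version` endpoint (sketch; the fallback message is ours, not Ollama output):

```shell
# Ask the running server for its version; `ollama --version` reports
# this value, plus a warning when the client build differs from it.
curl -s --max-time 2 http://127.0.0.1:11434/api/version \
  || echo "no server listening on 127.0.0.1:11434"
```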


@YuenSzeHong commented on GitHub (Jan 16, 2026):

> The server you are running is 0.13.5. The bundle you just downloaded (the client) is 0.14.1.

I just pulled the latest image, and I can confirm that it contains version 0.13.5:

hong@YUEN-DESKTOP:~$ podman exec -it ollama bash -c 'ollama run x/z-image-turbo'
pulling manifest
Error: pull model manifest: 412:

The model you are attempting to pull requires a newer version of Ollama.

Please download the latest version at:

        https://ollama.com/download


hong@YUEN-DESKTOP:~$ podman exec -it ollama bash -c 'ollama -v'
ollama version is 0.13.5

@rick-github commented on GitHub (Jan 16, 2026):

If you installed the official image and both the client and the server report 0.13.5, you are running version 0.13.5.


@YuenSzeHong commented on GitHub (Jan 17, 2026):

> If you installed the official image and the client and the server return 0.13.5, you are running version 0.13.5.

Well, I pulled from ghcr, I pulled latest, and the client binary says it is 0.13.5.


@rick-github commented on GitHub (Jan 17, 2026):

Yes, if the client binary says 0.13.5 then you are running 0.13.5. Have you restarted the container?


@YuenSzeHong commented on GitHub (Jan 17, 2026):

> Yes, if the client binary says 0.13.5 then you are running 0.13.5. Have you restarted the container?

Probably podman didn't recreate the container.
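
That would explain it: `podman pull` only updates the locally stored image, while an existing container keeps running the image it was created from, so a pull alone never upgrades a running Ollama. A hedged sketch of recreating the container (the container name, volume, port, and image reference are illustrative and should match whatever the original `podman run` used):

```shell
# Hypothetical helper: pull the new image, then recreate the container
# so it actually runs that image. Adjust flags to your original setup;
# the reporter pulled from a ghcr.io mirror rather than docker.io.
recreate_ollama() {
    podman pull docker.io/ollama/ollama:latest &&
    podman rm -f ollama &&
    podman run -d --name ollama \
        -v ollama:/root/.ollama -p 11434:11434 \
        docker.io/ollama/ollama:latest
}

# Only attempt it where podman is actually installed.
if command -v podman >/dev/null 2>&1; then
    recreate_ollama
else
    echo "podman not found; commands shown for reference"
fi
```

The named volume keeps pulled models across the remove/recreate cycle; only the container filesystem (and with it the old binary) is discarded.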

Reference: github-starred/ollama#71059