[GH-ISSUE #7260] Migrate off centos 7 for intermediate build layers in container image builds #4614

Closed
opened 2026-04-12 15:31:52 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @cazlo on GitHub (Oct 18, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7260

What

[Centos is dead](https://endoflife.date/centos), long live [centos stream (9)](https://endoflife.date/centos-stream)

Ollama should probably not be using centos 7 now that it is unsupported and at EOL.

Why

AMD and Nvidia are no longer publishing updates to their centos 7 flavor of dependencies.

See also https://rocm.docs.amd.com/en/docs-6.2.0/about/release-notes.html

ROCm 6.2.0 marks the end of support (EoS) for:
...
CentOS 7.9

See also https://docs.nvidia.com/cuda/cuda-installation-guide-linux/, which no longer lists CentOS anywhere.

See also that the last CUDA image Nvidia published for CentOS 7 is ~6 months old: https://hub.docker.com/r/nvidia/cuda/tags?name=centos

More info

Currently there are several intermediate build layers in the container image build which utilize centos 7:

- [cuda-11-build-amd64](https://github.com/ollama/ollama/blob/bf4018b9ecd56a5deff0c22ca2fba242a8f0101b/Dockerfile#L15)
- [cuda-12-build-amd64](https://github.com/ollama/ollama/blob/bf4018b9ecd56a5deff0c22ca2fba242a8f0101b/Dockerfile#L32)
- [rocm-build-amd64](https://github.com/ollama/ollama/blob/bf4018b9ecd56a5deff0c22ca2fba242a8f0101b/Dockerfile#L85)
- [cpu-builder-amd64](https://github.com/ollama/ollama/blob/bf4018b9ecd56a5deff0c22ca2fba242a8f0101b/Dockerfile#L101) - this one also has transitive layers which depend on it: `container-build-amd64`, `cpu-build-amd64`, `cpu_avx-build-amd64`, and `cpu_avx2-build-amd64`

Looking at the various Nvidia and AMD docs, it seems like both support the latest EL 9 release, so I would probably try to migrate to EL9 (rockylinux 9) to get the latest compatible versions of core dependencies like gcc, and also to avoid needing another migration for a long time (EL 9 EOL is still several years away).
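As a rough illustration of what the change per stage would look like, here is a minimal sketch; the specific CUDA version and image tags are examples for illustration, not the tags ollama actually pins:

```dockerfile
# Before: EL7-based intermediate layer (glibc 2.17, compilers via devtoolset)
# FROM nvidia/cuda:11.3.1-devel-centos7 AS cuda-11-build-amd64

# After: EL9-based layer (glibc 2.34, modern gcc available in the base repos)
FROM nvidia/cuda:12.4.1-devel-rockylinux9 AS cuda-12-build-amd64
RUN dnf install -y gcc gcc-c++ git cmake
```

The stage name and the rest of the multi-stage build would stay the same; only the base image and the package-manager invocations change.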

As a quick POC, I was able to migrate the rocm build to rocky8 with very little effort. In performance testing this build matched the current HEAD of ollama, though I did not run it through the full suite of unit tests.

GiteaMirror added the build, feature request labels 2026-04-12 15:31:53 -05:00
Author
Owner

@dhiltgen commented on GitHub (Oct 22, 2024):

We've been hesitant to move off Centos 7 for compatibility reasons. For our official linux binaries, the base image we use in the Dockerfile drives the glibc version the binary is linked against, so centos 7 has the broadest compatibility with glibc v2.17. Eventually we do anticipate we'll need to move to a newer base OS, but in doing so, the binary will no longer run on older systems.
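For context on the glibc floor being discussed, the constraint can be checked directly. This is a sketch: `/bin/ls` stands in for the ollama binary, and the second check needs binutils installed.

```shell
#!/bin/sh
# 1) Which glibc does the current system provide? A binary built on
#    CentOS 7 links against glibc 2.17 and runs anywhere this prints
#    2.17 or newer; an EL9-built binary requires 2.34+.
getconf GNU_LIBC_VERSION 2>/dev/null || ldd --version | head -n 1

# 2) What is the highest GLIBC_x.y symbol version a given binary
#    references? That is the oldest glibc it can run on.
if command -v objdump >/dev/null 2>&1; then
  objdump -T /bin/ls | grep -o 'GLIBC_[0-9.]*' | sort -uV | tail -n 1
fi
```

Running the second check against a release binary is a quick way to verify that a base-image change did not silently raise the glibc requirement.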

Author
Owner

@cazlo commented on GitHub (Oct 22, 2024):

For what it's worth, older systems have the option of running ollama through the container image, which should provide the desired compatibility, since the image ships with a known-compatible glibc (and any other shared libraries).

Author
Owner

@imweijh commented on GitHub (Feb 12, 2025):

For maximum binary compatibility, we can follow OpenJDK's approach: build on a new operating system against the sysroot of an older one:
https://wiki.openjdk.org/display/Build/Supported+Build+Platforms
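The sysroot approach sketched above would look roughly like this; `/opt/el7-sysroot` is a hypothetical path to an extracted EL7 root filesystem, not something ollama ships:

```shell
# Compile on a modern distro, but resolve headers and libraries from an
# older sysroot so the resulting binary keeps the old glibc floor.
# (gcc's --sysroot flag redirects both include and library search paths.)
gcc --sysroot=/opt/el7-sysroot -O2 -o ollama_helper main.c
```

This gives a modern compiler (and its newer optimizations and diagnostics) while still linking against glibc 2.17-era symbols, which is how OpenJDK's "devkit" builds stay runnable on old distributions.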

Author
Owner

@rick-github commented on GitHub (Feb 9, 2026):

#8211


Reference: github-starred/ollama#4614