[GH-ISSUE #9506] Ollama errors on older versions of Linux/GLIBC on 0.5.13 #68250

Open
opened 2026-05-04 12:59:58 -05:00 by GiteaMirror · 39 comments
Owner

Originally created by @scomper on GitHub (Mar 5, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9506

What is the issue?

After updating to Ollama 0.5.13, running it on CentOS Linux release 7.9.2009 (Core) results in the following errors:

```
ollama: /lib64/libm.so.6: version `GLIBC_2.27' not found (required by ollama)
ollama: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.25' not found (required by ollama)
ollama: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by ollama)
ollama: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by ollama)
ollama: /lib64/libstdc++.so.6: version `CXXABI_1.3.11' not found (required by ollama)
ollama: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by ollama)
ollama: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.22' not found (required by ollama)
```

This indicates that the current system's glibc and libstdc++ versions are too low to meet Ollama's dependencies. Could you please provide guidance on how to resolve this issue or consider adding support for older Linux distributions like CentOS 7?
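The exact symbol versions the binary demands can be read straight off the loader errors; a minimal sketch of summarizing them, assuming the errors above were saved to a file (`errors.log` is a hypothetical name):

```shell
# Summarize the distinct GLIBC/GLIBCXX/CXXABI versions the binary requires,
# parsed from saved loader error output (errors.log is illustrative).
cat > errors.log <<'EOF'
ollama: /lib64/libm.so.6: version `GLIBC_2.27' not found (required by ollama)
ollama: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.25' not found (required by ollama)
EOF
grep -oE '(GLIBC|GLIBCXX|CXXABI)_[0-9.]+' errors.log | sort -u
# For reference, CentOS 7 ships glibc 2.17, below the GLIBC_2.27 floor above.
```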

Relevant log output


OS

CentOS Linux release 7.9.2009 (Core)

GPU

No response

CPU

No response

Ollama version

0.5.13

GiteaMirror added the bug label 2026-05-04 12:59:58 -05:00

@lfandrh commented on GitHub (Mar 5, 2025):

I encountered the same problem. It seems the glibc version does not meet the requirement (2.27 is needed), but upgrading it is complicated and can destabilize the system.


@leslie2046 commented on GitHub (Mar 5, 2025):

+1


@jmorganca commented on GitHub (Mar 5, 2025):

Hi folks, so sorry about the error. 0.5.13 requires a newer version of Linux/glibc (starting with CentOS/RHEL 8) – we'll see if we can revisit this to lower the requirements.


@jmorganca commented on GitHub (Mar 5, 2025):

In the meantime the previous version can be downloaded with `curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.5.12 sh`


@DirtyKnightForVi commented on GitHub (Mar 6, 2025):

> In the meantime the previous version can be downloaded with `curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.5.12 sh`

Same errors, but I had to go back to 0.5.10, so the cause may not be the Linux/glibc version.


@Yaunghow commented on GitHub (Mar 7, 2025):

I have met the same problem.


@duffybelfield commented on GitHub (Mar 7, 2025):

+1


@Yaunghow commented on GitHub (Mar 7, 2025):

Hey, I found an alternative way to solve this problem: directly pull a Docker image of Ollama, without needing to change any system environment configuration. You can find more details at https://ollama.com/blog/ollama-is-now-available-as-an-official-docker-image. Hope this helps!
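A sketch of that Docker route, pinning a pre-0.5.13 tag so the container's bundled glibc is used instead of the host's (the container name and port mapping are illustrative):

```shell
# Run Ollama from the official image; the container ships its own
# glibc/libstdc++, sidestepping the CentOS 7 host libraries.
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama:0.5.12
```

Note that later comments in this thread report the error persisting inside some containers, so this may not work on every setup.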


@duffybelfield commented on GitHub (Mar 7, 2025):

It's the same issue on the `latest` tag; I'm using the 0.5.12 container.


@brushknight commented on GitHub (Mar 10, 2025):

I have a similar issue with the latest ollama Docker image:

```
docker run  --rm -it --runtime=nvidia --gpus all ollama/ollama
Couldn't find '/root/.ollama/id_ed25519'. Generating new private key.
Your new public key is:

ssh-ed25519 Redacted

2025/03/10 11:59:12 routes.go:1215: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:2048 OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/root/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
time=2025-03-10T11:59:12.733Z level=INFO source=images.go:432 msg="total blobs: 0"
time=2025-03-10T11:59:12.733Z level=INFO source=images.go:439 msg="total unused blobs removed: 0"
time=2025-03-10T11:59:12.734Z level=INFO source=routes.go:1277 msg="Listening on [::]:11434 (version 0.5.13)"
time=2025-03-10T11:59:12.734Z level=INFO source=gpu.go:217 msg="looking for compatible GPUs"
time=2025-03-10T11:59:12.736Z level=INFO source=gpu.go:612 msg="Unable to load cudart library /usr/lib/aarch64-linux-gnu/nvidia/libcuda.so.1.1: Unable to load /usr/lib/aarch64-linux-gnu/nvidia/libcuda.so.1.1 library to query for Nvidia GPUs: /usr/lib/aarch64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /usr/lib/aarch64-linux-gnu/nvidia/libnvrm_gpu.so)"
time=2025-03-10T11:59:12.738Z level=INFO source=gpu.go:377 msg="no compatible GPUs were discovered"
time=2025-03-10T11:59:12.738Z level=INFO source=types.go:130 msg="inference compute" id=0 library=cpu variant="" compute="" driver=0.0 name="" total="61.4 GiB" available="49.2 GiB"
```

Here is the error line:

```
time=2025-03-10T11:59:12.736Z level=INFO source=gpu.go:612 msg="Unable to load cudart library /usr/lib/aarch64-linux-gnu/nvidia/libcuda.so.1.1: Unable to load /usr/lib/aarch64-linux-gnu/nvidia/libcuda.so.1.1 library to query for Nvidia GPUs: /usr/lib/aarch64-linux-gnu/libc.so.6: version `GLIBC_2.34' not found (required by /usr/lib/aarch64-linux-gnu/nvidia/libnvrm_gpu.so)"
```

Here is the host glibc:

```
ldd --version
ldd (Ubuntu GLIBC 2.35-0ubuntu3) 2.35
Copyright (C) 2022 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
Written by Roland McGrath and Ulrich Drepper.
```

Running Ubuntu 22.04 with an Orin AGX:

```
cat /etc/os-release
PRETTY_NAME="Ubuntu 22.04 LTS"
NAME="Ubuntu"
VERSION_ID="22.04"
VERSION="22.04 (Jammy Jellyfish)"
VERSION_CODENAME=jammy
ID=ubuntu
ID_LIKE=debian
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
UBUNTU_CODENAME=jammy
```

Do you have any recommendations for resolving this issue without changing the host environment? Should we build the container from source on that machine, or is there some other workaround? Thank you in advance!

Updated:
The last image that works fine is [0.5.7 for arm64](https://hub.docker.com/layers/ollama/ollama/0.5.7/images/sha256-0c95395d59f7810f9cd6a547839dd5d692eca99d698c9b582ddd860f1af19fed)


@december-soul commented on GitHub (Mar 14, 2025):

+1


@myoldcat commented on GitHub (Mar 17, 2025):

+1


@itshaungmu commented on GitHub (Mar 17, 2025):

+1
```
ldd (GNU libc) 2.28
/lib64/libstdc++.so.6: version `GLIBCXX_3.4.25' not found
```


@Decarryee commented on GitHub (Mar 18, 2025):

+1


@s7lx commented on GitHub (Mar 19, 2025):

+1

```
ollama: /lib64/libm.so.6: version `GLIBC_2.27' not found (required by ollama)
ollama: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.25' not found (required by ollama)
ollama: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by ollama)
ollama: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by ollama)
ollama: /lib64/libstdc++.so.6: version `CXXABI_1.3.11' not found (required by ollama)
ollama: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by ollama)
ollama: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.22' not found (required by ollama)
```

CentOS Linux release 7.9.2009 (Core)


@abbaskss commented on GitHub (Mar 19, 2025):

+1


@liaozusheng commented on GitHub (Mar 19, 2025):

+1


@thyarles commented on GitHub (Mar 20, 2025):

+1


@xiaoshandegithub commented on GitHub (Mar 24, 2025):

+1


@HappyClint commented on GitHub (Mar 24, 2025):

+1


@Siri-2001 commented on GitHub (Mar 25, 2025):

+1


@Yamcanda commented on GitHub (Mar 26, 2025):

+1


@raghav45p commented on GitHub (Mar 26, 2025):

Got the same issue; it's complicated and risky to update glibc. Please try to lower the requirements.


@wangjiawen2013 commented on GitHub (Mar 27, 2025):

+1


@opser-gavin commented on GitHub (Mar 27, 2025):

+1


@anzhexe commented on GitHub (Mar 27, 2025):

+1


@weim-mkt commented on GitHub (Mar 31, 2025):

+1


@divyeshgaur93 commented on GitHub (Mar 31, 2025):

+1


@psocik commented on GitHub (Apr 7, 2025):

@jmorganca same issue in #9503


@BilibalaX commented on GitHub (Apr 20, 2025):

Same issue, and updating glibc is complex and risky. Could you lower the requirements? Also, using version 0.5.12 means Gemma 3 is not available. Is there a way to bypass this?

```
[ ]$ ollama run hf.co/unsloth/gemma-3-27b-it-GGUF:Q8_0
Error: llama runner process has terminated: this model is not supported by your version of Ollama. You may need to upgrade
```

@chuangching commented on GitHub (Apr 20, 2025):

This is my way to resolve the issue. My OS is RHEL 7.9:

  1. Download the latest binutils, gmp, make, and mpc source packages; set the prefix to an independent path such as "/home/ollama/tools" when configuring and compiling.
  2. Set LD_LIBRARY_PATH and PATH to the independent path before compiling and installing gcc and glibc.
  3. Download and install gcc to the independent path.
  4. Download and install glibc to the independent path.
  5. Download patchelf.
  6. Add `Environment="LD_LIBRARY_PATH=/home/ollama/tools/lib:/home/ollama/tools/lib64"` in /etc/systemd/system/ollama.service.
  7. Run `patchelf --set-interpreter /home/ollama/tools/lib/ld-linux-x86-64.so.2 /path/to/ollama`.
  8. Run `systemctl daemon-reload` and `systemctl restart ollama`.
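Steps 6–8 above can be sketched as follows. This is a variation that uses a systemd drop-in instead of editing the unit file directly; the drop-in is written to a local directory here for illustration, whereas the real path would be `/etc/systemd/system/ollama.service.d/override.conf`:

```shell
# Variation on steps 6-8: a systemd drop-in rather than editing the unit.
# TOOLS is the independent prefix from the steps above.
TOOLS=/home/ollama/tools
mkdir -p ollama.service.d   # stands in for /etc/systemd/system/ollama.service.d
cat > ollama.service.d/override.conf <<EOF
[Service]
Environment="LD_LIBRARY_PATH=${TOOLS}/lib:${TOOLS}/lib64"
EOF
cat ollama.service.d/override.conf
# Then point the binary at the newer dynamic loader and restart:
#   patchelf --set-interpreter ${TOOLS}/lib/ld-linux-x86-64.so.2 /path/to/ollama
#   sudo systemctl daemon-reload && sudo systemctl restart ollama
```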

@DecentMakeover commented on GitHub (Apr 24, 2025):

+1


@ViViCodeMania commented on GitHub (Apr 29, 2025):

+1


@tcchristianson commented on GitHub (Apr 29, 2025):

+1


@garychanchan commented on GitHub (Apr 29, 2025):

+1


@1305682748 commented on GitHub (Apr 30, 2025):

+1


@Sunjoe29 commented on GitHub (May 6, 2025):

+1


@wxzwxz131 commented on GitHub (May 7, 2025):

+1


@AlivenYang commented on GitHub (May 9, 2025):

+1


Reference: github-starred/ollama#68250