[GH-ISSUE #2285] EOF Error When Running A Model #1315

Closed
opened 2026-04-12 11:08:43 -05:00 by GiteaMirror · 10 comments

Originally created by @meminens on GitHub (Jan 31, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2285

Originally assigned to: @dhiltgen on GitHub.

Running the command `ollama run mistral` results in the error `Error: Post "http://127.0.0.1:11434/api/chat": EOF`.
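
The EOF on the client side is just the symptom: the server process dies while handling the POST, as the journal below shows. A minimal sketch of the same failure seen through the raw API (the payload is illustrative, not taken from the original report):

```
# Hypothetical direct call to the endpoint the CLI uses; once the serve
# process aborts mid-request, the dropped connection surfaces as an empty reply.
curl http://127.0.0.1:11434/api/chat \
  -d '{"model": "mistral", "messages": [{"role": "user", "content": "hello"}]}'
# curl: (52) Empty reply from server
```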

Output of `journalctl -u ollama`:

Jan 30 22:13:35 arch ollama[14727]: 2024/01/30 22:13:35 cpu_common.go:11: INFO CPU has AVX2 
Jan 30 22:13:35 arch ollama[14727]: 2024/01/30 22:13:35 dyn_ext_server.go:90: INFO Loading Dynamic llm server: /tmp/ollama519289987/rocm_v5/libext_server.so 
Jan 30 22:13:35 arch ollama[14727]: 2024/01/30 22:13:35 dyn_ext_server.go:145: INFO Initializing llama server 
Jan 30 22:13:35 arch ollama[14727]: free(): invalid pointer 
Jan 30 22:13:35 arch systemd[1]: ollama.service: Main process exited, code=dumped, status=6/ABRT 
Jan 30 22:13:35 arch systemd[1]: ollama.service: Failed with result 'core-dump'. 
Jan 30 22:13:35 arch systemd[1]: ollama.service: Consumed 17.709s CPU time. 
Jan 30 22:13:38 arch systemd[1]: ollama.service: Scheduled restart job, restart counter is at 1. 
Jan 30 22:13:38 arch systemd[1]: Started Ollama Service. 
Jan 30 22:13:38 arch ollama[14973]: 2024/01/30 22:13:38 images.go:857: INFO total blobs: 5 
Jan 30 22:13:38 arch ollama[14973]: 2024/01/30 22:13:38 images.go:864: INFO total unused blobs removed: 0 
Jan 30 22:13:38 arch ollama[14973]: 2024/01/30 22:13:38 routes.go:950: INFO Listening on 127.0.0.1:11434 (version 0.1.22) 
Jan 30 22:13:38 arch ollama[14973]: 2024/01/30 22:13:38 payload_common.go:106: INFO Extracting dynamic libraries... 
Jan 30 22:13:40 arch ollama[14973]: 2024/01/30 22:13:40 payload_common.go:145: INFO Dynamic LLM libraries [cpu_avx rocm_v6 cpu cuda_v11 cpu_avx2 rocm_v5] 
Jan 30 22:13:40 arch ollama[14973]: 2024/01/30 22:13:40 gpu.go:94: INFO Detecting GPU type 
Jan 30 22:13:40 arch ollama[14973]: 2024/01/30 22:13:40 gpu.go:236: INFO Searching for GPU management library libnvidia-ml.so 
Jan 30 22:13:40 arch ollama[14973]: 2024/01/30 22:13:40 gpu.go:282: INFO Discovered GPU libraries: [] 
Jan 30 22:13:40 arch ollama[14973]: 2024/01/30 22:13:40 gpu.go:236: INFO Searching for GPU management library librocm_smi64.so 
Jan 30 22:13:40 arch ollama[14973]: 2024/01/30 22:13:40 gpu.go:282: INFO Discovered GPU libraries: [/opt/rocm/lib/librocm_smi64.so.5.0] 
Jan 30 22:13:40 arch ollama[14973]: 2024/01/30 22:13:40 gpu.go:109: INFO Radeon GPU detected 

System info:

misaligar@arch
--------------
OS: Arch Linux x86_64
Host: B650 AORUS ELITE AX
Kernel: 6.7.2-arch1-1
Uptime: 28 mins
Packages: 1073 (pacman), 7 (flatpak)
Shell: bash 5.2.26
Resolution: 2560x1440
DE: Plasma 5.27.10
WM: kwin
Theme: [Plasma], Breeze [GTK2/3]
Icons: kora [Plasma], kora [GTK2/3]
Terminal: konsole
Terminal Font: Hack Nerd Font Mono 10
CPU: AMD Ryzen 9 7900X (24) @ 5.733GHz
GPU: AMD ATI Radeon RX 7900 XT/7900 XTX
Memory: 8687MiB / 63942MiB

I have installed ollama manually as per the instructions here: https://github.com/ollama/ollama/blob/main/docs/linux.md

This error started after I disabled the integrated GPU in the BIOS. With it enabled, there are no error messages, but ollama still does not use the discrete GPU (the 7900 XTX), even though all the required ROCm packages are installed.
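
A quick way to confirm which device and ISA target ROCm enumerates in each BIOS configuration is rocminfo; a sketch, assuming the standard ROCm tooling is installed (the gfx names in the comment are what these parts typically report, not taken from this log):

```
# List the agents ROCm sees with their gfx targets; a 7900 XTX should report
# gfx1100, while the Ryzen 9 7900X's integrated GPU typically shows gfx1036.
rocminfo | grep -E 'Marketing Name|gfx'
```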

Thanks!

GiteaMirror added the amd label 2026-04-12 11:08:43 -05:00

@meminens commented on GitHub (Jan 31, 2024):

Note that I have been able to manually build ollama from source with AMDGPU_TARGETS=gfx1100. So the issue appears to be related to the official binary not recognizing my GPU.

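For anyone wanting to reproduce that build, a sketch of the steps being described, assuming the instructions in the project's docs/development.md from that era (`go generate` compiles the llama.cpp backends, and `AMDGPU_TARGETS` limits the ROCm compile to the named ISA):

```
git clone https://github.com/ollama/ollama.git
cd ollama
# Build only the gfx1100 (RDNA3 / 7900 XTX) ROCm kernels, then the Go binary.
AMDGPU_TARGETS=gfx1100 go generate ./...
go build .
```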

@pdevine commented on GitHub (Jan 31, 2024):

cc @dhiltgen


@dhiltgen commented on GitHub (Jan 31, 2024):

@misaligar can you attach more of the log showing the server crash? My suspicion is you hit #2165 but would need to see the log to confirm


@meminens commented on GitHub (Jan 31, 2024):

> @misaligar can you attach more of the log showing the server crash? My suspicion is you hit #2165 but would need to see the log to confirm

Hope this helps!

misal@arch:~$ OLLAMA_DEBUG=1 ollama serve
time=2024-01-31T13:22:32.465-05:00 level=DEBUG source=/go/src/github.com/jmorganca/ollama/server/routes.go:926 msg="Debug logging enabled"
time=2024-01-31T13:22:32.465-05:00 level=INFO source=/go/src/github.com/jmorganca/ollama/server/images.go:857 msg="total blobs: 5"
time=2024-01-31T13:22:32.465-05:00 level=INFO source=/go/src/github.com/jmorganca/ollama/server/images.go:864 msg="total unused blobs removed: 0"
time=2024-01-31T13:22:32.465-05:00 level=INFO source=/go/src/github.com/jmorganca/ollama/server/routes.go:950 msg="Listening on 127.0.0.1:11434 (version 0.1.22)"
time=2024-01-31T13:22:32.465-05:00 level=INFO source=/go/src/github.com/jmorganca/ollama/llm/payload_common.go:106 msg="Extracting dynamic libraries..."
time=2024-01-31T13:22:33.819-05:00 level=INFO source=/go/src/github.com/jmorganca/ollama/llm/payload_common.go:145 msg="Dynamic LLM libraries [cuda_v11 rocm_v6 cpu rocm_v5 cpu_avx cpu_avx2]"
time=2024-01-31T13:22:33.819-05:00 level=DEBUG source=/go/src/github.com/jmorganca/ollama/llm/payload_common.go:146 msg="Override detection logic by setting OLLAMA_LLM_LIBRARY"
time=2024-01-31T13:22:33.819-05:00 level=INFO source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:94 msg="Detecting GPU type"
time=2024-01-31T13:22:33.819-05:00 level=INFO source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:236 msg="Searching for GPU management library libnvidia-ml.so"
time=2024-01-31T13:22:33.819-05:00 level=DEBUG source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:254 msg="gpu management search paths: [/usr/local/cuda/lib64/libnvidia-ml.so* /usr/lib/x86_64-linux-gnu/nvidia/current/libnvidia-ml.so* /usr/lib/x86_64-linux-gnu/libnvidia-ml.so* /usr/lib/wsl/lib/libnvidia-ml.so* /usr/lib/wsl/drivers/*/libnvidia-ml.so* /opt/cuda/lib64/libnvidia-ml.so* /usr/lib*/libnvidia-ml.so* /usr/local/lib*/libnvidia-ml.so* /usr/lib/aarch64-linux-gnu/nvidia/current/libnvidia-ml.so* /usr/lib/aarch64-linux-gnu/libnvidia-ml.so* /opt/cuda/targets/x86_64-linux/lib/stubs/libnvidia-ml.so* /home/misal/libnvidia-ml.so*]"
time=2024-01-31T13:22:33.823-05:00 level=INFO source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:282 msg="Discovered GPU libraries: []"
time=2024-01-31T13:22:33.823-05:00 level=INFO source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:236 msg="Searching for GPU management library librocm_smi64.so"
time=2024-01-31T13:22:33.823-05:00 level=DEBUG source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:254 msg="gpu management search paths: [/opt/rocm*/lib*/librocm_smi64.so* /home/misal/librocm_smi64.so*]"
time=2024-01-31T13:22:33.823-05:00 level=INFO source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:282 msg="Discovered GPU libraries: [/opt/rocm/lib/librocm_smi64.so.5.0]"
wiring rocm management library functions in /opt/rocm/lib/librocm_smi64.so.5.0
dlsym: rsmi_init
dlsym: rsmi_shut_down
dlsym: rsmi_dev_memory_total_get
dlsym: rsmi_dev_memory_usage_get
dlsym: rsmi_version_get
dlsym: rsmi_num_monitor_devices
dlsym: rsmi_dev_id_get
dlsym: rsmi_dev_name_get
dlsym: rsmi_dev_brand_get
dlsym: rsmi_dev_vendor_name_get
dlsym: rsmi_dev_vram_vendor_get
dlsym: rsmi_dev_serial_number_get
dlsym: rsmi_dev_subsystem_name_get
dlsym: rsmi_dev_vbios_version_get
time=2024-01-31T13:22:33.825-05:00 level=INFO source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:109 msg="Radeon GPU detected"
discovered 1 ROCm GPU Devices
[0] ROCm device name: Navi 31 [Radeon RX 7900 XT/7900 XTX]
[0] ROCm brand: Navi 31 [Radeon RX 7900 XT/7900 XTX]
[0] ROCm vendor: Advanced Micro Devices, Inc. [AMD/ATI]
[0] ROCm VRAM vendor: samsung
rsmi_dev_serial_number_get failed: 2
[0] ROCm subsystem name: RX-79XMERCB9 [SPEEDSTER MERC 310 RX 7900 XTX]
[0] ROCm vbios version: 113-31XFSHBS1-L02
[0] ROCm totalMem 25753026560
[0] ROCm usedMem 1036677120
time=2024-01-31T13:22:33.826-05:00 level=DEBUG source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:225 msg="rocm detected 1 devices with 21214M available memory"
[GIN] 2024/01/31 - 13:22:50 | 200 |       26.56µs |       127.0.0.1 | HEAD     "/"
[GIN] 2024/01/31 - 13:22:50 | 200 |     387.643µs |       127.0.0.1 | POST     "/api/show"
[GIN] 2024/01/31 - 13:22:50 | 200 |     286.681µs |       127.0.0.1 | POST     "/api/show"
discovered 1 ROCm GPU Devices
[0] ROCm device name: Navi 31 [Radeon RX 7900 XT/7900 XTX]
[0] ROCm brand: Navi 31 [Radeon RX 7900 XT/7900 XTX]
[0] ROCm vendor: Advanced Micro Devices, Inc. [AMD/ATI]
[0] ROCm VRAM vendor: samsung
rsmi_dev_serial_number_get failed: 2
[0] ROCm subsystem name: RX-79XMERCB9 [SPEEDSTER MERC 310 RX 7900 XTX]
[0] ROCm vbios version: 113-31XFSHBS1-L02
[0] ROCm totalMem 25753026560
[0] ROCm usedMem 1036681216
time=2024-01-31T13:22:50.992-05:00 level=DEBUG source=/go/src/github.com/jmorganca/ollama/gpu/gpu.go:225 msg="rocm detected 1 devices with 21214M available memory"
discovered 1 ROCm GPU Devices
[0] ROCm device name: Navi 31 [Radeon RX 7900 XT/7900 XTX]
[0] ROCm brand: Navi 31 [Radeon RX 7900 XT/7900 XTX]
[0] ROCm vendor: Advanced Micro Devices, Inc. [AMD/ATI]
[0] ROCm VRAM vendor: samsung
rsmi_dev_serial_number_get failed: 2
[0] ROCm subsystem name: RX-79XMERCB9 [SPEEDSTER MERC 310 RX 7900 XTX]
[0] ROCm vbios version: 113-31XFSHBS1-L02
[0] ROCm totalMem 25753026560
[0] ROCm usedMem 1036681216
time=2024-01-31T13:22:50.993-05:00 level=INFO source=/go/src/github.com/jmorganca/ollama/gpu/cpu_common.go:11 msg="CPU has AVX2"
loading library /tmp/ollama1038475767/rocm_v5/libext_server.so
time=2024-01-31T13:22:51.019-05:00 level=INFO source=/go/src/github.com/jmorganca/ollama/llm/dyn_ext_server.go:90 msg="Loading Dynamic llm server: /tmp/ollama1038475767/rocm_v5/libext_server.so"
time=2024-01-31T13:22:51.019-05:00 level=INFO source=/go/src/github.com/jmorganca/ollama/llm/dyn_ext_server.go:145 msg="Initializing llama server"
[1706725371] system info: AVX = 1 | AVX_VNNI = 0 | AVX2 = 0 | AVX512 = 0 | AVX512_VBMI = 0 | AVX512_VNNI = 0 | FMA = 0 | NEON = 0 | ARM_FMA = 0 | F16C = 0 | FP16_VA = 0 | WASM_SIMD = 0 | BLAS = 1 | SSE3 = 1 | SSSE3 = 1 | VSX = 0 | 
[1706725371] Performing pre-initialization of GPU
free(): invalid pointer
Aborted (core dumped)
misal@arch:~$ OLLAMA_DEBUG=1 ollama run mistral
Error: could not connect to ollama server, run 'ollama serve' to start it
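
Worth noting: the debug output above prints "Override detection logic by setting OLLAMA_LLM_LIBRARY", which suggests a stop-gap while the ROCm crash is investigated: force one of the listed CPU backends. A workaround sketch only; it gives up GPU acceleration:

```
# Pin the AVX2 CPU backend from the "Dynamic LLM libraries" list instead of
# letting detection pick the crashing rocm_v5 libext_server.so.
OLLAMA_LLM_LIBRARY=cpu_avx2 ollama serve
```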

@dhiltgen commented on GitHub (Jan 31, 2024):

Yup, this is a dup of #2165:

[1706725371] Performing pre-initialization of GPU
free(): invalid pointer
Aborted (core dumped)

Since you're building from source, I'm still looking for someone on Arch who's hitting this problem to help test a potential fix... https://github.com/ollama/ollama/issues/2165#issuecomment-1913308992


@meminens commented on GitHub (Jan 31, 2024):

@dhiltgen For the logs, I installed ollama using the bash script from the ollama website, then uninstalled it after capturing them. I'm using my own build now.


@bannert1337 commented on GitHub (Mar 6, 2024):

I have a laptop with an RX 6600M and a Ryzen 5 5600H, running EndeavourOS (Arch-based). When I try to run any model, I get

Error: Post "http://127.0.0.1:11434/api/chat": EOF

@bannert1337 commented on GitHub (Mar 6, 2024):

I used the ollama-rocm-git package from the AUR (https://aur.archlinux.org/packages/ollama-rocm-git) and modified the ollama service file using

sudo systemctl edit ollama.service
### Editing /etc/systemd/system/ollama.service.d/override.conf
### Anything between here and the comment below will become the contents of the drop-in file

[Service]
Environment="HOME=/var/lib/ollama" "GIN_MODE=release" "HSA_OVERRIDE_GFX_VERSION=11.0.2"

### Edits below this comment will be discarded


### /usr/lib/systemd/system/ollama.service
# [Unit]
# Description=Ollama Service
# Wants=network-online.target
# After=network.target network-online.target
#
# [Service]
# ExecStart=/usr/bin/ollama serve
# WorkingDirectory=/var/lib/ollama
# Environment="HOME=/var/lib/ollama" "GIN_MODE=release"
# User=ollama
# Group=ollama
# Restart=on-failure
# RestartSec=3
# Type=simple
# PrivateTmp=yes
# ProtectSystem=full
# ProtectHome=yes
#
# [Install]
# WantedBy=multi-user.target
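
As the rocBLAS error in the next comment shows, the RX 6600M identifies as gfx1032, and the installed TensileLibrary files cover gfx1030 but not gfx1032. The override commonly reported to work for RDNA2 parts in that situation is 10.3.0, not the RDNA3-oriented 11.0.2 used above; a hedged variant of the same drop-in:

```
# Hypothetical override.conf: map the gfx1032 RX 6600M onto the gfx1030 ISA
# that rocBLAS actually ships Tensile libraries for.
[Service]
Environment="HOME=/var/lib/ollama" "GIN_MODE=release" "HSA_OVERRIDE_GFX_VERSION=10.3.0"
```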

@bannert1337 commented on GitHub (Mar 6, 2024):

Mär 06 01:52:58 laptop systemd[1]: Started Ollama Service.
Mär 06 01:52:58 laptop ollama[79654]: Couldn't find '/var/lib/ollama/.ollama/id_ed25519'. Generating new private key.
Mär 06 01:52:58 laptop ollama[79654]: Your new public key is:
Mär 06 01:52:58 laptop ollama[79654]: ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBv2z/DllTUV3hOMyf5UwxFj8DhzyMQH0V9SdMoEWR7h
Mär 06 01:52:58 laptop ollama[79654]: time=2024-03-06T01:52:58.346+01:00 level=INFO source=images.go:710 msg="total blobs: 0"
Mär 06 01:52:58 laptop ollama[79654]: time=2024-03-06T01:52:58.346+01:00 level=INFO source=images.go:717 msg="total unused blobs removed: 0"
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] [WARNING] Creating an Engine instance with the Logger and Recovery middleware already attached.
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] [WARNING] Running in "debug" mode. Switch to "release" mode in production.
Mär 06 01:52:58 laptop ollama[79654]:  - using env:        export GIN_MODE=release
Mär 06 01:52:58 laptop ollama[79654]:  - using code:        gin.SetMode(gin.ReleaseMode)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] POST   /api/pull                 --> github.com/jmorganca/ollama/server.PullModelHandler (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] POST   /api/generate             --> github.com/jmorganca/ollama/server.GenerateHandler (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] POST   /api/chat                 --> github.com/jmorganca/ollama/server.ChatHandler (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] POST   /api/embeddings           --> github.com/jmorganca/ollama/server.EmbeddingsHandler (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] POST   /api/create               --> github.com/jmorganca/ollama/server.CreateModelHandler (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] POST   /api/push                 --> github.com/jmorganca/ollama/server.PushModelHandler (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] POST   /api/copy                 --> github.com/jmorganca/ollama/server.CopyModelHandler (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] DELETE /api/delete               --> github.com/jmorganca/ollama/server.DeleteModelHandler (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] POST   /api/show                 --> github.com/jmorganca/ollama/server.ShowModelHandler (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] POST   /api/blobs/:digest        --> github.com/jmorganca/ollama/server.CreateBlobHandler (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] HEAD   /api/blobs/:digest        --> github.com/jmorganca/ollama/server.HeadBlobHandler (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] POST   /v1/chat/completions      --> github.com/jmorganca/ollama/server.ChatHandler (6 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] GET    /                         --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func2 (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] GET    /api/tags                 --> github.com/jmorganca/ollama/server.ListModelsHandler (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] GET    /api/version              --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func3 (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] HEAD   /                         --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func2 (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] HEAD   /api/tags                 --> github.com/jmorganca/ollama/server.ListModelsHandler (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: [GIN-debug] HEAD   /api/version              --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func3 (5 handlers)
Mär 06 01:52:58 laptop ollama[79654]: time=2024-03-06T01:52:58.346+01:00 level=INFO source=routes.go:1021 msg="Listening on 127.0.0.1:11434 (version 0.1.28.gce9f7c46)"
Mär 06 01:52:58 laptop ollama[79654]: time=2024-03-06T01:52:58.347+01:00 level=INFO source=payload_common.go:107 msg="Extracting dynamic libraries..."
Mär 06 01:52:58 laptop ollama[79654]: time=2024-03-06T01:52:58.575+01:00 level=INFO source=payload_common.go:146 msg="Dynamic LLM libraries [cpu cpu_avx2 cpu_avx rocm]"
Mär 06 01:52:58 laptop ollama[79654]: time=2024-03-06T01:52:58.575+01:00 level=INFO source=gpu.go:94 msg="Detecting GPU type"
Mär 06 01:52:58 laptop ollama[79654]: time=2024-03-06T01:52:58.575+01:00 level=INFO source=gpu.go:265 msg="Searching for GPU management library libnvidia-ml.so"
Mär 06 01:52:58 laptop ollama[79654]: time=2024-03-06T01:52:58.581+01:00 level=INFO source=gpu.go:311 msg="Discovered GPU libraries: []"
Mär 06 01:52:58 laptop ollama[79654]: time=2024-03-06T01:52:58.581+01:00 level=INFO source=gpu.go:265 msg="Searching for GPU management library librocm_smi64.so"
Mär 06 01:52:58 laptop ollama[79654]: time=2024-03-06T01:52:58.581+01:00 level=INFO source=gpu.go:311 msg="Discovered GPU libraries: [/opt/rocm/lib/librocm_smi64.so.1.0]"
Mär 06 01:52:58 laptop ollama[79654]: time=2024-03-06T01:52:58.587+01:00 level=INFO source=gpu.go:109 msg="Radeon GPU detected"
Mär 06 01:52:58 laptop ollama[79654]: time=2024-03-06T01:52:58.587+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
Mär 06 01:53:03 laptop ollama[79654]: [GIN] 2024/03/06 - 01:53:03 | 200 |      97.999µs |       127.0.0.1 | HEAD     "/"
Mär 06 01:53:03 laptop ollama[79654]: [GIN] 2024/03/06 - 01:53:03 | 404 |     100.164µs |       127.0.0.1 | POST     "/api/show"
Mär 06 01:53:05 laptop ollama[79654]: time=2024-03-06T01:53:05.869+01:00 level=INFO source=download.go:136 msg="downloading 338f3218c250 in 65 141 MB part(s)"
Mär 06 01:55:42 laptop ollama[79654]: time=2024-03-06T01:55:42.998+01:00 level=INFO source=download.go:136 msg="downloading 4ec42cd966c9 in 1 12 KB part(s)"
Mär 06 01:55:46 laptop ollama[79654]: time=2024-03-06T01:55:46.423+01:00 level=INFO source=download.go:136 msg="downloading 62fbfd9ed093 in 1 182 B part(s)"
Mär 06 01:55:50 laptop ollama[79654]: time=2024-03-06T01:55:50.428+01:00 level=INFO source=download.go:136 msg="downloading a702f7302290 in 1 57 B part(s)"
Mär 06 01:55:53 laptop ollama[79654]: time=2024-03-06T01:55:53.796+01:00 level=INFO source=download.go:136 msg="downloading d5f042e0ae3d in 1 494 B part(s)"
Mär 06 01:56:01 laptop ollama[79654]: [GIN] 2024/03/06 - 01:56:01 | 200 |         2m57s |       127.0.0.1 | POST     "/api/pull"
Mär 06 01:56:01 laptop ollama[79654]: [GIN] 2024/03/06 - 01:56:01 | 200 |      467.85µs |       127.0.0.1 | POST     "/api/show"
Mär 06 01:56:01 laptop ollama[79654]: [GIN] 2024/03/06 - 01:56:01 | 200 |     297.146µs |       127.0.0.1 | POST     "/api/show"
Mär 06 01:56:01 laptop ollama[79654]: time=2024-03-06T01:56:01.551+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
Mär 06 01:56:01 laptop ollama[79654]: time=2024-03-06T01:56:01.552+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
Mär 06 01:56:01 laptop ollama[79654]: time=2024-03-06T01:56:01.552+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
Mär 06 01:56:01 laptop ollama[79654]: time=2024-03-06T01:56:01.690+01:00 level=INFO source=dyn_ext_server.go:90 msg="Loading Dynamic llm server: /tmp/ollama1279054405/rocm/libext_server.so"
Mär 06 01:56:01 laptop ollama[79654]: time=2024-03-06T01:56:01.690+01:00 level=INFO source=dyn_ext_server.go:150 msg="Initializing llama server"
Mär 06 01:56:04 laptop ollama[79654]: rocBLAS error: Cannot read /opt/rocm/lib/rocblas/library/TensileLibrary.dat: Illegal seek for GPU arch : gfx1032
Mär 06 01:56:04 laptop ollama[79654]:  List of available TensileLibrary Files :
Mär 06 01:56:04 laptop ollama[79654]: "/opt/rocm/lib/rocblas/library/TensileLibrary_lazy_gfx908.dat"
Mär 06 01:56:04 laptop ollama[79654]: "/opt/rocm/lib/rocblas/library/TensileLibrary_lazy_gfx1100.dat"
Mär 06 01:56:04 laptop ollama[79654]: "/opt/rocm/lib/rocblas/library/TensileLibrary_lazy_gfx1030.dat"
Mär 06 01:56:04 laptop ollama[79654]: "/opt/rocm/lib/rocblas/library/TensileLibrary_lazy_gfx1102.dat"
Mär 06 01:56:04 laptop ollama[79654]: "/opt/rocm/lib/rocblas/library/TensileLibrary_lazy_gfx940.dat"
Mär 06 01:56:04 laptop ollama[79654]: "/opt/rocm/lib/rocblas/library/TensileLibrary_lazy_gfx941.dat"
Mär 06 01:56:04 laptop ollama[79654]: "/opt/rocm/lib/rocblas/library/TensileLibrary_lazy_gfx906.dat"
Mär 06 01:56:04 laptop ollama[79654]: "/opt/rocm/lib/rocblas/library/TensileLibrary_lazy_gfx1101.dat"
...skipping...
Mär 06 18:00:23 laptop ollama[8183]: [GIN-debug] GET    /api/tags                 --> github.com/jmorganca/ollama/server.ListModelsHandler (5 handlers)
Mär 06 18:00:23 laptop ollama[8183]: [GIN-debug] GET    /api/version              --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func3 (5 handlers)
Mär 06 18:00:23 laptop ollama[8183]: [GIN-debug] HEAD   /                         --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func2 (5 handlers)
Mär 06 18:00:23 laptop ollama[8183]: [GIN-debug] HEAD   /api/tags                 --> github.com/jmorganca/ollama/server.ListModelsHandler (5 handlers)
Mär 06 18:00:23 laptop ollama[8183]: [GIN-debug] HEAD   /api/version              --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func3 (5 handlers)
Mär 06 18:00:23 laptop ollama[8183]: time=2024-03-06T18:00:23.317+01:00 level=INFO source=routes.go:1021 msg="Listening on 127.0.0.1:11434 (version 0.1.28.gce9f7c46)"
Mär 06 18:00:23 laptop ollama[8183]: time=2024-03-06T18:00:23.318+01:00 level=INFO source=payload_common.go:107 msg="Extracting dynamic libraries..."
Mär 06 18:00:23 laptop ollama[8183]: time=2024-03-06T18:00:23.550+01:00 level=INFO source=payload_common.go:146 msg="Dynamic LLM libraries [rocm cpu_avx cpu_avx2 cpu]"
Mär 06 18:00:23 laptop ollama[8183]: time=2024-03-06T18:00:23.550+01:00 level=INFO source=gpu.go:94 msg="Detecting GPU type"
Mär 06 18:00:23 laptop ollama[8183]: time=2024-03-06T18:00:23.550+01:00 level=INFO source=gpu.go:265 msg="Searching for GPU management library libnvidia-ml.so"
Mär 06 18:00:23 laptop ollama[8183]: time=2024-03-06T18:00:23.556+01:00 level=INFO source=gpu.go:311 msg="Discovered GPU libraries: []"
Mär 06 18:00:23 laptop ollama[8183]: time=2024-03-06T18:00:23.556+01:00 level=INFO source=gpu.go:265 msg="Searching for GPU management library librocm_smi64.so"
Mär 06 18:00:23 laptop ollama[8183]: time=2024-03-06T18:00:23.556+01:00 level=INFO source=gpu.go:311 msg="Discovered GPU libraries: [/opt/rocm/lib/librocm_smi64.so.1.0]"
Mär 06 18:00:23 laptop ollama[8183]: time=2024-03-06T18:00:23.564+01:00 level=INFO source=gpu.go:109 msg="Radeon GPU detected"
Mär 06 18:00:23 laptop ollama[8183]: time=2024-03-06T18:00:23.564+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
Mär 06 18:00:23 laptop ollama[8183]: [GIN] 2024/03/06 - 18:00:23 | 200 |      44.498µs |       127.0.0.1 | HEAD     "/"
Mär 06 18:00:23 laptop ollama[8183]: [GIN] 2024/03/06 - 18:00:23 | 200 |     590.002µs |       127.0.0.1 | POST     "/api/show"
Mär 06 18:00:23 laptop ollama[8183]: [GIN] 2024/03/06 - 18:00:23 | 200 |    1.097854ms |       127.0.0.1 | POST     "/api/show"
Mär 06 18:00:24 laptop ollama[8183]: time=2024-03-06T18:00:24.192+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
Mär 06 18:00:24 laptop ollama[8183]: time=2024-03-06T18:00:24.193+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
Mär 06 18:00:24 laptop ollama[8183]: time=2024-03-06T18:00:24.193+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
Mär 06 18:00:24 laptop ollama[8183]: time=2024-03-06T18:00:24.351+01:00 level=INFO source=dyn_ext_server.go:90 msg="Loading Dynamic llm server: /tmp/ollama3662053751/rocm/libext_server.so"
Mär 06 18:00:24 laptop ollama[8183]: time=2024-03-06T18:00:24.351+01:00 level=INFO source=dyn_ext_server.go:150 msg="Initializing llama server"
Mär 06 18:00:31 laptop ollama[8183]: Memory access fault by GPU node-1 (Agent handle: 0x7d83e49250e0) on address 0x7d8475a03000. Reason: Page not present or supervisor privilege.
Mär 06 18:00:32 laptop systemd[1]: ollama.service: Main process exited, code=dumped, status=6/ABRT
Mär 06 18:00:32 laptop systemd[1]: ollama.service: Failed with result 'core-dump'.
Mär 06 18:00:32 laptop systemd[1]: ollama.service: Consumed 5.543s CPU time.
Mär 06 18:00:35 laptop systemd[1]: ollama.service: Scheduled restart job, restart counter is at 3.
Mär 06 18:00:35 laptop systemd[1]: Started Ollama Service.
Mär 06 18:00:35 laptop ollama[8262]: time=2024-03-06T18:00:35.331+01:00 level=INFO source=images.go:710 msg="total blobs: 11"
Mär 06 18:00:35 laptop ollama[8262]: time=2024-03-06T18:00:35.331+01:00 level=INFO source=images.go:717 msg="total unused blobs removed: 0"
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] [WARNING] Creating an Engine instance with the Logger and Recovery middleware already attached.
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] [WARNING] Running in "debug" mode. Switch to "release" mode in production.
Mär 06 18:00:35 laptop ollama[8262]:  - using env:        export GIN_MODE=release
Mär 06 18:00:35 laptop ollama[8262]:  - using code:        gin.SetMode(gin.ReleaseMode)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] POST   /api/pull                 --> github.com/jmorganca/ollama/server.PullModelHandler (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] POST   /api/generate             --> github.com/jmorganca/ollama/server.GenerateHandler (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] POST   /api/chat                 --> github.com/jmorganca/ollama/server.ChatHandler (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] POST   /api/embeddings           --> github.com/jmorganca/ollama/server.EmbeddingsHandler (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] POST   /api/create               --> github.com/jmorganca/ollama/server.CreateModelHandler (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] POST   /api/push                 --> github.com/jmorganca/ollama/server.PushModelHandler (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] POST   /api/copy                 --> github.com/jmorganca/ollama/server.CopyModelHandler (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] DELETE /api/delete               --> github.com/jmorganca/ollama/server.DeleteModelHandler (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] POST   /api/show                 --> github.com/jmorganca/ollama/server.ShowModelHandler (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] POST   /api/blobs/:digest        --> github.com/jmorganca/ollama/server.CreateBlobHandler (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] HEAD   /api/blobs/:digest        --> github.com/jmorganca/ollama/server.HeadBlobHandler (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] POST   /v1/chat/completions      --> github.com/jmorganca/ollama/server.ChatHandler (6 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] GET    /                         --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func2 (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] GET    /api/tags                 --> github.com/jmorganca/ollama/server.ListModelsHandler (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] GET    /api/version              --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func3 (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] HEAD   /                         --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func2 (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] HEAD   /api/tags                 --> github.com/jmorganca/ollama/server.ListModelsHandler (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: [GIN-debug] HEAD   /api/version              --> github.com/jmorganca/ollama/server.(*Server).GenerateRoutes.func3 (5 handlers)
Mär 06 18:00:35 laptop ollama[8262]: time=2024-03-06T18:00:35.332+01:00 level=INFO source=routes.go:1021 msg="Listening on 127.0.0.1:11434 (version 0.1.28.gce9f7c46)"
Mär 06 18:00:35 laptop ollama[8262]: time=2024-03-06T18:00:35.332+01:00 level=INFO source=payload_common.go:107 msg="Extracting dynamic libraries..."
Mär 06 18:00:35 laptop ollama[8262]: time=2024-03-06T18:00:35.562+01:00 level=INFO source=payload_common.go:146 msg="Dynamic LLM libraries [cpu_avx rocm cpu]"
Mär 06 18:00:35 laptop ollama[8262]: time=2024-03-06T18:00:35.562+01:00 level=INFO source=gpu.go:94 msg="Detecting GPU type"
Mär 06 18:00:35 laptop ollama[8262]: time=2024-03-06T18:00:35.562+01:00 level=INFO source=gpu.go:265 msg="Searching for GPU management library libnvidia-ml.so"
Mär 06 18:00:35 laptop ollama[8262]: time=2024-03-06T18:00:35.568+01:00 level=INFO source=gpu.go:311 msg="Discovered GPU libraries: []"
Mär 06 18:00:35 laptop ollama[8262]: time=2024-03-06T18:00:35.568+01:00 level=INFO source=gpu.go:265 msg="Searching for GPU management library librocm_smi64.so"
Mär 06 18:00:35 laptop ollama[8262]: time=2024-03-06T18:00:35.569+01:00 level=INFO source=gpu.go:311 msg="Discovered GPU libraries: [/opt/rocm/lib/librocm_smi64.so.1.0]"
Mär 06 18:00:35 laptop ollama[8262]: time=2024-03-06T18:00:35.575+01:00 level=INFO source=gpu.go:109 msg="Radeon GPU detected"
Mär 06 18:00:35 laptop ollama[8262]: time=2024-03-06T18:00:35.575+01:00 level=INFO source=cpu_common.go:11 msg="CPU has AVX2"

@dhiltgen commented on GitHub (Mar 6, 2024):

@bannert1337 you might want to open a new issue since this one is closed.

Mär 06 18:00:31 laptop ollama[8183]: Memory access fault by GPU node-1 (Agent handle: 0x7d83e49250e0) on address 0x7d8475a03000. Reason: Page not present or supervisor privilege.

Seems to imply there's a permission problem. As a workaround, try running as root and see if that resolves it, but we should try to figure out how to get things set up properly so it runs deprivileged.

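For reference, the usual way to keep the service deprivileged while still granting it GPU access on Arch-family systems is to add the service user to the groups that own /dev/kfd and /dev/dri/renderD*; a sketch, with group names that can vary by distro and are not verified against this exact setup:

```
# Grant the ollama user access to the ROCm compute (/dev/kfd) and DRM render
# devices, then restart the service instead of running it as root.
sudo usermod -aG render,video ollama
sudo systemctl restart ollama
```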