[GH-ISSUE #12698] ROCm 7.0 with gfx1103 not detected #8426

Closed
opened 2026-04-12 21:05:44 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @janemba on GitHub (Oct 19, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12698

What is the issue?

I installed ollama-linux-amd64-rocm.tgz (I don't know if it is the right one), and ROCm 7.0 is already installed on my system (GNU/Linux). When starting ollama, my GPU doesn't seem to be detected.

ROCm was installed separately from source in /opt/rocm and appears to work fine:

$ rocm-smi


========================================= ROCm System Management Interface =========================================
=================================================== Concise Info ===================================================
Device  Node  IDs              Temp    Power     Partitions          SCLK  MCLK    Fan  Perf  PwrCap  VRAM%  GPU%  
              (DID,     GUID)  (Edge)  (Socket)  (Mem, Compute, ID)                                                
====================================================================================================================
0       1     0x15bf,   14665  35.0°C  7.158W    N/A, N/A, 0         N/A   800Mhz  0%   auto  N/A     42%    0%    
====================================================================================================================
=============================================== End of ROCm SMI Log ================================================

Attached: rocminfo.txt

rocminfo.txt
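As a quick sanity check independent of the attached file, the gfx targets ROCm itself reports can be pulled out of the rocminfo output (on a real system, pipe `rocminfo` directly; the echoed line below is just a sample for illustration):

```shell
# On a machine with ROCm installed: rocminfo | grep -o 'gfx[0-9a-f]\+' | sort -u
# Sample line standing in for rocminfo output:
echo "  Name:                    gfx1103" | grep -o 'gfx[0-9a-f]\+'
```

If this prints gfx1103, ROCm's HSA runtime sees the GPU, and the detection failure is on the ollama side.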

Relevant log output

$ OLLAMA_DEBUG=1 ollama serve
time=2025-10-20T03:41:02.420+04:00 level=INFO source=routes.go:1481 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:DEBUG OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/home/janemba/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
time=2025-10-20T03:41:02.421+04:00 level=INFO source=images.go:522 msg="total blobs: 11"
time=2025-10-20T03:41:02.421+04:00 level=INFO source=images.go:529 msg="total unused blobs removed: 0"
time=2025-10-20T03:41:02.421+04:00 level=INFO source=routes.go:1534 msg="Listening on 127.0.0.1:11434 (version 0.12.5)"
time=2025-10-20T03:41:02.422+04:00 level=DEBUG source=sched.go:122 msg="starting llm scheduler"
time=2025-10-20T03:41:02.450+04:00 level=INFO source=runner.go:80 msg="discovering available GPUs..."
time=2025-10-20T03:41:02.450+04:00 level=DEBUG source=runner.go:411 msg="spawing runner with" OLLAMA_LIBRARY_PATH=[/usr/bin] extra_envs=[]
time=2025-10-20T03:41:02.553+04:00 level=DEBUG source=runner.go:414 msg="bootstrap discovery took" duration=103.351713ms OLLAMA_LIBRARY_PATH=[/usr/bin] extra_envs=[]
time=2025-10-20T03:41:02.553+04:00 level=DEBUG source=runner.go:117 msg="filtering out unsupported or overlapping GPU library combinations" count=0
time=2025-10-20T03:41:02.554+04:00 level=DEBUG source=runner.go:45 msg="GPU bootstrap discovery took" duration=131.831422ms
time=2025-10-20T03:41:02.554+04:00 level=INFO source=types.go:129 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" type="" total="30.1 GiB" available="18.2 GiB"
time=2025-10-20T03:41:02.554+04:00 level=INFO source=routes.go:1575 msg="entering low vram mode" "total vram"="0 B" threshold="20.0 GiB"

OS

Linux

GPU

AMD

CPU

AMD

Ollama version

0.12.5

GiteaMirror added the bug label 2026-04-12 21:05:45 -05:00
Author
Owner

@rick-github commented on GitHub (Oct 19, 2025):

Try with OLLAMA_DEBUG=2 for extra detail.

Author
Owner

@janemba commented on GitHub (Oct 20, 2025):

Attached the ollama debug log with OLLAMA_DEBUG=2.

ollama.debug2.txt

Author
Owner

@janemba commented on GitHub (Oct 20, 2025):

I've deleted everything and installed again. I don't think I have any leftovers, but the issue stays the same. I also tried HSA_OVERRIDE_GFX_VERSION with different values but got the same results.
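For reference, the override I tried looks like this. gfx1103 is not in ROCm's officially supported target list, so the usual approach is to spoof a supported RDNA3 target; 11.0.2 (gfx1102) is the value commonly suggested for gfx1103, though whether it helps depends on the ROCm build, so treat it as an assumption, not a fix:

```shell
# Spoof a supported gfx target for the HSA runtime (11.0.2 -> gfx1102; assumed value)
export HSA_OVERRIDE_GFX_VERSION=11.0.2
echo "HSA_OVERRIDE_GFX_VERSION=$HSA_OVERRIDE_GFX_VERSION"
# then start the server in the same shell:  ollama serve
```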

$ OLLAMA_DEBUG=2 ollama serve
time=2025-10-20T14:13:01.677+04:00 level=INFO source=routes.go:1511 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GGML_VK_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:DEBUG-4 OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/home/janemba/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
time=2025-10-20T14:13:01.681+04:00 level=INFO source=images.go:522 msg="total blobs: 11"
time=2025-10-20T14:13:01.681+04:00 level=INFO source=images.go:529 msg="total unused blobs removed: 0"
time=2025-10-20T14:13:01.682+04:00 level=INFO source=routes.go:1564 msg="Listening on 127.0.0.1:11434 (version 0.12.6)"
time=2025-10-20T14:13:01.682+04:00 level=DEBUG source=sched.go:123 msg="starting llm scheduler"
time=2025-10-20T14:13:01.682+04:00 level=INFO source=runner.go:80 msg="discovering available GPUs..."
time=2025-10-20T14:13:01.682+04:00 level=DEBUG source=runner.go:448 msg="spawning runner with" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" extra_envs=[]
time=2025-10-20T14:13:01.683+04:00 level=TRACE source=runner.go:529 msg="starting runner for device discovery" env="[SHELL=/bin/zsh HYPRLAND_CMD=Hyprland XDG_CONFIG_DIRS=/etc/xdg:/etc/kde/xdg LESS=-R XDG_SESSION_PATH=/org/freedesktop/DisplayManager/Session1 CLUTTER_BACKEND=wayland XDG_BACKEND=wayland PKG_CONFIG_PATH=/usr/local/lib64/pkgconfig:/usr/local/share/pkgconfig:/usr/lib64/pkgconfig:/usr/share/pkgconfig GNOME_KEYRING_CONTROL=/home/janemba/.cache/keyring-GBQVE3 G_BROKEN_FILENAMES=1 QT_WAYLAND_DISABLE_WINDOWDECORATION=1 HISTSIZE=999999999 HOSTNAME=z16.calculus.lan MINICOM=-c on JAVA_HOME=/usr/lib64/zulu-openjdk8 DESKTOP_SESSION=hyprland XCURSOR_SIZE=24 XDG_SEAT=seat0 PWD=/home/janemba XDG_SESSION_DESKTOP=Hyprland LOGNAME=janemba QT_QPA_PLATFORMTHEME=qt5ct XDG_SESSION_TYPE=wayland MANPATH=/usr/lib64/zulu-openjdk8/man:/usr/lib64/zulu-openjdk17/man:/usr/lib64/zulu-openjdk11/man::/home/janemba/.local/share/man LESSQUIET=true _=/usr/bin/ollama LS_OPTIONS=-F -b -T 0 --color=auto HOME=/home/janemba LANG=en_US.utf8 _JAVA_AWT_WM_NONREPARENTING=1 
LS_COLORS=no=00:fi=00:di=01;34:ln=01;36:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.bat=01;32:*.btm=01;32:*.cmd=01;32:*.com=01;32:*.dll=01;32:*.exe=01;32:*.7z=01;31:*.ace=01;31:*.arj=01;31:*.bz2=01;31:*.cpio=01;31:*.deb=01;31:*.dz=01;31:*.gz=01;31:*.jar=01;31:*.lha=01;31:*.lz=01;31:*.lzh=01;31:*.lzma=01;31:*.rar=01;31:*.rpm=01;31:*.rz=01;31:*.tar=01;31:*.taz=01;31:*.tb2=01;31:*.tbz2=01;31:*.tbz=01;31:*.tgz=01;31:*.tlz=01;31:*.trz=01;31:*.txz=01;31:*.tz=01;31:*.tz2=01;31:*.tzst=01;31:*.xz=01;31:*.z=01;31:*.zip=01;31:*.zoo=01;31:*.zst=01;31:*.aac=01;35:*.anx=01;35:*.asf=01;35:*.au=01;35:*.axa=01;35:*.axv=01;35:*.avi=01;35:*.bmp=01;35:*.divx=01;35:*.flac=01;35:*.flv=01;35:*.gif=01;35:*.ico=01;35:*.jpg=01;35:*.jpeg=01;35:*.m2a=01;35:*.m2t=01;35:*.m2v=01;35:*.m4a=01;35:*.m4p=01;35:*.m4v=01;35:*.mid=01;35:*.midi=01;35:*.mka=01;35:*.mkv=01;35:*.mov=01;35:*.mp3=01;35:*.mp4=01;35:*.mp4v=01;35:*.mpc=01;35:*.mpeg=01;35:*.mpg=01;35:*.nuv=01;35:*.oga=01;35:*.ogv=01;35:*.ogx=01;35:*.ogg=01;35:*.opus=01;35:*.pbm=01;35:*.pgm=01;35:*.png=01;35:*.ppm=01;35:*.qt=01;35:*.ra=01;35:*.ram=01;35:*.rm=01;35:*.spx=01;35:*.svg=01;35:*.svgz=01;35:*.tga=01;35:*.tif=01;35:*.tiff=01;35:*.vob=01;35:*.wav=01;35:*.webm=01;35:*.webp=01;35:*.wma=01;35:*.wmv=01;35:*.xbm=01;35:*.xcf=01;35:*.xpm=01;35:*.xspf=01;35:*.xwd=01;35:*.xvid=01;35: XDG_CURRENT_DESKTOP=Hyprland WAYLAND_DISPLAY=wayland-1 XDG_SEAT_PATH=/org/freedesktop/DisplayManager/Seat0 M2_HOME=/usr/share/maven SAVEHIST=999999999 GOROOT=/usr/lib64/go1.22.9/go QT_QPA_PLATFORM=wayland;xcb XDG_SESSION_CLASS=user TERM=foot G_FILENAME_ENCODING=@locale LESSOPEN=|~/.lessfilter %s HIP_CLANG_PATH=/opt/rocm/llvm/bin USER=janemba SDL_VIDEODRIVER=wayland T1LIB_CONFIG=/usr/share/t1lib/t1lib.config GDK_USE_XFT=1 HYPRLAND_INSTANCE_SIGNATURE=918d8340afd652b011b937d29d5eea0be08467f5_1760954688_290007924 DISPLAY=:1 SHLVL=1 MOZ_ENABLE_WAYLAND=1 INPUTRC=/etc/inputrc XDG_VTNR=4 
XDG_SESSION_ID=1 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama:/usr/lib/ollama/cuda_v13:/usr/lib64/zulu-openjdk17/lib/server:/usr/lib64/zulu-openjdk11/lib/server XDG_RUNTIME_DIR=/run/user/1000 PYENV_ROOT=/home/janemba/.pyenv QT5DIR=/usr/lib64/qt5 HYPRLAND_LOG_WLR=1 KDEDIRS=/usr QT_AUTO_SCREEN_SCALE_FACTOR=1 LC_COLLATE=C XDG_DATA_DIRS=/home/janemba/.local/share/flatpak/exports/share:/var/lib/flatpak/exports/share:/usr/local/share:/usr/share VDPAU_LOG=0 GDK_BACKEND=wayland,x11,* PATH=/home/janemba/.pyenv/shims:/home/janemba/.local/bin:/home/janemba/.pyenv/bin:/home/janemba/.cargo/bin:/home/janemba:/.avm/bin:/home/janemba/.local/share/go/bin:/home/janemba/.npm-global/bin:/usr/lib64/zulu-openjdk8/bin:/usr/lib64/zulu-openjdk8/jre/bin:/usr/lib64/zulu-openjdk17/bin:/usr/lib64/zulu-openjdk11/bin:/opt/rocm/bin:/usr/lib64/go1.22.9/go/bin:/usr/local/bin:/usr/bin:/bin:/usr/games:/usr/lib64/libexec/kf5:/usr/lib64/qt5/bin:/usr/lib64/qt6/bin:/home/janemba/.fzf/bin:/home/janemba/.lmstudio/bin SAL_USE_VCLPLUGIN=gtk DBUS_SESSION_BUS_ADDRESS=unix:path=/tmp/dbus-MHRWq7Tuj6,guid=ee724b1cf380d4f9d045d99a68f60940 QT6DIR=/usr/lib64/qt6 OLDPWD=/home/janemba GOPATH=/home/janemba/.local/share/go HYPRCURSOR_SIZE=24 COLORTERM=truecolor P9K_TTY=old _P9K_TTY=/dev/pts/0 ZSH=/home/janemba/.oh-my-zsh PAGER=less LSCOLORS=Gxfxcxdxbxegedabagacad P9K_SSH=0 _P9K_SSH_TTY=/dev/pts/0 EDITOR=emacs PYENV_SHELL=zsh OLLAMA_DEBUG=2 OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v13]" cmd="/usr/bin/ollama runner --ollama-engine --port 42277"
time=2025-10-20T14:13:01.710+04:00 level=INFO source=runner.go:1332 msg="starting ollama engine"
time=2025-10-20T14:13:01.710+04:00 level=INFO source=runner.go:1367 msg="Server listening on 127.0.0.1:42277"
time=2025-10-20T14:13:01.716+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-10-20T14:13:01.716+04:00 level=DEBUG source=gguf.go:578 msg=general.architecture type=string
time=2025-10-20T14:13:01.716+04:00 level=DEBUG source=gguf.go:578 msg=tokenizer.ggml.model type=string
time=2025-10-20T14:13:01.716+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-10-20T14:13:01.716+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
time=2025-10-20T14:13:01.716+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
time=2025-10-20T14:13:01.716+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
time=2025-10-20T14:13:01.716+04:00 level=INFO source=ggml.go:134 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
time=2025-10-20T14:13:01.716+04:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-icelake.so
time=2025-10-20T14:13:01.727+04:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/cuda_v13
time=2025-10-20T14:13:01.728+04:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.AVX512=1 CPU.0.AVX512_VBMI=1 CPU.0.AVX512_VNNI=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=runner.go:1307 msg="dummy model load took" duration=12.778425ms
time=2025-10-20T14:13:01.728+04:00 level=DEBUG source=runner.go:1312 msg="gathering device infos took" duration=1.333µs
time=2025-10-20T14:13:01.729+04:00 level=TRACE source=runner.go:548 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" devices=[]
time=2025-10-20T14:13:01.729+04:00 level=DEBUG source=runner.go:451 msg="bootstrap discovery took" duration=46.720571ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" extra_envs=[]
time=2025-10-20T14:13:01.729+04:00 level=DEBUG source=runner.go:448 msg="spawning runner with" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" extra_envs=[]
time=2025-10-20T14:13:01.730+04:00 level=TRACE source=runner.go:529 msg="starting runner for device discovery" env="[SHELL=/bin/zsh HYPRLAND_CMD=Hyprland XDG_CONFIG_DIRS=/etc/xdg:/etc/kde/xdg LESS=-R XDG_SESSION_PATH=/org/freedesktop/DisplayManager/Session1 CLUTTER_BACKEND=wayland XDG_BACKEND=wayland PKG_CONFIG_PATH=/usr/local/lib64/pkgconfig:/usr/local/share/pkgconfig:/usr/lib64/pkgconfig:/usr/share/pkgconfig GNOME_KEYRING_CONTROL=/home/janemba/.cache/keyring-GBQVE3 G_BROKEN_FILENAMES=1 QT_WAYLAND_DISABLE_WINDOWDECORATION=1 HISTSIZE=999999999 HOSTNAME=z16.calculus.lan MINICOM=-c on JAVA_HOME=/usr/lib64/zulu-openjdk8 DESKTOP_SESSION=hyprland XCURSOR_SIZE=24 XDG_SEAT=seat0 PWD=/home/janemba XDG_SESSION_DESKTOP=Hyprland LOGNAME=janemba QT_QPA_PLATFORMTHEME=qt5ct XDG_SESSION_TYPE=wayland MANPATH=/usr/lib64/zulu-openjdk8/man:/usr/lib64/zulu-openjdk17/man:/usr/lib64/zulu-openjdk11/man::/home/janemba/.local/share/man LESSQUIET=true _=/usr/bin/ollama LS_OPTIONS=-F -b -T 0 --color=auto HOME=/home/janemba LANG=en_US.utf8 _JAVA_AWT_WM_NONREPARENTING=1 
LS_COLORS=no=00:fi=00:di=01;34:ln=01;36:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.bat=01;32:*.btm=01;32:*.cmd=01;32:*.com=01;32:*.dll=01;32:*.exe=01;32:*.7z=01;31:*.ace=01;31:*.arj=01;31:*.bz2=01;31:*.cpio=01;31:*.deb=01;31:*.dz=01;31:*.gz=01;31:*.jar=01;31:*.lha=01;31:*.lz=01;31:*.lzh=01;31:*.lzma=01;31:*.rar=01;31:*.rpm=01;31:*.rz=01;31:*.tar=01;31:*.taz=01;31:*.tb2=01;31:*.tbz2=01;31:*.tbz=01;31:*.tgz=01;31:*.tlz=01;31:*.trz=01;31:*.txz=01;31:*.tz=01;31:*.tz2=01;31:*.tzst=01;31:*.xz=01;31:*.z=01;31:*.zip=01;31:*.zoo=01;31:*.zst=01;31:*.aac=01;35:*.anx=01;35:*.asf=01;35:*.au=01;35:*.axa=01;35:*.axv=01;35:*.avi=01;35:*.bmp=01;35:*.divx=01;35:*.flac=01;35:*.flv=01;35:*.gif=01;35:*.ico=01;35:*.jpg=01;35:*.jpeg=01;35:*.m2a=01;35:*.m2t=01;35:*.m2v=01;35:*.m4a=01;35:*.m4p=01;35:*.m4v=01;35:*.mid=01;35:*.midi=01;35:*.mka=01;35:*.mkv=01;35:*.mov=01;35:*.mp3=01;35:*.mp4=01;35:*.mp4v=01;35:*.mpc=01;35:*.mpeg=01;35:*.mpg=01;35:*.nuv=01;35:*.oga=01;35:*.ogv=01;35:*.ogx=01;35:*.ogg=01;35:*.opus=01;35:*.pbm=01;35:*.pgm=01;35:*.png=01;35:*.ppm=01;35:*.qt=01;35:*.ra=01;35:*.ram=01;35:*.rm=01;35:*.spx=01;35:*.svg=01;35:*.svgz=01;35:*.tga=01;35:*.tif=01;35:*.tiff=01;35:*.vob=01;35:*.wav=01;35:*.webm=01;35:*.webp=01;35:*.wma=01;35:*.wmv=01;35:*.xbm=01;35:*.xcf=01;35:*.xpm=01;35:*.xspf=01;35:*.xwd=01;35:*.xvid=01;35: XDG_CURRENT_DESKTOP=Hyprland WAYLAND_DISPLAY=wayland-1 XDG_SEAT_PATH=/org/freedesktop/DisplayManager/Seat0 M2_HOME=/usr/share/maven SAVEHIST=999999999 GOROOT=/usr/lib64/go1.22.9/go QT_QPA_PLATFORM=wayland;xcb XDG_SESSION_CLASS=user TERM=foot G_FILENAME_ENCODING=@locale LESSOPEN=|~/.lessfilter %s HIP_CLANG_PATH=/opt/rocm/llvm/bin USER=janemba SDL_VIDEODRIVER=wayland T1LIB_CONFIG=/usr/share/t1lib/t1lib.config GDK_USE_XFT=1 HYPRLAND_INSTANCE_SIGNATURE=918d8340afd652b011b937d29d5eea0be08467f5_1760954688_290007924 DISPLAY=:1 SHLVL=1 MOZ_ENABLE_WAYLAND=1 INPUTRC=/etc/inputrc XDG_VTNR=4 
XDG_SESSION_ID=1 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama:/usr/lib/ollama/cuda_v12:/usr/lib64/zulu-openjdk17/lib/server:/usr/lib64/zulu-openjdk11/lib/server XDG_RUNTIME_DIR=/run/user/1000 PYENV_ROOT=/home/janemba/.pyenv QT5DIR=/usr/lib64/qt5 HYPRLAND_LOG_WLR=1 KDEDIRS=/usr QT_AUTO_SCREEN_SCALE_FACTOR=1 LC_COLLATE=C XDG_DATA_DIRS=/home/janemba/.local/share/flatpak/exports/share:/var/lib/flatpak/exports/share:/usr/local/share:/usr/share VDPAU_LOG=0 GDK_BACKEND=wayland,x11,* PATH=/home/janemba/.pyenv/shims:/home/janemba/.local/bin:/home/janemba/.pyenv/bin:/home/janemba/.cargo/bin:/home/janemba:/.avm/bin:/home/janemba/.local/share/go/bin:/home/janemba/.npm-global/bin:/usr/lib64/zulu-openjdk8/bin:/usr/lib64/zulu-openjdk8/jre/bin:/usr/lib64/zulu-openjdk17/bin:/usr/lib64/zulu-openjdk11/bin:/opt/rocm/bin:/usr/lib64/go1.22.9/go/bin:/usr/local/bin:/usr/bin:/bin:/usr/games:/usr/lib64/libexec/kf5:/usr/lib64/qt5/bin:/usr/lib64/qt6/bin:/home/janemba/.fzf/bin:/home/janemba/.lmstudio/bin SAL_USE_VCLPLUGIN=gtk DBUS_SESSION_BUS_ADDRESS=unix:path=/tmp/dbus-MHRWq7Tuj6,guid=ee724b1cf380d4f9d045d99a68f60940 QT6DIR=/usr/lib64/qt6 OLDPWD=/home/janemba GOPATH=/home/janemba/.local/share/go HYPRCURSOR_SIZE=24 COLORTERM=truecolor P9K_TTY=old _P9K_TTY=/dev/pts/0 ZSH=/home/janemba/.oh-my-zsh PAGER=less LSCOLORS=Gxfxcxdxbxegedabagacad P9K_SSH=0 _P9K_SSH_TTY=/dev/pts/0 EDITOR=emacs PYENV_SHELL=zsh OLLAMA_DEBUG=2 OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v12]" cmd="/usr/bin/ollama runner --ollama-engine --port 40877"
time=2025-10-20T14:13:01.754+04:00 level=INFO source=runner.go:1332 msg="starting ollama engine"
time=2025-10-20T14:13:01.754+04:00 level=INFO source=runner.go:1367 msg="Server listening on 127.0.0.1:40877"
time=2025-10-20T14:13:01.762+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-10-20T14:13:01.762+04:00 level=DEBUG source=gguf.go:578 msg=general.architecture type=string
time=2025-10-20T14:13:01.762+04:00 level=DEBUG source=gguf.go:578 msg=tokenizer.ggml.model type=string
time=2025-10-20T14:13:01.762+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-10-20T14:13:01.762+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
time=2025-10-20T14:13:01.762+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
time=2025-10-20T14:13:01.762+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
time=2025-10-20T14:13:01.762+04:00 level=INFO source=ggml.go:134 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
time=2025-10-20T14:13:01.762+04:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-icelake.so
time=2025-10-20T14:13:01.772+04:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/cuda_v12
time=2025-10-20T14:13:01.773+04:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.AVX512=1 CPU.0.AVX512_VBMI=1 CPU.0.AVX512_VNNI=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
time=2025-10-20T14:13:01.773+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=runner.go:1307 msg="dummy model load took" duration=12.638828ms
time=2025-10-20T14:13:01.774+04:00 level=DEBUG source=runner.go:1312 msg="gathering device infos took" duration=1.352µs
time=2025-10-20T14:13:01.775+04:00 level=TRACE source=runner.go:548 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" devices=[]
time=2025-10-20T14:13:01.775+04:00 level=DEBUG source=runner.go:451 msg="bootstrap discovery took" duration=45.469635ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" extra_envs=[]
time=2025-10-20T14:13:01.775+04:00 level=DEBUG source=runner.go:118 msg="filtering out unsupported or overlapping GPU library combinations" count=0
time=2025-10-20T14:13:01.775+04:00 level=TRACE source=runner.go:171 msg="supported GPU library combinations" supported=map[]
time=2025-10-20T14:13:01.775+04:00 level=DEBUG source=runner.go:45 msg="GPU bootstrap discovery took" duration=92.694648ms
time=2025-10-20T14:13:01.775+04:00 level=INFO source=types.go:129 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" type="" total="30.1 GiB" available="23.9 GiB"
time=2025-10-20T14:13:01.775+04:00 level=INFO source=routes.go:1605 msg="entering low vram mode" "total vram"="0 B" threshold="20.0 GiB"
```

@janemba commented on GitHub (Oct 20, 2025):

Problem solved. Two things were wrong: HSA_OVERRIDE_GFX_VERSION had to be set to 11.0.2, and my re-installation was missing the ollama-linux-amd64-rocm.tgz archive.
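For anyone hitting the same gfx1103 symptom, the fix above can be sketched as a shell session. The override value and archive name come from this thread; the download URL is assumed to follow Ollama's usual release naming, so verify it against the official Linux install instructions before running:

```shell
# gfx1103 (Radeon 780M-class iGPU) has no official ROCm kernel binaries,
# so spoof a supported gfx11 target before starting the server.
export HSA_OVERRIDE_GFX_VERSION=11.0.2

# The ROCm runner ships in a separate archive; without it Ollama only
# finds the CPU and cuda_v* runner directories and falls back to CPU.
# (Commented out here; extracts over the existing /usr install.)
# curl -fsSL https://ollama.com/download/ollama-linux-amd64-rocm.tgz \
#   | sudo tar -C /usr -xz

echo "HSA_OVERRIDE_GFX_VERSION=$HSA_OVERRIDE_GFX_VERSION"
# ollama serve   # the GPU should now appear in the "inference compute" log line
```

The override only needs to be set in the environment of the `ollama serve` process (e.g. in a systemd override file if Ollama runs as a service).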

Reference: github-starred/ollama#8426