[GH-ISSUE #13061] Ollama Linux with Multiple AMD GPUs Fails to Use Any AMD GPU Since 0.12.6 (AMD gfx1030 / AMD 6650 GPU) #55164

Closed
opened 2026-04-29 08:26:06 -05:00 by GiteaMirror · 18 comments
Owner

Originally created by @ganakee on GitHub (Nov 12, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13061

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

Since 0.12.6 (all subsequent versions tested), my system with an AMD 6650 GPU (gfx1030) fails to use the AMD 6650 GPU (dual) and falls back to CPU-only mode.

Several bug reports on Windows etc. describe a similar bug, but those fixes have not resolved the issue for me.

I must revert to 0.12.3 to get the GPU working again; no subsequent version works. Ollama is loaded from systemd.

Tested every RC and final release from 0.12.6 onward; all fail.

System Ubuntu 25.10.

Relevant log output


OS

Ubuntu 25.10

GPU

Dual:
Radeon RX 6650XT
Radeon 680M

CPU

AMD Ryzen 7 6800H

Ollama version

Tried all versions from 0.12.6 onward. Versions up to 0.12.3 work. I can revert to 0.12.3 and it immediately works after a systemd restart of the ollama service.

GiteaMirror added the amd, bug, linux labels 2026-04-29 08:26:07 -05:00

@rick-github commented on GitHub (Nov 12, 2025):

Server log with OLLAMA_DEBUG=2 in the server environment will help in debugging. Note this will include any prompt so only share up to the line that says inference compute.
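
For a systemd install, one way to set this (following the Ollama troubleshooting docs linked above) is a systemd drop-in override; a sketch:

```shell
# Add OLLAMA_DEBUG=2 to the ollama service environment via a systemd drop-in
sudo systemctl edit ollama.service
# In the editor that opens, add:
#   [Service]
#   Environment="OLLAMA_DEBUG=2"
sudo systemctl daemon-reload
sudo systemctl restart ollama
# Capture the log up to the "inference compute" line:
journalctl -u ollama --no-pager > ollama-debug.log
```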


@0xra0 commented on GitHub (Nov 12, 2025):

I had the same problem but the problem disappeared by itself


@ganakee commented on GitHub (Nov 13, 2025):

Thanks @rick-github. I appreciate the helpful link to the logs information.

I just tried the new 0.12.11-rc1. For me, 0.12.11-rc1 still uses CPU only. Reverting to 0.12.3, the GPU is used.

Testing 0.12.11-rc1 and Logs

I just downloaded and installed 0.12.11-rc1, using both the base download and the AMD ROCm support download as usual. My update Bash script handles all the systemd restarts etc.

With 0.12.11-rc1 loaded I executed a prompt (see 10:03:44). nvtop showed no GPU activity.

Using journalctl -u ollama --no-pager --follow --pager-end:

LOG for 0.12.11-rc1

Nov 13 10:03:21 OMEN ollama[6578]: [GIN] 2025/11/13 - 10:03:21 | 200 |    1.022314ms |      172.17.0.2 | GET      "/api/tags"
Nov 13 10:03:21 OMEN ollama[6578]: [GIN] 2025/11/13 - 10:03:21 | 200 |      72.557µs |      172.17.0.2 | GET      "/api/ps"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.732-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --model /usr/share/ollama/.ollama/models/blobs/sha256-38e8dcc30df4eb0e29eaf5c74ba6ce3f2cd66badad50768fc14362acfb8b8cb6 --port 44079"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.732-05:00 level=INFO source=sched.go:443 msg="system memory" total="30.1 GiB" free="26.0 GiB" free_swap="31.7 GiB"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.732-05:00 level=INFO source=server.go:702 msg="loading model" "model layers"=36 requested=-1
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.742-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.742-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:44079"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.744-05:00 level=INFO source=runner.go:1271 msg=load request="{Operation:fit LoraPath:[] Parallel:1 BatchSize:512 FlashAttention:false KvSize:4096 KvCacheType: NumThreads:8 GPULayers:[] MultiUserCache:false ProjectorPath: MainGPU:0 UseMmap:false}"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.799-05:00 level=INFO source=ggml.go:136 msg="" architecture=gemma3n file_type=Q4_K_M name="" description="" num_tensors=847 num_key_values=40
Nov 13 10:03:35 OMEN ollama[6578]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.803-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.824-05:00 level=INFO source=runner.go:1271 msg=load request="{Operation:alloc LoraPath:[] Parallel:1 BatchSize:512 FlashAttention:false KvSize:4096 KvCacheType: NumThreads:8 GPULayers:[] MultiUserCache:false ProjectorPath: MainGPU:0 UseMmap:false}"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=runner.go:1271 msg=load request="{Operation:commit LoraPath:[] Parallel:1 BatchSize:512 FlashAttention:false KvSize:4096 KvCacheType: NumThreads:8 GPULayers:[] MultiUserCache:false ProjectorPath: MainGPU:0 UseMmap:false}"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=device.go:245 msg="model weights" device=CPU size="7.4 GiB"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=device.go:256 msg="kv cache" device=CPU size="64.0 MiB"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=device.go:267 msg="compute graph" device=CPU size="157.5 MiB"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=ggml.go:482 msg="offloading 0 repeating layers to GPU"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=ggml.go:486 msg="offloading output layer to CPU"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=ggml.go:494 msg="offloaded 0/36 layers to GPU"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=device.go:272 msg="total memory" size="7.6 GiB"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=sched.go:517 msg="loaded runners" count=1
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=server.go:1294 msg="waiting for llama runner to start responding"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=server.go:1328 msg="waiting for server to become available" status="llm server loading model"
Nov 13 10:03:40 OMEN ollama[6578]: time=2025-11-13T10:03:40.181-05:00 level=INFO source=server.go:1332 msg="llama runner started in 4.45 seconds"
Nov 13 10:03:44 OMEN ollama[6578]: [GIN] 2025/11/13 - 10:03:44 | 200 |  9.138404728s |      172.17.0.2 | POST     "/api/chat"
Nov 13 10:03:55 OMEN ollama[6578]: [GIN] 2025/11/13 - 10:03:55 | 200 | 11.351919296s |      172.17.0.2 | POST     "/api/chat"
Nov 13 10:04:04 OMEN ollama[6578]: [GIN] 2025/11/13 - 10:04:04 | 200 |  8.339421722s |      172.17.0.2 | POST     "/api/chat"
Nov 13 10:04:11 OMEN ollama[6578]: [GIN] 2025/11/13 - 10:04:11 | 200 |  6.838718432s |      172.17.0.2 | POST     "/api/chat"


REVERTING to 0.12.3 and Testing 0.12.3 and Logs

I reverted to 0.12.3 using both the base ollama download and the AMD ROCm support download, restarted the systemd service, etc.

When I run 0.12.3 and execute the same prompt, nvtop shows immediate activity on (for me) GPU1 (the 6650) and offloading succeeds. Compare:
0.12.11-rc1 NO Offload
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=ggml.go:482 msg="offloading 0 repeating layers to GPU"
0.12.3 Offload
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.253-05:00 level=INFO source=ggml.go:498 msg="offloaded 36/36 layers to GPU"
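
A quick way to spot this difference in a journal capture is to grep for the offload summary line; a sketch using a sample line copied from the logs above (in practice, pipe `journalctl -u ollama --no-pager` into the grep instead):

```shell
# Extract the GPU offload summary from an Ollama server log line.
# 0/N layers means CPU-only; N/N means full offload.
line='time=2025-11-13T10:15:18.253-05:00 level=INFO msg="offloaded 36/36 layers to GPU"'
echo "$line" | grep -oE 'offloaded [0-9]+/[0-9]+ layers to GPU'
```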

I have seen this since 0.12.6, when my prompt execution time suddenly increased (lower token counts due to CPU). I then re-installed older versions and the GPU worked again. @0xra0 suggested this issue went away on its own; I admit I have not simply let any of the new versions sit for a long time.

Logs for REVERTED 0.12.3

Nov 13 10:15:09 OMEN ollama[11302]: [GIN] 2025/11/13 - 10:15:09 | 200 |    3.119915ms |      172.17.0.2 | GET      "/api/tags"
Nov 13 10:15:09 OMEN ollama[11302]: [GIN] 2025/11/13 - 10:15:09 | 200 |      71.906µs |      172.17.0.2 | GET      "/api/ps"
Nov 13 10:15:16 OMEN ollama[11302]: time=2025-11-13T10:15:16.536-05:00 level=INFO source=server.go:399 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --model /usr/share/ollama/.ollama/models/blobs/sha256-38e8dcc30df4eb0e29eaf5c74ba6ce3f2cd66badad50768fc14362acfb8b8cb6 --port 37195"
Nov 13 10:15:16 OMEN ollama[11302]: time=2025-11-13T10:15:16.536-05:00 level=INFO source=server.go:672 msg="loading model" "model layers"=36 requested=-1
Nov 13 10:15:16 OMEN ollama[11302]: time=2025-11-13T10:15:16.537-05:00 level=INFO source=server.go:678 msg="system memory" total="30.1 GiB" free="25.4 GiB" free_swap="31.7 GiB"
Nov 13 10:15:16 OMEN ollama[11302]: time=2025-11-13T10:15:16.537-05:00 level=INFO source=server.go:686 msg="gpu memory" id=0 available="7.5 GiB" free="8.0 GiB" minimum="457.0 MiB" overhead="0 B"
Nov 13 10:15:16 OMEN ollama[11302]: time=2025-11-13T10:15:16.546-05:00 level=INFO source=runner.go:1252 msg="starting ollama engine"
Nov 13 10:15:16 OMEN ollama[11302]: time=2025-11-13T10:15:16.547-05:00 level=INFO source=runner.go:1287 msg="Server listening on 127.0.0.1:37195"
Nov 13 10:15:16 OMEN ollama[11302]: time=2025-11-13T10:15:16.548-05:00 level=INFO source=runner.go:1171 msg=load request="{Operation:fit LoraPath:[] Parallel:1 BatchSize:512 FlashAttention:false KvSize:4096 KvCacheType: NumThreads:8 GPULayers:36[ID:0 Layers:36(0..35)] MultiUserCache:false ProjectorPath: MainGPU:0 UseMmap:false}"
Nov 13 10:15:16 OMEN ollama[11302]: time=2025-11-13T10:15:16.603-05:00 level=INFO source=ggml.go:131 msg="" architecture=gemma3n file_type=Q4_K_M name="" description="" num_tensors=847 num_key_values=40
Nov 13 10:15:17 OMEN ollama[11302]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ:    no
Nov 13 10:15:17 OMEN ollama[11302]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
Nov 13 10:15:17 OMEN ollama[11302]: ggml_cuda_init: found 1 ROCm devices:
Nov 13 10:15:17 OMEN ollama[11302]:   Device 0: AMD Radeon RX 6650M, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 0
Nov 13 10:15:17 OMEN ollama[11302]: load_backend: loaded ROCm backend from /usr/lib/ollama/libggml-hip.so
Nov 13 10:15:17 OMEN ollama[11302]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 10:15:17 OMEN ollama[11302]: time=2025-11-13T10:15:17.985-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 ROCm.0.NO_VMM=1 ROCm.0.PEER_MAX_BATCH_SIZE=128 compiler=cgo(gcc)
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.177-05:00 level=INFO source=runner.go:1171 msg=load request="{Operation:alloc LoraPath:[] Parallel:1 BatchSize:512 FlashAttention:false KvSize:4096 KvCacheType: NumThreads:8 GPULayers:36[ID:0 Layers:36(0..35)] MultiUserCache:false ProjectorPath: MainGPU:0 UseMmap:false}"
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.253-05:00 level=INFO source=runner.go:1171 msg=load request="{Operation:commit LoraPath:[] Parallel:1 BatchSize:512 FlashAttention:false KvSize:4096 KvCacheType: NumThreads:8 GPULayers:36[ID:0 Layers:36(0..35)] MultiUserCache:false ProjectorPath: MainGPU:0 UseMmap:false}"
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.253-05:00 level=INFO source=ggml.go:487 msg="offloading 35 repeating layers to GPU"
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.253-05:00 level=INFO source=ggml.go:493 msg="offloading output layer to GPU"
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.253-05:00 level=INFO source=ggml.go:498 msg="offloaded 36/36 layers to GPU"
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.254-05:00 level=INFO source=backend.go:310 msg="model weights" device=ROCm0 size="7.0 GiB"
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.254-05:00 level=INFO source=backend.go:315 msg="model weights" device=CPU size="420.4 MiB"
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.254-05:00 level=INFO source=backend.go:321 msg="kv cache" device=ROCm0 size="64.0 MiB"
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.254-05:00 level=INFO source=backend.go:332 msg="compute graph" device=ROCm0 size="167.5 MiB"
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.254-05:00 level=INFO source=backend.go:337 msg="compute graph" device=CPU size="4.0 MiB"
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.254-05:00 level=INFO source=backend.go:342 msg="total memory" size="7.7 GiB"
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.254-05:00 level=INFO source=sched.go:470 msg="loaded runners" count=1
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.254-05:00 level=INFO source=server.go:1251 msg="waiting for llama runner to start responding"
Nov 13 10:15:18 OMEN ollama[11302]: time=2025-11-13T10:15:18.254-05:00 level=INFO source=server.go:1285 msg="waiting for server to become available" status="llm server loading model"
Nov 13 10:15:21 OMEN ollama[11302]: time=2025-11-13T10:15:21.514-05:00 level=INFO source=server.go:1289 msg="llama runner started in 4.98 seconds"
Nov 13 10:15:23 OMEN ollama[11302]: [GIN] 2025/11/13 - 10:15:23 | 200 |  7.316268439s |      172.17.0.2 | POST     "/api/chat"
Nov 13 10:15:26 OMEN ollama[11302]: [GIN] 2025/11/13 - 10:15:26 | 200 |  2.896200658s |      172.17.0.2 | POST     "/api/chat"




@rick-github commented on GitHub (Nov 13, 2025):

Please include the full log with OLLAMA_DEBUG=2 from 0.12.11-rc1.


@ganakee commented on GitHub (Nov 13, 2025):

Sorry @rick-github, I added OLLAMA_DEBUG=2. SEE FIXED LOG BELOW IN SECOND POST.

I just re-downloaded 0.12.11-rc1 and set OLLAMA_DEBUG=2. (See below if anyone else needs to do this.)

I re-installed 0.12.11-rc1 with both the ollama base and AMD ROCm support downloads, then restarted the ollama service.

I executed a simple prompt. nvtop showed no GPU activity using 0.12.11-rc1 (that is, it defaulted to CPU).

Because this is a production system, I needed to now revert to 0.12.3.
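
(For anyone else needing to pin a release: assuming the standard Linux install script, which accepts an OLLAMA_VERSION variable, a revert can look like this sketch.)

```shell
# Reinstall a specific Ollama release on Linux (assumes the standard install script,
# which also fetches the matching ROCm component when an AMD GPU is detected)
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.12.3 sh
sudo systemctl restart ollama
```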

I looked at the new log: no offload (see full log below).

SNIP OF LOG

Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=ggml.go:482 msg="offloading 0 repeating layers to GPU"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=ggml.go:486 msg="offloading output layer to CPU"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=ggml.go:494 msg="offloaded 0/36 layers to GPU"

IGNORE Full Log 0.12.11-rc1 on AMD 6650 with OLLAMA_DEBUG=2

Nov 13 14:04:41 OMEN ollama[11302]: [GIN] 2025/11/13 - 14:04:41 | 200 |     966.103µs |   192.168.2.135 | GET      "/api/tags"
Nov 13 14:04:41 OMEN ollama[11302]: [GIN] 2025/11/13 - 14:04:41 | 200 |    9.944613ms |   192.168.2.135 | POST     "/api/show"
Nov 13 14:04:41 OMEN ollama[11302]: [GIN] 2025/11/13 - 14:04:41 | 200 |   16.842178ms |   192.168.2.135 | POST     "/api/show"
Nov 13 14:04:41 OMEN ollama[11302]: [GIN] 2025/11/13 - 14:04:41 | 200 |   50.303875ms |   192.168.2.135 | POST     "/api/show"
Nov 13 14:04:41 OMEN ollama[11302]: [GIN] 2025/11/13 - 14:04:41 | 200 |   55.893164ms |   192.168.2.135 | POST     "/api/show"
Nov 13 14:04:41 OMEN ollama[11302]: [GIN] 2025/11/13 - 14:04:41 | 200 |  103.906841ms |   192.168.2.135 | POST     "/api/show"
Nov 13 14:04:41 OMEN ollama[11302]: [GIN] 2025/11/13 - 14:04:41 | 200 |  101.145549ms |   192.168.2.135 | POST     "/api/show"
Nov 13 14:17:51 OMEN systemd[1]: Stopping ollama.service - Ollama Service...
Nov 13 14:17:51 OMEN systemd[1]: ollama.service: Deactivated successfully.
Nov 13 14:17:51 OMEN systemd[1]: Stopped ollama.service - Ollama Service.
Nov 13 14:17:51 OMEN systemd[1]: ollama.service: Consumed 1min 40.242s CPU time, 11.9G memory peak, 9M memory swap peak.
Nov 13 14:18:25 OMEN systemd[1]: Started ollama.service - Ollama Service.
Nov 13 14:18:25 OMEN ollama[34540]: time=2025-11-13T14:18:25.186-05:00 level=INFO source=routes.go:1544 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GGML_VK_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION:10.3.0 HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:INFO OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/usr/share/ollama/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false OLLAMA_VULKAN:false ROCR_VISIBLE_DEVICES:2,3,1,0 http_proxy: https_proxy: no_proxy:]"
Nov 13 14:18:25 OMEN ollama[34540]: time=2025-11-13T14:18:25.190-05:00 level=INFO source=images.go:522 msg="total blobs: 29"
Nov 13 14:18:25 OMEN ollama[34540]: time=2025-11-13T14:18:25.190-05:00 level=INFO source=images.go:529 msg="total unused blobs removed: 0"
Nov 13 14:18:25 OMEN ollama[34540]: time=2025-11-13T14:18:25.190-05:00 level=INFO source=routes.go:1597 msg="Listening on [::]:11434 (version 0.12.11-rc1)"
Nov 13 14:18:25 OMEN ollama[34540]: time=2025-11-13T14:18:25.191-05:00 level=INFO source=runner.go:67 msg="discovering available GPUs..."
Nov 13 14:18:25 OMEN ollama[34540]: time=2025-11-13T14:18:25.191-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 37601"
Nov 13 14:18:25 OMEN ollama[34540]: time=2025-11-13T14:18:25.210-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 43231"
Nov 13 14:18:27 OMEN ollama[34540]: time=2025-11-13T14:18:27.501-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 35123"
Nov 13 14:18:27 OMEN ollama[34540]: time=2025-11-13T14:18:27.531-05:00 level=INFO source=types.go:60 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" type="" total="30.1 GiB" available="25.5 GiB"
Nov 13 14:18:27 OMEN ollama[34540]: time=2025-11-13T14:18:27.531-05:00 level=INFO source=routes.go:1638 msg="entering low vram mode" "total vram"="0 B" threshold="20.0 GiB"
Nov 13 14:19:21 OMEN ollama[34540]: [GIN] 2025/11/13 - 14:19:21 | 200 |    1.813804ms |      172.17.0.2 | GET      "/api/tags"
Nov 13 14:19:21 OMEN ollama[34540]: [GIN] 2025/11/13 - 14:19:21 | 200 |      71.865µs |      172.17.0.2 | GET      "/api/ps"
Nov 13 14:19:42 OMEN ollama[34540]: time=2025-11-13T14:19:42.584-05:00 level=WARN source=types.go:797 msg="invalid option provided" option=rope_frequency_base
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: loaded meta data with 20 key-value pairs and 363 tensors from /usr/share/ollama/.ollama/models/blobs/sha256-e73cc17c718156e5ad34b119eb363e2c10389a503673f9c36144c42dfde8334c (version GGUF V2)
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   0:                       general.architecture str              = llama
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   1:                               general.name str              = codellama
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   2:                       llama.context_length u32              = 16384
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   3:                     llama.embedding_length u32              = 5120
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   4:                          llama.block_count u32              = 40
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   5:                  llama.feed_forward_length u32              = 13824
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   6:                 llama.rope.dimension_count u32              = 128
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   7:                 llama.attention.head_count u32              = 40
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   8:              llama.attention.head_count_kv u32              = 40
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   9:     llama.attention.layer_norm_rms_epsilon f32              = 0.000010
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  10:                       llama.rope.freq_base f32              = 1000000.000000
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  11:                          general.file_type u32              = 2
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  12:                       tokenizer.ggml.model str              = llama
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  13:                      tokenizer.ggml.tokens arr[str,32016]   = ["<unk>", "<s>", "</s>", "<0x00>", "<...
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  14:                      tokenizer.ggml.scores arr[f32,32016]   = [0.000000, 0.000000, 0.000000, 0.0000...
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  15:                  tokenizer.ggml.token_type arr[i32,32016]   = [2, 3, 3, 6, 6, 6, 6, 6, 6, 6, 6, 6, ...
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  16:                tokenizer.ggml.bos_token_id u32              = 1
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  17:                tokenizer.ggml.eos_token_id u32              = 2
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  18:            tokenizer.ggml.unknown_token_id u32              = 0
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  19:               general.quantization_version u32              = 2
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - type  f32:   81 tensors
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - type q4_0:  281 tensors
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - type q6_K:    1 tensors
Nov 13 14:19:42 OMEN ollama[34540]: print_info: file format = GGUF V2
Nov 13 14:19:42 OMEN ollama[34540]: print_info: file type   = Q4_0
Nov 13 14:19:42 OMEN ollama[34540]: print_info: file size   = 6.86 GiB (4.53 BPW)
Nov 13 14:19:42 OMEN ollama[34540]: load: control-looking token:  32007 '▁<PRE>' was not control-type; this is probably a bug in the model. its type will be overridden
Nov 13 14:19:42 OMEN ollama[34540]: load: control-looking token:  32009 '▁<MID>' was not control-type; this is probably a bug in the model. its type will be overridden
Nov 13 14:19:42 OMEN ollama[34540]: load: control-looking token:  32008 '▁<SUF>' was not control-type; this is probably a bug in the model. its type will be overridden
Nov 13 14:19:42 OMEN ollama[34540]: load: special_eos_id is not in special_eog_ids - the tokenizer config may be incorrect
Nov 13 14:19:42 OMEN ollama[34540]: load: printing all EOG tokens:
Nov 13 14:19:42 OMEN ollama[34540]: load:   - 2 ('</s>')
Nov 13 14:19:42 OMEN ollama[34540]: load: special tokens cache size = 6
Nov 13 14:19:42 OMEN ollama[34540]: load: token to piece cache size = 0.1686 MB
Nov 13 14:19:42 OMEN ollama[34540]: print_info: arch             = llama
Nov 13 14:19:42 OMEN ollama[34540]: print_info: vocab_only       = 1
Nov 13 14:19:42 OMEN ollama[34540]: print_info: model type       = ?B
Nov 13 14:19:42 OMEN ollama[34540]: print_info: model params     = 13.02 B
Nov 13 14:19:42 OMEN ollama[34540]: print_info: general.name     = codellama
Nov 13 14:19:42 OMEN ollama[34540]: print_info: vocab type       = SPM
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_vocab          = 32016
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_merges         = 0
Nov 13 14:19:42 OMEN ollama[34540]: print_info: BOS token        = 1 '<s>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: EOS token        = 2 '</s>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: UNK token        = 0 '<unk>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: LF token         = 13 '<0x0A>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: FIM PRE token    = 32007 '▁<PRE>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: FIM SUF token    = 32008 '▁<SUF>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: FIM MID token    = 32009 '▁<MID>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: EOG token        = 2 '</s>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: max token length = 48
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_load: vocab only - skipping tensors
Nov 13 14:19:42 OMEN ollama[34540]: time=2025-11-13T14:19:42.624-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --model /usr/share/ollama/.ollama/models/blobs/sha256-e73cc17c718156e5ad34b119eb363e2c10389a503673f9c36144c42dfde8334c --port 35883"
Nov 13 14:19:42 OMEN ollama[34540]: time=2025-11-13T14:19:42.624-05:00 level=INFO source=sched.go:443 msg="system memory" total="30.1 GiB" free="25.5 GiB" free_swap="31.7 GiB"
Nov 13 14:19:42 OMEN ollama[34540]: time=2025-11-13T14:19:42.624-05:00 level=INFO source=server.go:459 msg="loading model" "model layers"=41 requested=-1
Nov 13 14:19:42 OMEN ollama[34540]: time=2025-11-13T14:19:42.624-05:00 level=INFO source=device.go:245 msg="model weights" device=CPU size="6.8 GiB"
Nov 13 14:19:42 OMEN ollama[34540]: time=2025-11-13T14:19:42.624-05:00 level=INFO source=device.go:256 msg="kv cache" device=CPU size="3.1 GiB"
Nov 13 14:19:42 OMEN ollama[34540]: time=2025-11-13T14:19:42.624-05:00 level=INFO source=device.go:272 msg="total memory" size="9.9 GiB"
Nov 13 14:19:42 OMEN ollama[34540]: time=2025-11-13T14:19:42.634-05:00 level=INFO source=runner.go:963 msg="starting go runner"
Nov 13 14:19:42 OMEN ollama[34540]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 14:19:42 OMEN ollama[34540]: time=2025-11-13T14:19:42.638-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 14:19:42 OMEN ollama[34540]: time=2025-11-13T14:19:42.638-05:00 level=INFO source=runner.go:999 msg="Server listening on 127.0.0.1:35883"
Nov 13 14:19:42 OMEN ollama[34540]: time=2025-11-13T14:19:42.646-05:00 level=INFO source=runner.go:893 msg=load request="{Operation:commit LoraPath:[] Parallel:1 BatchSize:512 FlashAttention:false KvSize:4096 KvCacheType: NumThreads:8 GPULayers:[] MultiUserCache:false ProjectorPath: MainGPU:0 UseMmap:false}"
Nov 13 14:19:42 OMEN ollama[34540]: time=2025-11-13T14:19:42.646-05:00 level=INFO source=server.go:1294 msg="waiting for llama runner to start responding"
Nov 13 14:19:42 OMEN ollama[34540]: time=2025-11-13T14:19:42.646-05:00 level=INFO source=server.go:1328 msg="waiting for server to become available" status="llm server loading model"
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: loaded meta data with 20 key-value pairs and 363 tensors from /usr/share/ollama/.ollama/models/blobs/sha256-e73cc17c718156e5ad34b119eb363e2c10389a503673f9c36144c42dfde8334c (version GGUF V2)
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   0:                       general.architecture str              = llama
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   1:                               general.name str              = codellama
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   2:                       llama.context_length u32              = 16384
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   3:                     llama.embedding_length u32              = 5120
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   4:                          llama.block_count u32              = 40
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   5:                  llama.feed_forward_length u32              = 13824
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   6:                 llama.rope.dimension_count u32              = 128
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   7:                 llama.attention.head_count u32              = 40
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   8:              llama.attention.head_count_kv u32              = 40
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv   9:     llama.attention.layer_norm_rms_epsilon f32              = 0.000010
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  10:                       llama.rope.freq_base f32              = 1000000.000000
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  11:                          general.file_type u32              = 2
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  12:                       tokenizer.ggml.model str              = llama
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  13:                      tokenizer.ggml.tokens arr[str,32016]   = ["<unk>", "<s>", "</s>", "<0x00>", "<...
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  14:                      tokenizer.ggml.scores arr[f32,32016]   = [0.000000, 0.000000, 0.000000, 0.0000...
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  15:                  tokenizer.ggml.token_type arr[i32,32016]   = [2, 3, 3, 6, 6, 6, 6, 6, 6, 6, 6, 6, ...
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  16:                tokenizer.ggml.bos_token_id u32              = 1
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  17:                tokenizer.ggml.eos_token_id u32              = 2
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  18:            tokenizer.ggml.unknown_token_id u32              = 0
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - kv  19:               general.quantization_version u32              = 2
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - type  f32:   81 tensors
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - type q4_0:  281 tensors
Nov 13 14:19:42 OMEN ollama[34540]: llama_model_loader: - type q6_K:    1 tensors
Nov 13 14:19:42 OMEN ollama[34540]: print_info: file format = GGUF V2
Nov 13 14:19:42 OMEN ollama[34540]: print_info: file type   = Q4_0
Nov 13 14:19:42 OMEN ollama[34540]: print_info: file size   = 6.86 GiB (4.53 BPW)
Nov 13 14:19:42 OMEN ollama[34540]: load: control-looking token:  32007 '▁<PRE>' was not control-type; this is probably a bug in the model. its type will be overridden
Nov 13 14:19:42 OMEN ollama[34540]: load: control-looking token:  32009 '▁<MID>' was not control-type; this is probably a bug in the model. its type will be overridden
Nov 13 14:19:42 OMEN ollama[34540]: load: control-looking token:  32008 '▁<SUF>' was not control-type; this is probably a bug in the model. its type will be overridden
Nov 13 14:19:42 OMEN ollama[34540]: load: special_eos_id is not in special_eog_ids - the tokenizer config may be incorrect
Nov 13 14:19:42 OMEN ollama[34540]: load: printing all EOG tokens:
Nov 13 14:19:42 OMEN ollama[34540]: load:   - 2 ('</s>')
Nov 13 14:19:42 OMEN ollama[34540]: load: special tokens cache size = 6
Nov 13 14:19:42 OMEN ollama[34540]: load: token to piece cache size = 0.1686 MB
Nov 13 14:19:42 OMEN ollama[34540]: print_info: arch             = llama
Nov 13 14:19:42 OMEN ollama[34540]: print_info: vocab_only       = 0
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_ctx_train      = 16384
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_embd           = 5120
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_layer          = 40
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_head           = 40
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_head_kv        = 40
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_rot            = 128
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_swa            = 0
Nov 13 14:19:42 OMEN ollama[34540]: print_info: is_swa_any       = 0
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_embd_head_k    = 128
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_embd_head_v    = 128
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_gqa            = 1
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_embd_k_gqa     = 5120
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_embd_v_gqa     = 5120
Nov 13 14:19:42 OMEN ollama[34540]: print_info: f_norm_eps       = 0.0e+00
Nov 13 14:19:42 OMEN ollama[34540]: print_info: f_norm_rms_eps   = 1.0e-05
Nov 13 14:19:42 OMEN ollama[34540]: print_info: f_clamp_kqv      = 0.0e+00
Nov 13 14:19:42 OMEN ollama[34540]: print_info: f_max_alibi_bias = 0.0e+00
Nov 13 14:19:42 OMEN ollama[34540]: print_info: f_logit_scale    = 0.0e+00
Nov 13 14:19:42 OMEN ollama[34540]: print_info: f_attn_scale     = 0.0e+00
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_ff             = 13824
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_expert         = 0
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_expert_used    = 0
Nov 13 14:19:42 OMEN ollama[34540]: print_info: causal attn      = 1
Nov 13 14:19:42 OMEN ollama[34540]: print_info: pooling type     = 0
Nov 13 14:19:42 OMEN ollama[34540]: print_info: rope type        = 0
Nov 13 14:19:42 OMEN ollama[34540]: print_info: rope scaling     = linear
Nov 13 14:19:42 OMEN ollama[34540]: print_info: freq_base_train  = 1000000.0
Nov 13 14:19:42 OMEN ollama[34540]: print_info: freq_scale_train = 1
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_ctx_orig_yarn  = 16384
Nov 13 14:19:42 OMEN ollama[34540]: print_info: rope_finetuned   = unknown
Nov 13 14:19:42 OMEN ollama[34540]: print_info: model type       = 13B
Nov 13 14:19:42 OMEN ollama[34540]: print_info: model params     = 13.02 B
Nov 13 14:19:42 OMEN ollama[34540]: print_info: general.name     = codellama
Nov 13 14:19:42 OMEN ollama[34540]: print_info: vocab type       = SPM
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_vocab          = 32016
Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_merges         = 0
Nov 13 14:19:42 OMEN ollama[34540]: print_info: BOS token        = 1 '<s>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: EOS token        = 2 '</s>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: UNK token        = 0 '<unk>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: LF token         = 13 '<0x0A>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: FIM PRE token    = 32007 '▁<PRE>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: FIM SUF token    = 32008 '▁<SUF>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: FIM MID token    = 32009 '▁<MID>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: EOG token        = 2 '</s>'
Nov 13 14:19:42 OMEN ollama[34540]: print_info: max token length = 48
Nov 13 14:19:42 OMEN ollama[34540]: load_tensors: loading model tensors, this can take a while... (mmap = false)
Nov 13 14:19:42 OMEN ollama[34540]: load_tensors:          CPU model buffer size =  7024.00 MiB
Nov 13 14:19:47 OMEN ollama[34540]: [GIN] 2025/11/13 - 14:19:47 | 200 |    1.288198ms |      172.17.0.2 | GET      "/api/tags"
Nov 13 14:19:47 OMEN ollama[34540]: [GIN] 2025/11/13 - 14:19:47 | 200 |     146.545µs |      172.17.0.2 | GET      "/api/ps"
Nov 13 14:19:47 OMEN ollama[34540]: llama_context: constructing llama_context
Nov 13 14:19:47 OMEN ollama[34540]: llama_context: n_seq_max     = 1
Nov 13 14:19:47 OMEN ollama[34540]: llama_context: n_ctx         = 4096
Nov 13 14:19:47 OMEN ollama[34540]: llama_context: n_ctx_per_seq = 4096
Nov 13 14:19:47 OMEN ollama[34540]: llama_context: n_batch       = 512
Nov 13 14:19:47 OMEN ollama[34540]: llama_context: n_ubatch      = 512
Nov 13 14:19:47 OMEN ollama[34540]: llama_context: causal_attn   = 1
Nov 13 14:19:47 OMEN ollama[34540]: llama_context: flash_attn    = disabled
Nov 13 14:19:47 OMEN ollama[34540]: llama_context: kv_unified    = false
Nov 13 14:19:47 OMEN ollama[34540]: llama_context: freq_base     = 1000000.0
Nov 13 14:19:47 OMEN ollama[34540]: llama_context: freq_scale    = 1
Nov 13 14:19:47 OMEN ollama[34540]: llama_context: n_ctx_per_seq (4096) < n_ctx_train (16384) -- the full capacity of the model will not be utilized
Nov 13 14:19:47 OMEN ollama[34540]: llama_context:        CPU  output buffer size =     0.14 MiB
Nov 13 14:19:47 OMEN ollama[34540]: llama_kv_cache:        CPU KV buffer size =  3200.00 MiB
Nov 13 14:19:48 OMEN ollama[34540]: llama_kv_cache: size = 3200.00 MiB (  4096 cells,  40 layers,  1/1 seqs), K (f16): 1600.00 MiB, V (f16): 1600.00 MiB
Nov 13 14:19:48 OMEN ollama[34540]: llama_context:        CPU compute buffer size =   388.01 MiB
Nov 13 14:19:48 OMEN ollama[34540]: llama_context: graph nodes  = 1446
Nov 13 14:19:48 OMEN ollama[34540]: llama_context: graph splits = 1
Nov 13 14:19:49 OMEN ollama[34540]: time=2025-11-13T14:19:49.173-05:00 level=INFO source=server.go:1332 msg="llama runner started in 6.55 seconds"
Nov 13 14:19:49 OMEN ollama[34540]: time=2025-11-13T14:19:49.173-05:00 level=INFO source=sched.go:517 msg="loaded runners" count=1
Nov 13 14:19:49 OMEN ollama[34540]: time=2025-11-13T14:19:49.173-05:00 level=INFO source=server.go:1294 msg="waiting for llama runner to start responding"
Nov 13 14:19:49 OMEN ollama[34540]: time=2025-11-13T14:19:49.173-05:00 level=INFO source=server.go:1332 msg="llama runner started in 6.55 seconds"
Nov 13 14:20:10 OMEN ollama[34540]: [GIN] 2025/11/13 - 14:20:10 | 200 | 27.935280844s |      172.17.0.2 | POST     "/api/chat"
Nov 13 14:20:10 OMEN ollama[34540]: time=2025-11-13T14:20:10.529-05:00 level=WARN source=types.go:797 msg="invalid option provided" option=rope_frequency_base
Nov 13 14:20:46 OMEN ollama[34540]: [GIN] 2025/11/13 - 14:20:46 | 200 | 36.265326774s |      172.17.0.2 | POST     "/api/chat"
Nov 13 14:20:46 OMEN ollama[34540]: time=2025-11-13T14:20:46.797-05:00 level=WARN source=types.go:797 msg="invalid option provided" option=rope_frequency_base
Nov 13 14:21:12 OMEN ollama[34540]: [GIN] 2025/11/13 - 14:21:12 | 200 | 25.232106932s |      172.17.0.2 | POST     "/api/chat"
Nov 13 14:21:12 OMEN ollama[34540]: time=2025-11-13T14:21:12.068-05:00 level=WARN source=types.go:797 msg="invalid option provided" option=rope_frequency_base
Nov 13 14:21:30 OMEN ollama[34540]: [GIN] 2025/11/13 - 14:21:30 | 200 | 18.237797884s |      172.17.0.2 | POST     "/api/chat"
~~


Simple instructions for setting OLLAMA_DEBUG=2 in the systemd service on Ubuntu 25.10:

  1. `cd /etc/systemd/system`
  2. `sudo gedit ollama.service`
  3. Look for the `OLLAMA_DEBUG=1` line and change the value to `2`.
  4. Save the file.
  5. `sudo systemctl daemon-reload`
  6. `sudo systemctl restart ollama.service`
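The same edit can be scripted. The sketch below is a hedged illustration, not part of the original report: it performs the substitution on a scratch copy of the unit file (the `/tmp` path and sample `[Service]` contents are placeholders) so it is safe to run anywhere; on a real system you would run the `sed` against `/etc/systemd/system/ollama.service` with `sudo`, then reload and restart as in steps 5-6 above.

```shell
# Work on a scratch copy of the unit file so this is safe to run anywhere;
# on a real system, target /etc/systemd/system/ollama.service with sudo instead.
unit=/tmp/ollama.service.demo
printf '[Service]\nEnvironment="OLLAMA_DEBUG=1"\n' > "$unit"

# Step 3: bump the debug level from 1 to 2 in place.
sed -i 's/OLLAMA_DEBUG=1/OLLAMA_DEBUG=2/' "$unit"
grep OLLAMA_DEBUG "$unit"

# Steps 5-6, on the real system:
#   sudo systemctl daemon-reload
#   sudo systemctl restart ollama.service
```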
<!-- gh-comment-id:3529386759 --> @ganakee commented on GitHub (Nov 13, 2025):

Sorry @rick-github, I added `OLLAMA_DEBUG=2`. **SEE FIXED LOG BELOW IN SECOND POST.**

I just re-downloaded 0.12.11-rc1 and set `OLLAMA_DEBUG=2`. (See below if anyone else needs to do this.) I re-installed 0.12.11-rc1 with both the ollama base and AMD ROCm support, restarted the ollama service, and executed a simple prompt. `nvtop` showed no GPU activity using 0.12.11-rc1 (that is, it defaulted to CPU). _Because this is a production system, I then needed to revert to 0.12.3._

I looked at the new log and saw no offload (see full log below).

SNIP OF LOG

```
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=ggml.go:482 msg="offloading 0 repeating layers to GPU"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=ggml.go:486 msg="offloading output layer to CPU"
Nov 13 10:03:35 OMEN ollama[6578]: time=2025-11-13T10:03:35.916-05:00 level=INFO source=ggml.go:494 msg="offloaded 0/36 layers to GPU"
```

## ~~IGNORE Full Log 0.12.11-rc1 on AMD 6650 with OLLAMA_DEBUG=2~~

```
~~
Nov 13 14:04:41 OMEN ollama[11302]: [GIN] 2025/11/13 - 14:04:41 | 200 | 966.103µs | 192.168.2.135 | GET "/api/tags"
Nov 13 14:04:41 OMEN ollama[11302]: [GIN] 2025/11/13 - 14:04:41 | 200 | 9.944613ms | 192.168.2.135 | POST "/api/show"
Nov 13 14:04:41 OMEN ollama[11302]: [GIN] 2025/11/13 - 14:04:41 | 200 | 16.842178ms | 192.168.2.135 | POST "/api/show"
Nov 13 14:04:41 OMEN ollama[11302]: [GIN] 2025/11/13 - 14:04:41 | 200 | 50.303875ms | 192.168.2.135 | POST "/api/show"
Nov 13 14:04:41 OMEN ollama[11302]: [GIN] 2025/11/13 - 14:04:41 | 200 | 55.893164ms | 192.168.2.135 | POST "/api/show"
Nov 13 14:04:41 OMEN ollama[11302]: [GIN] 2025/11/13 - 14:04:41 | 200 | 103.906841ms | 192.168.2.135 | POST "/api/show"
Nov 13 14:04:41 OMEN ollama[11302]: [GIN] 2025/11/13 - 14:04:41 | 200 | 101.145549ms | 192.168.2.135 | POST "/api/show"
Nov 13 14:17:51 OMEN systemd[1]: Stopping
ollama.service - Ollama Service...
Nov 13 14:17:51 OMEN systemd[1]: ollama.service: Deactivated successfully.
Nov 13 14:17:51 OMEN systemd[1]: Stopped ollama.service - Ollama Service.
Nov 13 14:17:51 OMEN systemd[1]: ollama.service: Consumed 1min 40.242s CPU time, 11.9G memory peak, 9M memory swap peak.
Nov 13 14:18:25 OMEN systemd[1]: Started ollama.service - Ollama Service.
its type will be overridden Nov 13 14:19:42 OMEN ollama[34540]: load: special_eos_id is not in special_eog_ids - the tokenizer config may be incorrect Nov 13 14:19:42 OMEN ollama[34540]: load: printing all EOG tokens: Nov 13 14:19:42 OMEN ollama[34540]: load: - 2 ('</s>') Nov 13 14:19:42 OMEN ollama[34540]: load: special tokens cache size = 6 Nov 13 14:19:42 OMEN ollama[34540]: load: token to piece cache size = 0.1686 MB Nov 13 14:19:42 OMEN ollama[34540]: print_info: arch = llama Nov 13 14:19:42 OMEN ollama[34540]: print_info: vocab_only = 0 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_ctx_train = 16384 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_embd = 5120 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_layer = 40 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_head = 40 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_head_kv = 40 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_rot = 128 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_swa = 0 Nov 13 14:19:42 OMEN ollama[34540]: print_info: is_swa_any = 0 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_embd_head_k = 128 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_embd_head_v = 128 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_gqa = 1 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_embd_k_gqa = 5120 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_embd_v_gqa = 5120 Nov 13 14:19:42 OMEN ollama[34540]: print_info: f_norm_eps = 0.0e+00 Nov 13 14:19:42 OMEN ollama[34540]: print_info: f_norm_rms_eps = 1.0e-05 Nov 13 14:19:42 OMEN ollama[34540]: print_info: f_clamp_kqv = 0.0e+00 Nov 13 14:19:42 OMEN ollama[34540]: print_info: f_max_alibi_bias = 0.0e+00 Nov 13 14:19:42 OMEN ollama[34540]: print_info: f_logit_scale = 0.0e+00 Nov 13 14:19:42 OMEN ollama[34540]: print_info: f_attn_scale = 0.0e+00 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_ff = 13824 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_expert = 0 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_expert_used = 0 Nov 13 
14:19:42 OMEN ollama[34540]: print_info: causal attn = 1 Nov 13 14:19:42 OMEN ollama[34540]: print_info: pooling type = 0 Nov 13 14:19:42 OMEN ollama[34540]: print_info: rope type = 0 Nov 13 14:19:42 OMEN ollama[34540]: print_info: rope scaling = linear Nov 13 14:19:42 OMEN ollama[34540]: print_info: freq_base_train = 1000000.0 Nov 13 14:19:42 OMEN ollama[34540]: print_info: freq_scale_train = 1 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_ctx_orig_yarn = 16384 Nov 13 14:19:42 OMEN ollama[34540]: print_info: rope_finetuned = unknown Nov 13 14:19:42 OMEN ollama[34540]: print_info: model type = 13B Nov 13 14:19:42 OMEN ollama[34540]: print_info: model params = 13.02 B Nov 13 14:19:42 OMEN ollama[34540]: print_info: general.name = codellama Nov 13 14:19:42 OMEN ollama[34540]: print_info: vocab type = SPM Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_vocab = 32016 Nov 13 14:19:42 OMEN ollama[34540]: print_info: n_merges = 0 Nov 13 14:19:42 OMEN ollama[34540]: print_info: BOS token = 1 '<s>' Nov 13 14:19:42 OMEN ollama[34540]: print_info: EOS token = 2 '</s>' Nov 13 14:19:42 OMEN ollama[34540]: print_info: UNK token = 0 '<unk>' Nov 13 14:19:42 OMEN ollama[34540]: print_info: LF token = 13 '<0x0A>' Nov 13 14:19:42 OMEN ollama[34540]: print_info: FIM PRE token = 32007 '▁<PRE>' Nov 13 14:19:42 OMEN ollama[34540]: print_info: FIM SUF token = 32008 '▁<SUF>' Nov 13 14:19:42 OMEN ollama[34540]: print_info: FIM MID token = 32009 '▁<MID>' Nov 13 14:19:42 OMEN ollama[34540]: print_info: EOG token = 2 '</s>' Nov 13 14:19:42 OMEN ollama[34540]: print_info: max token length = 48 Nov 13 14:19:42 OMEN ollama[34540]: load_tensors: loading model tensors, this can take a while... 
(mmap = false) Nov 13 14:19:42 OMEN ollama[34540]: load_tensors: CPU model buffer size = 7024.00 MiB Nov 13 14:19:47 OMEN ollama[34540]: [GIN] 2025/11/13 - 14:19:47 | 200 | 1.288198ms | 172.17.0.2 | GET "/api/tags" Nov 13 14:19:47 OMEN ollama[34540]: [GIN] 2025/11/13 - 14:19:47 | 200 | 146.545µs | 172.17.0.2 | GET "/api/ps" Nov 13 14:19:47 OMEN ollama[34540]: llama_context: constructing llama_context Nov 13 14:19:47 OMEN ollama[34540]: llama_context: n_seq_max = 1 Nov 13 14:19:47 OMEN ollama[34540]: llama_context: n_ctx = 4096 Nov 13 14:19:47 OMEN ollama[34540]: llama_context: n_ctx_per_seq = 4096 Nov 13 14:19:47 OMEN ollama[34540]: llama_context: n_batch = 512 Nov 13 14:19:47 OMEN ollama[34540]: llama_context: n_ubatch = 512 Nov 13 14:19:47 OMEN ollama[34540]: llama_context: causal_attn = 1 Nov 13 14:19:47 OMEN ollama[34540]: llama_context: flash_attn = disabled Nov 13 14:19:47 OMEN ollama[34540]: llama_context: kv_unified = false Nov 13 14:19:47 OMEN ollama[34540]: llama_context: freq_base = 1000000.0 Nov 13 14:19:47 OMEN ollama[34540]: llama_context: freq_scale = 1 Nov 13 14:19:47 OMEN ollama[34540]: llama_context: n_ctx_per_seq (4096) < n_ctx_train (16384) -- the full capacity of the model will not be utilized Nov 13 14:19:47 OMEN ollama[34540]: llama_context: CPU output buffer size = 0.14 MiB Nov 13 14:19:47 OMEN ollama[34540]: llama_kv_cache: CPU KV buffer size = 3200.00 MiB Nov 13 14:19:48 OMEN ollama[34540]: llama_kv_cache: size = 3200.00 MiB ( 4096 cells, 40 layers, 1/1 seqs), K (f16): 1600.00 MiB, V (f16): 1600.00 MiB Nov 13 14:19:48 OMEN ollama[34540]: llama_context: CPU compute buffer size = 388.01 MiB Nov 13 14:19:48 OMEN ollama[34540]: llama_context: graph nodes = 1446 Nov 13 14:19:48 OMEN ollama[34540]: llama_context: graph splits = 1 Nov 13 14:19:49 OMEN ollama[34540]: time=2025-11-13T14:19:49.173-05:00 level=INFO source=server.go:1332 msg="llama runner started in 6.55 seconds" Nov 13 14:19:49 OMEN ollama[34540]: time=2025-11-13T14:19:49.173-05:00 
level=INFO source=sched.go:517 msg="loaded runners" count=1 Nov 13 14:19:49 OMEN ollama[34540]: time=2025-11-13T14:19:49.173-05:00 level=INFO source=server.go:1294 msg="waiting for llama runner to start responding" Nov 13 14:19:49 OMEN ollama[34540]: time=2025-11-13T14:19:49.173-05:00 level=INFO source=server.go:1332 msg="llama runner started in 6.55 seconds" Nov 13 14:20:10 OMEN ollama[34540]: [GIN] 2025/11/13 - 14:20:10 | 200 | 27.935280844s | 172.17.0.2 | POST "/api/chat" Nov 13 14:20:10 OMEN ollama[34540]: time=2025-11-13T14:20:10.529-05:00 level=WARN source=types.go:797 msg="invalid option provided" option=rope_frequency_base Nov 13 14:20:46 OMEN ollama[34540]: [GIN] 2025/11/13 - 14:20:46 | 200 | 36.265326774s | 172.17.0.2 | POST "/api/chat" Nov 13 14:20:46 OMEN ollama[34540]: time=2025-11-13T14:20:46.797-05:00 level=WARN source=types.go:797 msg="invalid option provided" option=rope_frequency_base Nov 13 14:21:12 OMEN ollama[34540]: [GIN] 2025/11/13 - 14:21:12 | 200 | 25.232106932s | 172.17.0.2 | POST "/api/chat" Nov 13 14:21:12 OMEN ollama[34540]: time=2025-11-13T14:21:12.068-05:00 level=WARN source=types.go:797 msg="invalid option provided" option=rope_frequency_base Nov 13 14:21:30 OMEN ollama[34540]: [GIN] 2025/11/13 - 14:21:30 | 200 | 18.237797884s | 172.17.0.2 | POST "/api/chat" ~~ ``` ## Simple Instructions for setting OLLAMA_DEBUG=2 in Systemd Service on Ubuntu 25.10 1. `cd /etc/systemd/system` 2. `sudo gedit ollama.service` 3. look for the `OLLAMA_DEBUG=1` line and change to 2 4. save 5. `sudo systemctl daemon-reload` 6. `sudo systemctl restart ollama.service`
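The same setting can also be applied as a systemd drop-in instead of editing the packaged unit file in place; this keeps the change separate from `ollama.service` so it survives package upgrades. A sketch (the drop-in path and file name follow the standard systemd convention and are not taken from this issue):

```ini
# /etc/systemd/system/ollama.service.d/override.conf
# systemd creates this path for you if you run: sudo systemctl edit ollama.service
[Service]
Environment="OLLAMA_DEBUG=2"
```

After saving the drop-in, run `sudo systemctl daemon-reload` and `sudo systemctl restart ollama.service` as in the steps above.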

@ganakee commented on GitHub (Nov 13, 2025):

Use the instructions above first. (My stupid error: I had changed the value to 2, but that line was commented out, and I did not carefully note that.)

FIXED: full log from 0.12.11-rc1 on the AMD 6650 with `OLLAMA_DEBUG=2`


journalctl -u ollama --no-pager --follow --pager-end
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=245 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=246 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=247 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=248 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=249 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=250 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=251 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=252 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=253 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=254 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=255 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=256 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=257 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=258 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=259 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=260 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=261 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=262 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=263 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=264 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=265 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=266 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=267 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=268 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=269 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=270 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=271 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=272 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=273 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=274 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=275 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=276 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=277 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=278 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=279 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=280 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=281 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=282 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=283 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=284 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=285 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=286 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=287 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=288 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=289 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=290 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=291 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=292 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=293 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=294 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=295 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=296 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=297 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=298 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=299 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=300 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=301 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=302 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=303 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=304 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=305 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=306 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=307 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=308 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=309 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=310 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=311 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=312 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=313 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=314 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=315 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=316 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=317 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=318 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=319 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=320 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=321 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=322 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=323 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=324 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=325 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=326 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=327 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=328 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=329 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=330 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=331 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=332 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=333 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=334 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=335 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=336 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=337 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=338 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=339 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=340 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=341 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=342 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=343 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=344 len(seq.inputs)=495
[... 149 near-identical TRACE lines elided: msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0, with i+1 counting 345 through 494, len(seq.inputs)=495 ...]
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=495 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.524-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=65
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.524-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=65
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.524-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=65
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.524-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=65
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.524-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=65 id=66
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.524-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=66 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.526-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=66
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.526-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=66
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.271-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=65
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.271-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=65
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.271-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=65 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.271-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[2717] string=```
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.272-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=65
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.272-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=66
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.272-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=66
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.272-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=66 id=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.272-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=67 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.274-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.274-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.342-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=66
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.342-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=66
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.342-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=66 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.342-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3723] string=json
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.342-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=66
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.342-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.342-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.343-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=67 id=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.343-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=68 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.344-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.344-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.410-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.410-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.410-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=67 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.411-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[107] string="\n"
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.411-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.411-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.411-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.411-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=68 id=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.411-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=69 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.412-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.412-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.478-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.478-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.478-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=68 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.479-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[236782] string={
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.479-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.479-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.479-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.479-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=69 id=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.479-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=70 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.480-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.480-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.547-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.547-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.547-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=69 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.548-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[107] string="\n"
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.548-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.548-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.548-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.548-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=70 id=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.548-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=71 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.549-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.549-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.615-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.616-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.616-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=70 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.617-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[236775] string="\""
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.617-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.617-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.617-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.617-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=71 id=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.617-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=72 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.618-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.618-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.684-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.684-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.684-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=71 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.685-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[25162] string=follow
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.685-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.685-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.685-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.685-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=72 id=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.685-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=73 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.687-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.687-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.753-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.753-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.753-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=72 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.754-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[236779] string=_
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.755-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.755-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.755-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.755-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=73 id=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.755-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=74 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.756-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.756-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.825-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.825-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.825-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=73 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.830-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[11303] string=ups
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.830-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.830-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.830-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.830-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=74 id=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.830-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=75 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.832-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.832-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.898-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.898-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.898-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=74 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.898-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1083] string="\":"
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.898-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.898-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.899-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.899-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=75 id=76
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.899-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=76 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.900-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=76
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.900-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=76
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.966-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.966-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.966-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=75 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.967-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[11058] string=" [\""
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.967-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.967-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=76
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.967-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=76
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.967-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=76 id=77
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.967-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=77 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.968-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=77
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.968-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=77
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=76
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=76
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=76 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[8574] string=Can
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=76
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=77
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=77
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=77 id=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=78 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.037-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.037-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.102-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=77
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.102-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=77
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.102-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=77 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.103-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[611] string=" you"
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.103-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=77
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.103-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.103-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.103-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=78 id=79
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.103-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=79 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.104-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=79
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.104-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=79
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.170-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.170-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.170-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=78 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.170-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1386] string=" make"
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.170-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.171-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=79
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.171-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=79
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.171-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=79 id=80
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.171-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=80 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.172-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=80
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.172-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=80
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.238-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=79
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.238-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=79
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.238-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=79 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.239-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[506] string=" the"
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.239-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=79
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.239-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=80
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.239-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=80
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.239-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=80 id=81
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.239-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=81 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.241-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=81
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.241-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=81
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.305-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=80
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.305-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=80
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.305-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=80 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.307-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[16273] string=" autumn"
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.307-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=80
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.307-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=81
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.307-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=81
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.307-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=81 id=82
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.307-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=82 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.309-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=82
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.309-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=82
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.375-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=81
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.375-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=81
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.375-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=81 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.377-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3925] string=" story"
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.377-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=81
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.377-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=82
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.377-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=82
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.377-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=82 id=83
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.377-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=83 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.378-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=83
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.379-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=83
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.444-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=82
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.444-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=82
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.444-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=82 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.445-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[496] string=" a"
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.445-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=82
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.445-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=83
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.446-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=83
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.446-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=83 id=84
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.446-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=84 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.447-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=84
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.447-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=84
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.514-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=83
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.514-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=83
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.514-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=83 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.515-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[2268] string=" little"
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.515-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=83
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.515-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=84
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.515-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=84
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.515-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=84 id=85
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.515-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=85 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.516-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=85
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.516-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=85
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.582-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=84
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.582-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=84
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.582-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=84 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.583-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[4890] string=" longer"
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.583-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=84
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.584-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=85
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.584-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=85
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.584-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=85 id=86
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.584-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=86 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.585-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=86
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.585-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=86
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.651-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=85
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.651-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=85
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.651-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=85 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.653-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[32109] string="?\","
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.653-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=85
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.653-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=86
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.653-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=86
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.653-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=86 id=87
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.653-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=87 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.655-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=87
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.656-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=87
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.721-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=86
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.721-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=86
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.721-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=86 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.722-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[623] string=" \""
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.722-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=86
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.722-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=87
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.722-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=87
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.722-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=87 id=88
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.722-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=88 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.723-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=88
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.723-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=88
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.791-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=87
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.791-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=87
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.791-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=87 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.792-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3689] string=What
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.792-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=87
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.792-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=88
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.792-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=88
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.792-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=88 id=89
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.792-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=89 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.794-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=89
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.794-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=89
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.860-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=88
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.860-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=88
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.860-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=88 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.861-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1032] string=" other"
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.861-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=88
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.861-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=89
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.861-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=89
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.861-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=89 id=90
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.861-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=90 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.862-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=90
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.862-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=90
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.930-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=89
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.930-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=89
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.930-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=89 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.931-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[8511] string=" stories"
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.931-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=89
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.931-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=90
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.931-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=90
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.931-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=90 id=91
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.931-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=91 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.932-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=91
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.932-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=91
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.998-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=90
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.998-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=90
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.998-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=90 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.999-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[776] string=" do"
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.999-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=90
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.999-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=91
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.999-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=91
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.999-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=91 id=92
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.999-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=92 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.000-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=92
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.000-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=92
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.065-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=91
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=91
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=91 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[611] string=" you"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=91
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=92
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=92
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=92 id=93
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=93 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.067-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=93
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.067-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=93
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.133-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=92
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.133-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=92
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.133-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=92 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.134-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[735] string=" have"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.134-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=92
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.134-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=93
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.134-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=93
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.134-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=93 id=94
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.134-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=94 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.135-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=94
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.135-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=94
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.201-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=93
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.201-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=93
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.201-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=93 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.202-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[32109] string="?\","
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.202-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=93
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.202-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=94
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.202-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=94
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.202-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=94 id=95
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.202-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=95 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.203-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=95
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.203-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=95
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.269-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=94
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.269-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=94
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.269-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=94 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.270-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[623] string=" \""
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.270-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=94
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.270-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=95
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.270-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=95
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.270-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=95 id=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.270-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=96 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.271-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.271-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=95
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=95
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=95 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[30092] string=Could
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=95
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=96 id=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=97 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.339-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.339-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.406-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.406-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.406-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=96 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.407-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[611] string=" you"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.407-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.407-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.407-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.407-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=97 id=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.407-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=98 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.408-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.408-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.475-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.475-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.475-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=97 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.477-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[4903] string=" write"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.477-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.477-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.477-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.477-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=98 id=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.477-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=99 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.478-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.478-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.544-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.544-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.544-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=98 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.545-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[496] string=" a"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.545-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.545-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.545-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.545-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=99 id=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.545-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=100 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.546-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.546-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.611-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.611-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.611-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=99 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.612-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3925] string=" story"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.612-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.612-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.612-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.612-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=100 id=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.612-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=101 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.614-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.614-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.679-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.679-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.679-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=100 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.680-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1003] string=" about"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.680-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.680-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.680-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.680-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=101 id=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.680-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=102 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.681-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.681-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=101 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[496] string=" a"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=102 id=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=103 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.749-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.749-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.815-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.815-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.815-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=102 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.816-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1607] string=" different"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.816-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.816-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.816-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.816-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=103 id=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.816-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=104 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.818-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.818-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=103 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3409] string=" season"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=104 id=105
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=105 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.884-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=105
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.884-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=105
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=104 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[32109] string="?\","
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=105
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=105
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=105 id=106
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=106 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.951-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=106
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.951-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=106
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.017-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=105
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.017-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=105
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.017-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=105 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.018-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[623] string=" \""
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.018-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=105
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.018-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=106
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.018-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=106
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.018-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=106 id=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.018-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=107 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.019-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.019-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.085-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=106
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.085-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=106
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.085-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=106 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.086-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[8574] string=Can
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.086-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=106
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.086-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.086-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.086-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=107 id=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.086-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=108 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.087-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.087-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.154-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.154-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.154-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=107 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.154-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[611] string=" you"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.154-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.154-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.154-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.155-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=108 id=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.155-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=109 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.155-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.156-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.222-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.222-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.222-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=108 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.223-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[2352] string=" change"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.223-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.223-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.223-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.223-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=109 id=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.223-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=110 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.224-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.224-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=109 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[506] string=" the"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=110 id=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=111 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.293-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.293-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.358-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.358-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.358-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=110 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.359-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3925] string=" story"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.359-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.359-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.359-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.359-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=111 id=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.359-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=112 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.360-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.360-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.426-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.426-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.426-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=111 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.427-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[531] string=" to"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.427-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.427-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.427-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.427-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=112 id=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.427-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=113 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.429-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.429-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.495-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.495-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.495-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=112 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.496-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[577] string=" be"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.496-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.496-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.496-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.496-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=113 id=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.496-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=114 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.497-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.497-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.563-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.563-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.563-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=113 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.564-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1003] string=" about"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.564-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.564-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.564-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.564-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=114 id=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.564-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=115 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.565-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.565-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.631-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.631-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.631-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=114 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.632-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[8075] string=" animals"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.632-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.632-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.632-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.632-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=115 id=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.632-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=116 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.633-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.633-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.699-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.699-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.699-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=115 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.699-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[32109] string="?\","
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.699-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.699-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.700-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.700-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=116 id=117
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.700-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=117 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.701-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=117
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.701-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=117
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.767-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.767-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.767-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=116 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.768-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[623] string=" \""
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.768-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.768-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=117
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.768-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=117
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.768-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=117 id=118
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.768-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=118 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.769-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=118
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.769-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=118
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.838-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=117
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.838-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=117
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.838-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=117 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.839-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3689] string=What
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.839-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=117
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.839-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=118
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.839-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=118
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.839-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=118 id=119
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.839-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=119 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.840-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=119
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.840-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=119
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.908-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=118
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.908-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=118
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.908-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=118 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.909-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[659] string=" are"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.909-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=118
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.909-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=119
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.909-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=119
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.909-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=119 id=120
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.909-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=120 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.911-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=120
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.911-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=120
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.975-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=119
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.975-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=119
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.975-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=119 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.976-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1070] string=" some"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.976-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=119
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.976-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=120
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.976-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=120
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.976-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=120 id=121
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.976-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=121 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.977-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=121
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.977-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=121
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.043-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=120
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.043-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=120
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.043-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=120 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.044-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[529] string=" of"
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.044-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=120
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.044-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=121
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.044-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=121
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.044-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=121 id=122
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.044-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=122 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.045-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=122
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.045-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=122
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.111-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=121
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.111-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=121
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.111-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=121 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.112-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[822] string=" your"
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.112-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=121
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.112-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=122
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.112-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=122
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.112-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=122 id=123
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.112-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=123 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.116-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=123
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.116-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=123
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=122
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=122
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=122 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[8126] string=" favorite"
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=122
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=123
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=123
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=123 id=124
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=124 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.182-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=124
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.182-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=124
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.248-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=123
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.248-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=123
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.248-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=123 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.249-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3925] string=" story"
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.249-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=123
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.249-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=124
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.249-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=124
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.249-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=124 id=125
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.249-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=125 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.250-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=125
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.250-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=125
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.316-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=124
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.316-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=124
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.316-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=124 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.317-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[62571] string=" prompts"
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.317-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=124
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.317-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=125
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.317-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=125
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.317-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=125 id=126
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.317-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=126 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.318-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=126
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.318-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=126
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=125
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=125
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=125 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[236881] string=?
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=125
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=126
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=126
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=126 id=127
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=127 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.386-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=127
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.387-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=127
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=126
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=126
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=126 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1935] string="\"]"
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=126
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=127
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=127
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=127 id=128
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=128 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.454-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=128
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.454-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=128
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.522-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=127
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.522-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=127
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.522-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=127 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.523-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[107] string="\n"
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.523-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=127
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.523-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=128
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.523-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=128
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.523-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=128 id=129
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.523-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=129 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.524-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=129
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.524-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=129
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.590-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=128
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.590-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=128
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.590-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=128 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.591-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[236783] string=}
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.591-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=128
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.591-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=129
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.591-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=129
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.591-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=129 id=130
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.591-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=130 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.592-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=130
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.592-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=130
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=129
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=129
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=129 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[107] string="\n"
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=129
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=130
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=130
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=130 id=131
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=131 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.660-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=131
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.660-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=131
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=130
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=130
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=130 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[2717] string=```
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=130
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=131
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=131
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=131 id=132
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=132 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.727-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=132
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.727-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=132
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.793-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=131
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.793-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=131
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.793-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=131 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=TRACE source=runner.go:773 msg="computeBatch: EOS" batchID=131 seqIdx=0
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=131
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=132
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=132
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=132 id=133
Nov 13 14:37:49 OMEN ollama[38785]: [GIN] 2025/11/13 - 14:37:49 | 200 | 13.529439634s |      172.17.0.2 | POST     "/api/chat"
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=DEBUG source=sched.go:385 msg="context for request finished" runner.name=registry.ollama.ai/library/gemma3n:latest runner.size="7.6 GiB" runner.vram="0 B" runner.parallel=1 runner.pid=38937 runner.model=/usr/share/ollama/.ollama/models/blobs/sha256-38e8dcc30df4eb0e29eaf5c74ba6ce3f2cd66badad50768fc14362acfb8b8cb6 runner.num_ctx=4096
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=DEBUG source=sched.go:290 msg="runner with non-zero duration has gone idle, adding timer" runner.name=registry.ollama.ai/library/gemma3n:latest runner.size="7.6 GiB" runner.vram="0 B" runner.parallel=1 runner.pid=38937 runner.model=/usr/share/ollama/.ollama/models/blobs/sha256-38e8dcc30df4eb0e29eaf5c74ba6ce3f2cd66badad50768fc14362acfb8b8cb6 runner.num_ctx=4096 duration=5m0s
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=DEBUG source=sched.go:308 msg="after processing request finished event" runner.name=registry.ollama.ai/library/gemma3n:latest runner.size="7.6 GiB" runner.vram="0 B" runner.parallel=1 runner.pid=38937 runner.model=/usr/share/ollama/.ollama/models/blobs/sha256-38e8dcc30df4eb0e29eaf5c74ba6ce3f2cd66badad50768fc14362acfb8b8cb6 runner.num_ctx=4096 refCount=0
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.861-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=132
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.861-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=132
Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.861-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=132




<!-- gh-comment-id:3529425999 -->

@ganakee commented on GitHub (Nov 13, 2025):

**Use first.** _(My stupid error. I changed to 2. But that line was commented, and I did not carefully note that.)_

# FIXED

Full Log 0.12.11-rc1 on AMD 6650 with OLLAMA_DEBUG=2

```
journalctl -u ollama --no-pager --follow --pager-end
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=245 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=246 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=247 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=248 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=249 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=250 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=251 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=252 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=253 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=254 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=255 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=256 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=257 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=258 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=259 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=260 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=261 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=262 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=263 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=264 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=265 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=266 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=267 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=268 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=269 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=270 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=271 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=272 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=273 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=274 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=275 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=276 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=277 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=278 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=279 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=280 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=281 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=282 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=283 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=284 len(seq.inputs)=495
Nov 13 14:37:36 OMEN
ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=285 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=286 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=287 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=288 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=289 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=290 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=291 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=292 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=293 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=294 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 
msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=295 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=296 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=297 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=298 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=299 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=300 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=301 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=302 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=303 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=304 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=305 len(seq.inputs)=495 
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=306 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=307 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=308 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=309 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.520-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=310 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=311 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=312 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=313 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=314 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=315 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE 
source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=316 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=317 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=318 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=319 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=320 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=321 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=322 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=323 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=324 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=325 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=326 
len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=327 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=328 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=329 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=330 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=331 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=332 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=333 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=334 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=335 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=336 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: 
time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=337 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=338 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=339 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=340 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=341 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=342 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=343 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=344 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=345 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=346 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch 
iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=347 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=348 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=349 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=350 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=351 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=352 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=353 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=354 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=355 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=356 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=357 len(seq.inputs)=495 Nov 13 14:37:36 OMEN 
ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=358 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=359 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=360 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=361 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=362 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=363 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=364 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=365 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=366 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=367 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 
msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=368 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=369 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=370 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=371 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=372 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=373 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=374 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=375 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=376 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=377 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=378 len(seq.inputs)=495 
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=379 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=380 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=381 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=382 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=383 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=384 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=385 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=386 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=387 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=388 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE 
source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=389 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=390 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=391 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=392 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=393 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=394 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=395 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=396 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=397 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=398 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=399 
len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=400 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=401 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=402 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=403 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=404 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=405 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=406 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=407 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=408 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=409 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: 
time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=410 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=411 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=412 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=413 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=414 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=415 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=416 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=417 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=418 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=419 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch 
iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=420 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=421 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=422 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=423 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=424 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=425 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=426 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=427 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=428 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=429 len(seq.inputs)=495 Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=430 len(seq.inputs)=495 Nov 13 14:37:36 OMEN 
ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=431 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=432 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=433 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=434 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=435 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=436 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=437 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=438 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=439 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=440 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=441 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=442 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=443 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=444 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=445 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=446 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=447 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=448 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=449 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=450 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=451 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=452 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=453 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=454 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=455 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=456 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=457 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=458 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=459 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=460 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=461 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=462 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=463 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=464 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=465 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=466 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=467 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=468 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=469 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=470 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=471 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=472 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=473 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=474 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=475 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=476 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=477 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=478 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=479 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=480 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=481 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=482 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=483 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=484 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=485 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=486 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=487 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=488 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=489 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=490 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=491 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=492 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=493 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=494 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.521-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=65 seqIdx=0 seq.iBatch=0 i+1=495 len(seq.inputs)=495
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.524-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=65
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.524-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=65
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.524-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=65
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.524-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=65
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.524-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=65 id=66
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.524-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=66 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.526-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=66
Nov 13 14:37:36 OMEN ollama[38785]: time=2025-11-13T14:37:36.526-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=66
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.271-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=65
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.271-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=65
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.271-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=65 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.271-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[2717] string=```
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.272-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=65
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.272-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=66
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.272-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=66
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.272-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=66 id=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.272-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=67 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.274-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.274-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.342-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=66
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.342-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=66
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.342-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=66 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.342-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3723] string=json
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.342-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=66
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.342-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.342-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.343-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=67 id=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.343-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=68 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.344-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.344-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.410-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.410-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.410-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=67 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.411-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[107] string="\n"
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.411-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=67
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.411-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.411-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.411-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=68 id=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.411-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=69 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.412-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.412-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.478-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.478-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.478-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=68 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.479-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[236782] string={
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.479-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=68
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.479-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.479-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.479-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=69 id=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.479-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=70 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.480-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.480-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.547-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.547-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.547-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=69 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.548-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[107] string="\n"
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.548-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=69
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.548-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.548-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.548-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=70 id=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.548-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=71 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.549-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.549-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.615-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.616-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.616-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=70 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.617-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[236775] string="\""
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.617-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=70
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.617-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.617-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.617-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=71 id=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.617-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=72 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.618-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.618-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.684-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.684-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.684-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=71 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.685-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[25162] string=follow
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.685-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=71
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.685-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.685-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.685-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=72 id=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.685-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=73 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.687-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.687-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.753-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.753-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.753-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=72 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.754-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[236779] string=_
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.755-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=72
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.755-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.755-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.755-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=73 id=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.755-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=74 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.756-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.756-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.825-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.825-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.825-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=73 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.830-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[11303] string=ups
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.830-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=73
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.830-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.830-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.830-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=74 id=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.830-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=75 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.832-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.832-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.898-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.898-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.898-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=74 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.898-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1083] string="\":"
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.898-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=74
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.898-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.899-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.899-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=75 id=76
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.899-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=76 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.900-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=76
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.900-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=76
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.966-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.966-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.966-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=75 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.967-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[11058] string=" [\""
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.967-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=75
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.967-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=76
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.967-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=76
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.967-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=76 id=77
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.967-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=77 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.968-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=77
Nov 13 14:37:45 OMEN ollama[38785]: time=2025-11-13T14:37:45.968-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=77
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=76
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=76
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=76 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[8574] string=Can
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=76
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=77
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=77
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=77 id=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.034-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=78 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.037-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.037-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.102-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=77
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.102-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=77
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.102-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=77 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.103-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[611] string=" you"
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.103-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=77
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.103-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.103-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.103-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=78 id=79
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.103-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=79 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.104-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=79
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.104-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=79
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.170-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.170-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.170-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=78 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.170-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1386] string=" make"
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.170-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=78
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.171-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=79
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.171-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=79
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.171-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=79 id=80
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.171-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=80 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.172-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=80
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.172-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=80
Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.238-05:00 level=TRACE source=runner.go:733 msg="computeBatch:
logits ready" batchID=79 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.238-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=79 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.238-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=79 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.239-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[506] string=" the" Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.239-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=79 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.239-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=80 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.239-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=80 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.239-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=80 id=81 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.239-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=81 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.241-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=81 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.241-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=81 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.305-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=80 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.305-05:00 level=TRACE source=runner.go:738 
msg="computeBatch: decoding" batchID=80 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.305-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=80 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.307-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[16273] string=" autumn" Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.307-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=80 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.307-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=81 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.307-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=81 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.307-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=81 id=82 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.307-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=82 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.309-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=82 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.309-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=82 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.375-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=81 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.375-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=81 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.375-05:00 level=TRACE 
source=runner.go:759 msg="computeBatch: vocab details" batchID=81 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.377-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3925] string=" story" Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.377-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=81 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.377-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=82 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.377-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=82 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.377-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=82 id=83 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.377-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=83 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.378-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=83 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.379-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=83 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.444-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=82 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.444-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=82 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.444-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=82 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 
vocabSize=262400 iBatches=[0] Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.445-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[496] string=" a" Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.445-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=82 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.445-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=83 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.446-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=83 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.446-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=83 id=84 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.446-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=84 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.447-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=84 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.447-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=84 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.514-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=83 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.514-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=83 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.514-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=83 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.515-05:00 level=TRACE source=sentencepiece.go:247 
msg=decoded ids=[2268] string=" little" Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.515-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=83 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.515-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=84 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.515-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=84 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.515-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=84 id=85 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.515-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=85 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.516-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=85 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.516-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=85 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.582-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=84 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.582-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=84 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.582-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=84 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.583-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[4890] string=" longer" Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.583-05:00 level=TRACE 
source=runner.go:657 msg="computeBatch: outputs are ready" batchID=84 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.584-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=85 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.584-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=85 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.584-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=85 id=86 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.584-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=86 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.585-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=86 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.585-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=86 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.651-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=85 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.651-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=85 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.651-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=85 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.653-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[32109] string="?\"," Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.653-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=85 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.653-05:00 
level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=86 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.653-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=86 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.653-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=86 id=87 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.653-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=87 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.655-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=87 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.656-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=87 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.721-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=86 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.721-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=86 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.721-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=86 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.722-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[623] string=" \"" Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.722-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=86 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.722-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=87 Nov 13 14:37:46 OMEN ollama[38785]: 
time=2025-11-13T14:37:46.722-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=87 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.722-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=87 id=88 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.722-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=88 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.723-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=88 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.723-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=88 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.791-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=87 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.791-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=87 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.791-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=87 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.792-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3689] string=What Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.792-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=87 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.792-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=88 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.792-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=88 Nov 13 14:37:46 
OMEN ollama[38785]: time=2025-11-13T14:37:46.792-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=88 id=89 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.792-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=89 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.794-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=89 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.794-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=89 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.860-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=88 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.860-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=88 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.860-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=88 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.861-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1032] string=" other" Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.861-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=88 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.861-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=89 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.861-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=89 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.861-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next 
batch" pendingBatch.id=89 id=90 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.861-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=90 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.862-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=90 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.862-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=90 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.930-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=89 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.930-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=89 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.930-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=89 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.931-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[8511] string=" stories" Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.931-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=89 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.931-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=90 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.931-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=90 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.931-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=90 id=91 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.931-05:00 level=TRACE source=runner.go:598 
msg="forwardBatch iBatch" batchID=91 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.932-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=91 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.932-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=91 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.998-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=90 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.998-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=90 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.998-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=90 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.999-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[776] string=" do" Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.999-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=90 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.999-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=91 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.999-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=91 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.999-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=91 id=92 Nov 13 14:37:46 OMEN ollama[38785]: time=2025-11-13T14:37:46.999-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=92 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:47 OMEN ollama[38785]: 
time=2025-11-13T14:37:47.000-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=92 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.000-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=92 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.065-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=91 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=91 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=91 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[611] string=" you" Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=91 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=92 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=92 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=92 id=93 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.066-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=93 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.067-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=93 Nov 13 
14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.067-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=93 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.133-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=92 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.133-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=92 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.133-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=92 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.134-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[735] string=" have" Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.134-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=92 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.134-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=93 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.134-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=93 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.134-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=93 id=94 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.134-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=94 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.135-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=94 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.135-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be 
ready" batchID=94 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.201-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=93 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.201-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=93 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.201-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=93 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.202-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[32109] string="?\"," Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.202-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=93 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.202-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=94 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.202-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=94 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.202-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=94 id=95 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.202-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=95 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.203-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=95 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.203-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=95 Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.269-05:00 level=TRACE source=runner.go:733 msg="computeBatch: 
logits ready" batchID=94
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.269-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=94
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.269-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=94 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.270-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[623] string=" \""
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.270-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=94
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.270-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=95
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.270-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=95
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.270-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=95 id=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.270-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=96 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.271-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.271-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=95
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=95
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=95 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[30092] string=Could
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=95
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=96 id=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.338-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=97 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.339-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.339-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.406-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.406-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.406-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=96 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.407-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[611] string=" you"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.407-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=96
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.407-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.407-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.407-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=97 id=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.407-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=98 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.408-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.408-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.475-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.475-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.475-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=97 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.477-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[4903] string=" write"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.477-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=97
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.477-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.477-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.477-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=98 id=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.477-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=99 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.478-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.478-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.544-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.544-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.544-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=98 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.545-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[496] string=" a"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.545-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=98
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.545-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.545-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.545-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=99 id=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.545-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=100 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.546-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.546-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.611-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.611-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.611-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=99 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.612-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3925] string=" story"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.612-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=99
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.612-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.612-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.612-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=100 id=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.612-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=101 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.614-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.614-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.679-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.679-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.679-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=100 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.680-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1003] string=" about"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.680-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=100
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.680-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.680-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.680-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=101 id=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.680-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=102 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.681-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.681-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=101 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[496] string=" a"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=101
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=102 id=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.748-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=103 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.749-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.749-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.815-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.815-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.815-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=102 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.816-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1607] string=" different"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.816-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=102
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.816-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.816-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.816-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=103 id=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.816-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=104 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.818-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.818-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=103 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3409] string=" season"
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=103
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=104 id=105
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.883-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=105 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.884-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=105
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.884-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=105
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=104 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[32109] string="?\","
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=104
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=105
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=105
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=105 id=106
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.950-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=106 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.951-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=106
Nov 13 14:37:47 OMEN ollama[38785]: time=2025-11-13T14:37:47.951-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=106
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.017-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=105
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.017-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=105
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.017-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=105 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.018-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[623] string=" \""
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.018-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=105
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.018-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=106
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.018-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=106
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.018-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=106 id=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.018-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=107 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.019-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.019-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.085-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=106
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.085-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=106
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.085-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=106 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.086-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[8574] string=Can
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.086-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=106
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.086-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.086-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.086-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=107 id=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.086-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=108 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.087-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.087-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.154-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.154-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.154-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=107 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.154-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[611] string=" you"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.154-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=107
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.154-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.154-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.155-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=108 id=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.155-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=109 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.155-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.156-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.222-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.222-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.222-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=108 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.223-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[2352] string=" change"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.223-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=108
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.223-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.223-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.223-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=109 id=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.223-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=110 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.224-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.224-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=109 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[506] string=" the"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=109
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=110 id=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.291-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=111 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.293-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.293-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.358-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.358-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.358-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=110 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.359-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3925] string=" story"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.359-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=110
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.359-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.359-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.359-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=111 id=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.359-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=112 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.360-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.360-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.426-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.426-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.426-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=111 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.427-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[531] string=" to"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.427-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=111
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.427-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.427-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.427-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=112 id=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.427-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=113 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.429-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.429-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.495-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.495-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.495-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=112 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.496-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[577] string=" be"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.496-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=112
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.496-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.496-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.496-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=113 id=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.496-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=114 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.497-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.497-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.563-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.563-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.563-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=113 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.564-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1003] string=" about"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.564-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=113
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.564-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.564-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.564-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=114 id=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.564-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=115 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.565-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.565-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.631-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.631-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.631-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=114 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.632-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[8075] string=" animals"
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.632-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=114
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.632-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.632-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.632-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=115 id=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.632-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=116 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.633-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.633-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.699-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.699-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.699-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=115 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.699-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[32109] string="?\","
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.699-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=115
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.699-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.700-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.700-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=116 id=117
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.700-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=117 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.701-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=117
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.701-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=117
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.767-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.767-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=116
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.767-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=116 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0]
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.768-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[623] string=" \""
Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.768-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are 
ready" batchID=116 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.768-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=117 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.768-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=117 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.768-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=117 id=118 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.768-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=118 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.769-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=118 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.769-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=118 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.838-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=117 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.838-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=117 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.838-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=117 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.839-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3689] string=What Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.839-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=117 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.839-05:00 level=TRACE source=runner.go:652 
msg="computeBatch: inputs are ready" batchID=118 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.839-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=118 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.839-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=118 id=119 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.839-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=119 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.840-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=119 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.840-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=119 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.908-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=118 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.908-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=118 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.908-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=118 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.909-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[659] string=" are" Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.909-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=118 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.909-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=119 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.909-05:00 level=TRACE 
source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=119 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.909-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=119 id=120 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.909-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=120 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.911-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=120 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.911-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=120 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.975-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=119 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.975-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=119 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.975-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=119 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.976-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1070] string=" some" Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.976-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=119 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.976-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=120 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.976-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=120 Nov 13 14:37:48 OMEN ollama[38785]: 
time=2025-11-13T14:37:48.976-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=120 id=121 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.976-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=121 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.977-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=121 Nov 13 14:37:48 OMEN ollama[38785]: time=2025-11-13T14:37:48.977-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=121 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.043-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=120 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.043-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=120 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.043-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=120 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.044-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[529] string=" of" Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.044-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=120 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.044-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=121 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.044-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=121 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.044-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" 
pendingBatch.id=121 id=122 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.044-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=122 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.045-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=122 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.045-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=122 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.111-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=121 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.111-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=121 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.111-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=121 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.112-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[822] string=" your" Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.112-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=121 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.112-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=122 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.112-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=122 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.112-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=122 id=123 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.112-05:00 level=TRACE source=runner.go:598 
msg="forwardBatch iBatch" batchID=123 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.116-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=123 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.116-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=123 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=122 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=122 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=122 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[8126] string=" favorite" Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=122 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=123 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=123 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=123 id=124 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.181-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=124 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:49 OMEN ollama[38785]: 
time=2025-11-13T14:37:49.182-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=124 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.182-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=124 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.248-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=123 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.248-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=123 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.248-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=123 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.249-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[3925] string=" story" Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.249-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=123 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.249-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=124 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.249-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=124 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.249-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=124 id=125 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.249-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=125 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.250-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" 
pendingBatch.id=125 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.250-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=125 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.316-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=124 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.316-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=124 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.316-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=124 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.317-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[62571] string=" prompts" Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.317-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=124 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.317-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=125 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.317-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=125 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.317-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=125 id=126 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.317-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=126 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.318-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=126 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.318-05:00 level=TRACE source=runner.go:650 
msg="computeBatch: waiting for inputs to be ready" batchID=126 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=125 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=125 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=125 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[236881] string=? Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=125 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=126 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=126 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=126 id=127 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.384-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=127 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.386-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=127 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.387-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=127 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 
level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=126 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=126 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=126 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[1935] string="\"]" Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=126 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=127 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=127 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=127 id=128 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.453-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=128 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.454-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=128 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.454-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=128 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.522-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=127 Nov 13 14:37:49 OMEN ollama[38785]: 
time=2025-11-13T14:37:49.522-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=127 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.522-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=127 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.523-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[107] string="\n" Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.523-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=127 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.523-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=128 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.523-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=128 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.523-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=128 id=129 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.523-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=129 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.524-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=129 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.524-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=129 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.590-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=128 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.590-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=128 Nov 13 14:37:49 OMEN 
ollama[38785]: time=2025-11-13T14:37:49.590-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=128 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.591-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[236783] string=} Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.591-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=128 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.591-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=129 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.591-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=129 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.591-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=129 id=130 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.591-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=130 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.592-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=130 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.592-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=130 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=129 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=129 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=129 
seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[107] string="\n" Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=129 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=130 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=130 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=130 id=131 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.659-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=131 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.660-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=131 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.660-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=131 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=130 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=130 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=130 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:49 OMEN ollama[38785]: 
time=2025-11-13T14:37:49.726-05:00 level=TRACE source=sentencepiece.go:247 msg=decoded ids=[2717] string=``` Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=130 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=131 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=131 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=131 id=132 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.726-05:00 level=TRACE source=runner.go:598 msg="forwardBatch iBatch" batchID=132 seqIdx=0 seq.iBatch=0 i+1=1 len(seq.inputs)=1 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.727-05:00 level=TRACE source=runner.go:474 msg="forwardBatch waiting for compute to start" pendingBatch.id=132 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.727-05:00 level=TRACE source=runner.go:650 msg="computeBatch: waiting for inputs to be ready" batchID=132 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.793-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=131 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.793-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=131 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.793-05:00 level=TRACE source=runner.go:759 msg="computeBatch: vocab details" batchID=131 seqIdx=0 len(logits)=262400 len(activeBatch.batch.Outputs)=1 vocabSize=262400 iBatches=[0] Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=TRACE source=runner.go:773 msg="computeBatch: EOS" batchID=131 seqIdx=0 Nov 13 14:37:49 OMEN 
ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=131 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=TRACE source=runner.go:652 msg="computeBatch: inputs are ready" batchID=132 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=TRACE source=runner.go:725 msg="computeBatch: signaling computeStartedCh" batchID=132 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=TRACE source=runner.go:476 msg="forwardBatch compute started, setting up next batch" pendingBatch.id=132 id=133 Nov 13 14:37:49 OMEN ollama[38785]: [GIN] 2025/11/13 - 14:37:49 | 200 | 13.529439634s | 172.17.0.2 | POST "/api/chat" Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=DEBUG source=sched.go:385 msg="context for request finished" runner.name=registry.ollama.ai/library/gemma3n:latest runner.size="7.6 GiB" runner.vram="0 B" runner.parallel=1 runner.pid=38937 runner.model=/usr/share/ollama/.ollama/models/blobs/sha256-38e8dcc30df4eb0e29eaf5c74ba6ce3f2cd66badad50768fc14362acfb8b8cb6 runner.num_ctx=4096 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=DEBUG source=sched.go:290 msg="runner with non-zero duration has gone idle, adding timer" runner.name=registry.ollama.ai/library/gemma3n:latest runner.size="7.6 GiB" runner.vram="0 B" runner.parallel=1 runner.pid=38937 runner.model=/usr/share/ollama/.ollama/models/blobs/sha256-38e8dcc30df4eb0e29eaf5c74ba6ce3f2cd66badad50768fc14362acfb8b8cb6 runner.num_ctx=4096 duration=5m0s Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.794-05:00 level=DEBUG source=sched.go:308 msg="after processing request finished event" runner.name=registry.ollama.ai/library/gemma3n:latest runner.size="7.6 GiB" runner.vram="0 B" runner.parallel=1 runner.pid=38937 
runner.model=/usr/share/ollama/.ollama/models/blobs/sha256-38e8dcc30df4eb0e29eaf5c74ba6ce3f2cd66badad50768fc14362acfb8b8cb6 runner.num_ctx=4096 refCount=0 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.861-05:00 level=TRACE source=runner.go:733 msg="computeBatch: logits ready" batchID=132 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.861-05:00 level=TRACE source=runner.go:738 msg="computeBatch: decoding" batchID=132 Nov 13 14:37:49 OMEN ollama[38785]: time=2025-11-13T14:37:49.861-05:00 level=TRACE source=runner.go:657 msg="computeBatch: outputs are ready" batchID=132 ```
Author
Owner

@rick-github commented on GitHub (Nov 13, 2025):

The start of the log that contains the device detection is the bit that's required, from the line that has msg="server config" to the line that has msg="inference compute", inclusive.

<!-- gh-comment-id:3529450632 --> @rick-github commented on GitHub (Nov 13, 2025): The start of the log that contains the device detection is the bit that's required, from the line that has `msg="server config"` to the line that has `msg="inference compute"`, inclusive.
Author
Owner

@ganakee commented on GitHub (Nov 13, 2025):

OK. Sorry. Odd. I copied the whole log from the terminal.
I need to leave but will try perhaps tonight.

<!-- gh-comment-id:3529475354 --> @ganakee commented on GitHub (Nov 13, 2025): OK. Sorry. Odd. I copied the whole log from the terminal. I need to leave but will try perhaps tonight.
Author
Owner

@ganakee commented on GitHub (Nov 13, 2025):

OK. I tried again before leaving. Re-installed 0.12.11-rc1. OLLAMA_DEBUG=2.

I searched the terminal output, and there is no msg="server or similar. Hmmmm. I even did a full reboot. I also searched for msg="inference compute" and saw nothing.

<!-- gh-comment-id:3529540715 --> @ganakee commented on GitHub (Nov 13, 2025): OK. I tried again before leaving. Re-installed 0.12.11-rc1. OLLAMA_DEBUG=2. I searched the terminal output, and there is no msg="server or similar. Hmmmm. I even did a full reboot. I also searched for msg="inference compute" and saw nothing.
Author
Owner

@rick-github commented on GitHub (Nov 13, 2025):

journalctl -u ollama --since "$(systemctl show ollama --property=ActiveEnterTimestamp --value)" | sed -ne '/server config/,/inference compute/p'
<!-- gh-comment-id:3529562868 --> @rick-github commented on GitHub (Nov 13, 2025): ``` journalctl -u ollama --since "$(systemctl show ollama --property=ActiveEnterTimestamp --value)" | sed -ne '/server config/,/inference compute/p' ```
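For anyone reusing the command above: the `sed -ne '/A/,/B/p'` idiom prints every line from the first match of pattern A through the next match of pattern B, inclusive. A minimal sketch on fabricated sample lines (the sample text below is made up for illustration, not taken from this issue's logs):

```shell
# Fabricated journal-style lines standing in for real ollama output.
printf '%s\n' \
  'boot noise before the interesting part' \
  'msg="server config" env="map[...]"' \
  'msg="total blobs: 29"' \
  'msg="inference compute" id=cpu library=cpu' \
  'later lines after discovery' |
# Print from the first "server config" line through the next
# "inference compute" line, inclusive; drop everything else.
sed -ne '/server config/,/inference compute/p'
```

Only the three middle lines are printed, which is why the full command captures exactly the device-detection section of the service log.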
Author
Owner

@ganakee commented on GitHub (Nov 13, 2025):

Thanks @rick-github.

OK, full reboot. OLLAMA_DEBUG=2 (and active). Using 0.12.11-rc1 (ollama --version reports 0.12.11-rc1).

This is the entire GNOME Terminal output:

journalctl -u ollama --since "$(systemctl show ollama --property=ActiveEnterTimestamp --value)" | sed -ne '/server config/,/inference compute/p'
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.743-05:00 level=INFO source=routes.go:1544 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GGML_VK_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION:10.3.0 HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:DEBUG-4 OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/usr/share/ollama/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false OLLAMA_VULKAN:false ROCR_VISIBLE_DEVICES:2,3,1,0 http_proxy: https_proxy: no_proxy:]"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.749-05:00 level=INFO source=images.go:522 msg="total blobs: 29"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.749-05:00 level=INFO source=images.go:529 msg="total unused blobs removed: 0"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.750-05:00 level=INFO source=routes.go:1597 msg="Listening on [::]:11434 (version 0.12.11-rc1)"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.750-05:00 level=DEBUG source=sched.go:120 msg="starting llm scheduler"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.752-05:00 level=INFO source=runner.go:67 msg="discovering available GPUs..."
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.752-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/rocm]" extraEnvs=map[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.753-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 42781"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.754-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 ROCR_VISIBLE_DEVICES=2,3,1,0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.769-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.769-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:42781"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 15:23:57 OMEN ollama[2658]: ggml_cuda_init: failed to initialize ROCm: no ROCm-capable device is detected
Nov 13 15:23:57 OMEN ollama[2658]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.975-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/rocm
Nov 13 15:23:57 OMEN ollama[2658]: ggml_cuda_init: failed to initialize ROCm: no ROCm-capable device is detected
Nov 13 15:23:57 OMEN ollama[2658]: load_backend: loaded ROCm backend from /usr/lib/ollama/rocm/libggml-hip.so
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=216.498661ms
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=561ns
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" devices=[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=240.544547ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" extra_envs=map[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" extraEnvs=map[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.993-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 36907"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.993-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 ROCR_VISIBLE_DEVICES=2,3,1,0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v12 OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v12
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.007-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.007-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:36907"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 15:23:58 OMEN ollama[2658]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.020-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/cuda_v12
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=8.858069ms
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=490ns
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" devices=[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=30.624662ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" extra_envs=map[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" extraEnvs=map[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 42069"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 ROCR_VISIBLE_DEVICES=2,3,1,0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v13 OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v13
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.037-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.037-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:42069"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 15:23:58 OMEN ollama[2658]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.051-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/cuda_v13
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=9.027717ms
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=541ns
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" devices=[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=31.296201ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" extra_envs=map[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:116 msg="evluating which if any devices to filter out" initial_count=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=TRACE source=runner.go:156 msg="supported GPU library combinations before filtering" supported=map[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:40 msg="GPU bootstrap discovery took" duration=304.370071ms
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=INFO source=types.go:60 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" type="" total="30.1 GiB" available="28.8 GiB"



<!-- gh-comment-id:3529589654 --> @ganakee commented on GitHub (Nov 13, 2025):

Thanks @rick-github. OK, full reboot. OLLAMA_DEBUG=2 (and active). Using 0.12.11-rc1 (`ollama --version` = 0.12.11-rc1).

This is the entire GNOME Terminal output:

```
journalctl -u ollama --since "$(systemctl show ollama --property=ActiveEnterTimestamp --value)" | sed -ne '/server config/,/inference compute/p'
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.743-05:00 level=INFO source=routes.go:1544 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GGML_VK_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION:10.3.0 HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:DEBUG-4 OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/usr/share/ollama/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false OLLAMA_VULKAN:false ROCR_VISIBLE_DEVICES:2,3,1,0 http_proxy: https_proxy: no_proxy:]"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.749-05:00 level=INFO source=images.go:522 msg="total blobs: 29"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.749-05:00 level=INFO source=images.go:529 msg="total unused blobs removed: 0"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.750-05:00 level=INFO source=routes.go:1597 msg="Listening on [::]:11434 (version 0.12.11-rc1)"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.750-05:00 level=DEBUG source=sched.go:120 msg="starting llm scheduler"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.752-05:00 level=INFO source=runner.go:67 msg="discovering available GPUs..."
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.752-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/rocm]" extraEnvs=map[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.753-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 42781"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.754-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 ROCR_VISIBLE_DEVICES=2,3,1,0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.769-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.769-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:42781"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 15:23:57 OMEN ollama[2658]: ggml_cuda_init: failed to initialize ROCm: no ROCm-capable device is detected
Nov 13 15:23:57 OMEN ollama[2658]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.975-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/rocm
Nov 13 15:23:57 OMEN ollama[2658]: ggml_cuda_init: failed to initialize ROCm: no ROCm-capable device is detected
Nov 13 15:23:57 OMEN ollama[2658]: load_backend: loaded ROCm backend from /usr/lib/ollama/rocm/libggml-hip.so
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=216.498661ms
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=561ns
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" devices=[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=240.544547ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" extra_envs=map[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" extraEnvs=map[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.993-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 36907"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.993-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 ROCR_VISIBLE_DEVICES=2,3,1,0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v12 OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v12
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.007-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.007-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:36907"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 15:23:58 OMEN ollama[2658]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.020-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/cuda_v12
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=8.858069ms
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=490ns
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" devices=[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=30.624662ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" extra_envs=map[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" extraEnvs=map[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 42069"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 ROCR_VISIBLE_DEVICES=2,3,1,0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v13 OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v13
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.037-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.037-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:42069"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 15:23:58 OMEN ollama[2658]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.051-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/cuda_v13
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=9.027717ms
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=541ns
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" devices=[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=31.296201ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" extra_envs=map[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:116 msg="evluating which if any devices to filter out" initial_count=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=TRACE source=runner.go:156 msg="supported GPU library combinations before filtering" supported=map[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:40 msg="GPU bootstrap discovery took" duration=304.370071ms
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=INFO source=types.go:60 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" type="" total="30.1 GiB" available="28.8 GiB"
```
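The key lines in the log above are the repeated `ggml_cuda_init: failed to initialize ROCm: no ROCm-capable device is detected` messages: the HIP runtime itself sees no GPU before Ollama ever filters devices. A minimal diagnostic sketch (not from the thread) for confirming whether ROCm can see the GPUs outside Ollama; it assumes the default Ubuntu device paths, the default `ollama` service user, and that `rocminfo` from the ROCm tools is installed:

```shell
# Check the kernel ROCm/DRM device nodes the ollama service user must be able to open.
# (Globs may not match on machines without an AMD GPU; errors are suppressed.)
ls -l /dev/kfd /dev/dri/render* 2>/dev/null

# The ollama user normally needs the 'render' and 'video' groups for those nodes.
id ollama 2>/dev/null

# Ask the ROCm runtime directly, if the tool is present; gfx1030 should appear here.
if command -v rocminfo >/dev/null 2>&1; then
  rocminfo | grep -E 'Marketing Name|gfx'
fi

echo "ROCm diagnostics complete"
```

If `rocminfo` lists no agents, or `/dev/kfd` is missing or unreadable by the `ollama` user, the failure is below Ollama (driver or permissions) rather than in the 0.12.6+ discovery change.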

@ganakee commented on GitHub (Nov 13, 2025):

Just in case I missed anything:

 journalctl -u ollama --since "$(systemctl show ollama --property=ActiveEnterTimestamp --value)" | sed -ne '/server config/,/inference compute/p'
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.743-05:00 level=INFO source=routes.go:1544 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GGML_VK_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION:10.3.0 HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:DEBUG-4 OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/usr/share/ollama/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false OLLAMA_VULKAN:false ROCR_VISIBLE_DEVICES:2,3,1,0 http_proxy: https_proxy: no_proxy:]"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.749-05:00 level=INFO source=images.go:522 msg="total blobs: 29"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.749-05:00 level=INFO source=images.go:529 msg="total unused blobs removed: 0"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.750-05:00 level=INFO source=routes.go:1597 msg="Listening on [::]:11434 (version 0.12.11-rc1)"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.750-05:00 level=DEBUG source=sched.go:120 msg="starting llm scheduler"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.752-05:00 level=INFO source=runner.go:67 msg="discovering available GPUs..."
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.752-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/rocm]" extraEnvs=map[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.753-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 42781"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.754-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 ROCR_VISIBLE_DEVICES=2,3,1,0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.769-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.769-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:42781"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 15:23:57 OMEN ollama[2658]: ggml_cuda_init: failed to initialize ROCm: no ROCm-capable device is detected
Nov 13 15:23:57 OMEN ollama[2658]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.975-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/rocm
Nov 13 15:23:57 OMEN ollama[2658]: ggml_cuda_init: failed to initialize ROCm: no ROCm-capable device is detected
Nov 13 15:23:57 OMEN ollama[2658]: load_backend: loaded ROCm backend from /usr/lib/ollama/rocm/libggml-hip.so
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=216.498661ms
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=561ns
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" devices=[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=240.544547ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" extra_envs=map[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" extraEnvs=map[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.993-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 36907"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.993-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 ROCR_VISIBLE_DEVICES=2,3,1,0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v12 OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v12
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.007-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.007-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:36907"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 15:23:58 OMEN ollama[2658]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.020-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/cuda_v12
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=8.858069ms
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=490ns
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" devices=[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=30.624662ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" extra_envs=map[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" extraEnvs=map[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 42069"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 ROCR_VISIBLE_DEVICES=2,3,1,0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v13 OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v13
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.037-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.037-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:42069"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 15:23:58 OMEN ollama[2658]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.051-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/cuda_v13
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=9.027717ms
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=541ns
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" devices=[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=31.296201ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" extra_envs=map[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:116 msg="evluating which if any devices to filter out" initial_count=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=TRACE source=runner.go:156 msg="supported GPU library combinations before filtering" supported=map[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:40 msg="GPU bootstrap discovery took" duration=304.370071ms
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=INFO source=types.go:60 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" type="" total="30.1 GiB" available="28.8 GiB"


<!-- gh-comment-id:3529597904 --> @ganakee commented on GitHub (Nov 13, 2025):

Just in case I missed anything:

```
journalctl -u ollama --since "$(systemctl show ollama --property=ActiveEnterTimestamp --value)" | sed -ne '/server config/,/inference compute/p'
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.743-05:00 level=INFO source=routes.go:1544 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GGML_VK_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION:10.3.0 HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:DEBUG-4 OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/usr/share/ollama/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false OLLAMA_VULKAN:false ROCR_VISIBLE_DEVICES:2,3,1,0 http_proxy: https_proxy: no_proxy:]"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.749-05:00 level=INFO source=images.go:522 msg="total blobs: 29"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.749-05:00 level=INFO source=images.go:529 msg="total unused blobs removed: 0"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.750-05:00 level=INFO source=routes.go:1597 msg="Listening on [::]:11434 (version 0.12.11-rc1)"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.750-05:00 level=DEBUG source=sched.go:120 msg="starting llm scheduler"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.752-05:00 level=INFO source=runner.go:67 msg="discovering available GPUs..."
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.752-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/rocm]" extraEnvs=map[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.753-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 42781"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.754-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 ROCR_VISIBLE_DEVICES=2,3,1,0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.769-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.769-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:42781"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.776-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 15:23:57 OMEN ollama[2658]: ggml_cuda_init: failed to initialize ROCm: no ROCm-capable device is detected
Nov 13 15:23:57 OMEN ollama[2658]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.975-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/rocm
Nov 13 15:23:57 OMEN ollama[2658]: ggml_cuda_init: failed to initialize ROCm: no ROCm-capable device is detected
Nov 13 15:23:57 OMEN ollama[2658]: load_backend: loaded ROCm backend from /usr/lib/ollama/rocm/libggml-hip.so
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.991-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=216.498661ms
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=561ns
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" devices=[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=240.544547ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" extra_envs=map[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.992-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" extraEnvs=map[]
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.993-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 36907"
Nov 13 15:23:57 OMEN ollama[2658]: time=2025-11-13T15:23:57.993-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 ROCR_VISIBLE_DEVICES=2,3,1,0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v12 OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v12
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.007-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.007-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:36907"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.014-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 15:23:58 OMEN ollama[2658]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.020-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/cuda_v12
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.022-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=8.858069ms
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=490ns
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" devices=[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=30.624662ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" extra_envs=map[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" extraEnvs=map[]
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 42069"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.023-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 ROCR_VISIBLE_DEVICES=2,3,1,0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v13 OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v13
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.037-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.037-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:42069"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.045-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 15:23:58 OMEN ollama[2658]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.051-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/cuda_v13
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG
source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0 Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0 Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}" Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default="" Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0 Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0 Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0 Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0 Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0 Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0 
Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0 Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000 Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1 Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=9.027717ms Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=541ns Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" devices=[] Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=31.296201ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" extra_envs=map[] Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:116 msg="evluating which if any devices to filter out" initial_count=0 Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=TRACE source=runner.go:156 msg="supported GPU library combinations before filtering" supported=map[] Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=DEBUG source=runner.go:40 msg="GPU bootstrap discovery took" duration=304.370071ms Nov 13 15:23:58 OMEN ollama[2658]: time=2025-11-13T15:23:58.054-05:00 level=INFO source=types.go:60 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" 
type="" total="30.1 GiB" available="28.8 GiB" ```

@ganakee commented on GitHub (Nov 14, 2025):

Provided just for comparison: I reverted to 0.12.3 and ran the same prompt under 0.12.3.

Log for Comparison With 0.12.3 (This Works), OLLAMA_DEBUG=2

journalctl -u ollama --since "$(systemctl show ollama --property=ActiveEnterTimestamp --value)" | sed -ne '/server config/,/inference compute/p'
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.131-05:00 level=INFO source=routes.go:1475 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION:10.3.0 HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:DEBUG-4 OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/usr/share/ollama/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES:2,3,1,0 http_proxy: https_proxy: no_proxy:]"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.132-05:00 level=INFO source=images.go:518 msg="total blobs: 29"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.132-05:00 level=INFO source=images.go:525 msg="total unused blobs removed: 0"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.132-05:00 level=INFO source=routes.go:1528 msg="Listening on [::]:11434 (version 0.12.3)"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.132-05:00 level=DEBUG source=sched.go:121 msg="starting llm scheduler"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.133-05:00 level=INFO source=gpu.go:217 msg="looking for compatible GPUs"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.133-05:00 level=DEBUG source=gpu.go:98 msg="searching for GPU discovery libraries for NVIDIA"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.133-05:00 level=DEBUG source=gpu.go:520 msg="Searching for GPU library" name=libcuda.so*
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.133-05:00 level=DEBUG source=gpu.go:544 msg="gpu library search" globs="[/usr/lib/ollama/libcuda.so* /libcuda.so* /usr/local/cuda*/targets/*/lib/libcuda.so* /usr/lib/*-linux-gnu/nvidia/current/libcuda.so* /usr/lib/*-linux-gnu/libcuda.so* /usr/lib/wsl/lib/libcuda.so* /usr/lib/wsl/drivers/*/libcuda.so* /opt/cuda/lib*/libcuda.so* /usr/local/cuda/lib*/libcuda.so* /usr/lib*/libcuda.so* /usr/local/lib*/libcuda.so*]"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.143-05:00 level=DEBUG source=gpu.go:577 msg="discovered GPU libraries" paths=[]
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.143-05:00 level=DEBUG source=gpu.go:520 msg="Searching for GPU library" name=libcudart.so*
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.143-05:00 level=DEBUG source=gpu.go:544 msg="gpu library search" globs="[/usr/lib/ollama/libcudart.so* /libcudart.so* /usr/lib/ollama/cuda_v*/libcudart.so* /usr/local/cuda/lib64/libcudart.so* /usr/lib/x86_64-linux-gnu/nvidia/current/libcudart.so* /usr/lib/x86_64-linux-gnu/libcudart.so* /usr/lib/wsl/lib/libcudart.so* /usr/lib/wsl/drivers/*/libcudart.so* /opt/cuda/lib64/libcudart.so* /usr/local/cuda*/targets/aarch64-linux/lib/libcudart.so* /usr/lib/aarch64-linux-gnu/nvidia/current/libcudart.so* /usr/lib/aarch64-linux-gnu/libcudart.so* /usr/local/cuda/lib*/libcudart.so* /usr/lib*/libcudart.so* /usr/local/lib*/libcudart.so*]"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.146-05:00 level=DEBUG source=gpu.go:577 msg="discovered GPU libraries" paths="[/usr/lib/ollama/libcudart.so.12.8.90 /usr/lib/ollama/cuda_v12/libcudart.so.12.8.90 /usr/lib/ollama/cuda_v13/libcudart.so.13.0.88 /usr/lib/ollama/cuda_v13/libcudart.so.13.0.96]"
Nov 13 19:01:16 OMEN ollama[26614]: cudaSetDevice err: 35
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.147-05:00 level=DEBUG source=gpu.go:593 msg="Unable to load cudart library /usr/lib/ollama/libcudart.so.12.8.90: your nvidia driver is too old or missing.  If you have a CUDA GPU please upgrade to run ollama"
Nov 13 19:01:16 OMEN ollama[26614]: cudaSetDevice err: 35
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.147-05:00 level=DEBUG source=gpu.go:593 msg="Unable to load cudart library /usr/lib/ollama/cuda_v12/libcudart.so.12.8.90: your nvidia driver is too old or missing.  If you have a CUDA GPU please upgrade to run ollama"
Nov 13 19:01:16 OMEN ollama[26614]: cudaSetDevice err: 35
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.147-05:00 level=DEBUG source=gpu.go:593 msg="Unable to load cudart library /usr/lib/ollama/cuda_v13/libcudart.so.13.0.88: your nvidia driver is too old or missing.  If you have a CUDA GPU please upgrade to run ollama"
Nov 13 19:01:16 OMEN ollama[26614]: cudaSetDevice err: 35
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.148-05:00 level=DEBUG source=gpu.go:593 msg="Unable to load cudart library /usr/lib/ollama/cuda_v13/libcudart.so.13.0.96: your nvidia driver is too old or missing.  If you have a CUDA GPU please upgrade to run ollama"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.148-05:00 level=DEBUG source=amd_linux.go:102 msg="evaluating amdgpu node /sys/class/kfd/kfd/topology/nodes/0/properties"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.148-05:00 level=DEBUG source=amd_linux.go:122 msg="detected CPU /sys/class/kfd/kfd/topology/nodes/0/properties"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.148-05:00 level=DEBUG source=amd_linux.go:102 msg="evaluating amdgpu node /sys/class/kfd/kfd/topology/nodes/1/properties"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.148-05:00 level=DEBUG source=amd_linux.go:203 msg="mapping amdgpu to drm sysfs nodes" amdgpu=/sys/class/kfd/kfd/topology/nodes/1/properties vendor=4098 device=29679 unique_id=0
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.149-05:00 level=DEBUG source=amd_linux.go:237 msg=matched amdgpu=/sys/class/kfd/kfd/topology/nodes/1/properties drm=/sys/class/drm/card1/device
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.149-05:00 level=DEBUG source=amd_linux.go:343 msg="amdgpu memory" gpu=0 total="8.0 GiB"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.149-05:00 level=DEBUG source=amd_linux.go:344 msg="amdgpu memory" gpu=0 available="8.0 GiB"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.149-05:00 level=DEBUG source=amd_common.go:16 msg="evaluating potential rocm lib dir /usr/lib/ollama/rocm"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.149-05:00 level=DEBUG source=amd_common.go:44 msg="detected ROCM next to ollama executable /usr/lib/ollama/rocm"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.149-05:00 level=INFO source=amd_linux.go:393 msg="skipping rocm gfx compatibility check" HSA_OVERRIDE_GFX_VERSION=10.3.0
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.149-05:00 level=DEBUG source=amd_linux.go:102 msg="evaluating amdgpu node /sys/class/kfd/kfd/topology/nodes/2/properties"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.149-05:00 level=DEBUG source=amd_linux.go:203 msg="mapping amdgpu to drm sysfs nodes" amdgpu=/sys/class/kfd/kfd/topology/nodes/2/properties vendor=4098 device=5761 unique_id=0
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.149-05:00 level=DEBUG source=amd_linux.go:216 msg="failed to read sysfs node" file=/sys/class/drm/card1-HDMI-A-1/device/vendor error="open /sys/class/drm/card1-HDMI-A-1/device/vendor: no such file or directory"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.149-05:00 level=DEBUG source=amd_linux.go:216 msg="failed to read sysfs node" file=/sys/class/drm/card1-Writeback-1/device/vendor error="open /sys/class/drm/card1-Writeback-1/device/vendor: no such file or directory"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.149-05:00 level=DEBUG source=amd_linux.go:216 msg="failed to read sysfs node" file=/sys/class/drm/card1-eDP-1/device/vendor error="open /sys/class/drm/card1-eDP-1/device/vendor: no such file or directory"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.149-05:00 level=DEBUG source=amd_linux.go:237 msg=matched amdgpu=/sys/class/kfd/kfd/topology/nodes/2/properties drm=/sys/class/drm/card2/device
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.149-05:00 level=INFO source=amd_linux.go:321 msg="unsupported Radeon iGPU detected skipping" id=1 total="512.0 MiB"
Nov 13 19:01:16 OMEN ollama[26614]: time=2025-11-13T19:01:16.152-05:00 level=INFO source=types.go:131 msg="inference compute" id=0 library=rocm variant="" compute=gfx1032 driver=6.14 name=1002:73ef total="8.0 GiB" available="8.0 GiB"
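An aside on the `journalctl ... | sed -ne '/server config/,/inference compute/p'` command used above: the `sed` address range prints every line from the first match of `server config` through the next match of `inference compute`, inclusive. A minimal self-contained demonstration with made-up sample lines:

```shell
# sed address ranges: -n suppresses default output; /A/,/B/p prints
# every line from the first match of A through the next match of B.
printf '%s\n' \
  'boot noise' \
  'msg="server config" env=...' \
  'msg="total blobs: 29"' \
  'msg="inference compute" id=0' \
  'later request handling' |
  sed -ne '/server config/,/inference compute/p'
# prints the three middle lines, dropping the noise before and after
```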


<!-- gh-comment-id:3530250294 -->

@dhiltgen commented on GitHub (Nov 14, 2025):

What happens if you don't set `ROCR_VISIBLE_DEVICES=2,3,1,0`?
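If the variable comes from the systemd unit, one way to try this without editing the unit in place is a drop-in override. This is a sketch using standard systemd conventions (the drop-in path and `UnsetEnvironment=` are not from this thread):

```ini
# /etc/systemd/system/ollama.service.d/no-rocr.conf  (hypothetical drop-in)
[Service]
# UnsetEnvironment removes the variable entirely. Note that
# Environment="ROCR_VISIBLE_DEVICES=" would instead set it to an
# empty string, which ROCm treats as "no devices visible".
UnsetEnvironment=ROCR_VISIBLE_DEVICES
```

followed by `systemctl daemon-reload` and `systemctl restart ollama`.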

<!-- gh-comment-id:3530253968 -->

@ganakee commented on GitHub (Nov 14, 2025):

Thanks @dhiltgen, and thanks @rick-github as well for all the help.

While the underlying problem might not be resolved, removing `ROCR_VISIBLE_DEVICES` does help.

I downloaded 0.12.11. (Note: RC1 is now deprecated, so there is a slight chance this is a 0.12.11 vs. 0.12.11-rc1 fix.)

I installed 0.12.11 (not the RC, which is no longer available), ran `systemctl daemon-reload`, and restarted `ollama.service`.

I noticed that startup was slower: `nvtop` showed a pause before the model loaded, unlike 0.12.3. (Perhaps this is some polling?)
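The pause can be quantified from the debug log: the runner logs timing messages such as `bootstrap discovery took` and `GPU bootstrap discovery took` (both visible in the logs in this thread). A small self-contained sketch of pulling those durations out of a captured snippet; on a live system you would pipe `journalctl -u ollama` into the same `grep`:

```shell
# Extract discovery-timing durations from a log snippet.
# The sample lines are abbreviated from the journal output in this thread.
sample='msg="bootstrap discovery took" duration=31.296201ms
msg="GPU bootstrap discovery took" duration=304.370071ms'
printf '%s\n' "$sample" | grep -o 'duration=[0-9.]*ms'
# prints: duration=31.296201ms
#         duration=304.370071ms
```

Comparing these durations between 0.12.3 and 0.12.11 would show where the extra startup time goes.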

The systemd ollama.service file is now:

```ini
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"
Environment="OLLAMA_HOST=0.0.0.0"
#Environment="CUDA_VISIBLE_DEVICES=1"
# REMOVED 2025-11-13 Environment="ROCR_VISIBLE_DEVICES=2,3,1,0"
#Environment="ROCR_VISIBLE_DEVICES=1,0"
Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
#Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
Environment="OLLAMA_DEBUG=2"
#Environment="OLLAMA_DEBUG=1" #ORIGINAL

[Install]
WantedBy=multi-user.target

#[Service]
#Environment="OLLAMA_HOST=0.0.0.0:11434"
```



Log of 0.12.11 After Removing ROCR_VISIBLE_DEVICES and OLLAMA_DEBUG

journalctl -u ollama --since "$(systemctl show ollama --property=ActiveEnterTimestamp --value)" | sed -ne '/server config/,/inference compute/p'
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.724-05:00 level=INFO source=routes.go:1544 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GGML_VK_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION:10.3.0 HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:DEBUG-4 OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/usr/share/ollama/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false OLLAMA_VULKAN:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.727-05:00 level=INFO source=images.go:522 msg="total blobs: 29"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.728-05:00 level=INFO source=images.go:529 msg="total unused blobs removed: 0"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.728-05:00 level=INFO source=routes.go:1597 msg="Listening on [::]:11434 (version 0.12.11)"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.728-05:00 level=DEBUG source=sched.go:120 msg="starting llm scheduler"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.728-05:00 level=INFO source=runner.go:67 msg="discovering available GPUs..."
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.728-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/rocm]" extraEnvs=map[]
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.729-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 40189"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.729-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.738-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.739-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:40189"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 19:15:39 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ:    no
Nov 13 19:15:39 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
Nov 13 19:15:39 OMEN ollama[29102]: ggml_cuda_init: found 2 ROCm devices:
Nov 13 19:15:39 OMEN ollama[29102]:   Device 0: AMD Radeon RX 6650M, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 0
Nov 13 19:15:39 OMEN ollama[29102]:   Device 1: AMD Radeon Graphics, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 1
Nov 13 19:15:39 OMEN ollama[29102]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.349-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/rocm
Nov 13 19:15:39 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ:    no
Nov 13 19:15:39 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
Nov 13 19:15:39 OMEN ollama[29102]: ggml_cuda_init: found 2 ROCm devices:
Nov 13 19:15:39 OMEN ollama[29102]:   Device 0: AMD Radeon RX 6650M, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 0
Nov 13 19:15:39 OMEN ollama[29102]:   Device 1: AMD Radeon Graphics, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 1
Nov 13 19:15:39 OMEN ollama[29102]: load_backend: loaded ROCm backend from /usr/lib/ollama/rocm/libggml-hip.so
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 ROCm.0.NO_VMM=1 ROCm.0.PEER_MAX_BATCH_SIZE=128 ROCm.1.NO_VMM=1 ROCm.1.PEER_MAX_BATCH_SIZE=128 compiler=cgo(gcc)
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=3.631371866s
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=25.848µs
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.372-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" devices="[{DeviceID:{ID:0 Library:ROCm} Name:ROCm0 Description:AMD Radeon RX 6650M FilterID: Integrated:false PCIID:0000:03:00.0 TotalMemory:8573157376 FreeMemory:8547991552 ComputeMajor:16 ComputeMinor:48 DriverMajor:60342 DriverMinor:13 LibraryPath:[/usr/lib/ollama /usr/lib/ollama/rocm]} {DeviceID:{ID:1 Library:ROCm} Name:ROCm1 Description:AMD Radeon Graphics FilterID: Integrated:true PCIID:0000:09:00.0 TotalMemory:16136155136 FreeMemory:16092319744 ComputeMajor:16 ComputeMinor:48 DriverMajor:60342 DriverMinor:13 LibraryPath:[/usr/lib/ollama /usr/lib/ollama/rocm]}]"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.372-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=3.643493951s OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" extra_envs=map[]
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.372-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" extraEnvs=map[]
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.372-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 34993"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.372-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v12 OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v12
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.382-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.382-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:34993"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.383-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.383-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.383-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.384-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.384-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.384-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.384-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.384-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.384-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 19:15:39 OMEN ollama[29102]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/cuda_v12
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=4.964116ms
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=451ns
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" devices=[]
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.389-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=16.695638ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" extra_envs=map[]
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.389-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" extraEnvs=map[]
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.389-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 39349"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.389-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v13 OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v13
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.398-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.398-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:39349"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 19:15:39 OMEN ollama[29102]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.405-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/cuda_v13
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=5.937781ms
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=351ns
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" devices=[]
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=17.396633ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" extra_envs=map[]
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=runner.go:116 msg="evluating which if any devices to filter out" initial_count=2
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=runner.go:128 msg="verifying device is supported" library=/usr/lib/ollama/rocm description="AMD Radeon RX 6650M" compute=gfx1030 id=0 pci_id=0000:03:00.0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=runner.go:128 msg="verifying device is supported" library=/usr/lib/ollama/rocm description="AMD Radeon Graphics" compute=gfx1030 id=1 pci_id=0000:09:00.0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/rocm]" extraEnvs="map[GGML_CUDA_INIT:1 ROCR_VISIBLE_DEVICES:1]"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/rocm]" extraEnvs="map[GGML_CUDA_INIT:1 ROCR_VISIBLE_DEVICES:0]"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 42143"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm GGML_CUDA_INIT=1 ROCR_VISIBLE_DEVICES=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 46073"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm ROCR_VISIBLE_DEVICES=1 GGML_CUDA_INIT=1
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.416-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.416-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:42143"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:46073"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ:    no
Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: found 1 ROCm devices:
Nov 13 19:15:40 OMEN ollama[29102]:   Device 0: AMD Radeon RX 6650M, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 0
Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ:    no
Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: found 1 ROCm devices:
Nov 13 19:15:40 OMEN ollama[29102]:   Device 0: AMD Radeon Graphics, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 0
Nov 13 19:15:40 OMEN ollama[29102]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 19:15:40 OMEN ollama[29102]: time=2025-11-13T19:15:40.871-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/rocm
Nov 13 19:15:40 OMEN ollama[29102]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 19:15:40 OMEN ollama[29102]: time=2025-11-13T19:15:40.881-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/rocm
Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ:    no
Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: found 1 ROCm devices:
Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: initializing rocBLAS on device 0
Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ:    no
Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: found 1 ROCm devices:
Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: initializing rocBLAS on device 0
Nov 13 19:15:41 OMEN ollama[29102]: ggml_cuda_init: rocBLAS initialized on device 0
Nov 13 19:15:41 OMEN ollama[29102]:   Device 0: AMD Radeon Graphics, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 0
Nov 13 19:15:41 OMEN ollama[29102]: load_backend: loaded ROCm backend from /usr/lib/ollama/rocm/libggml-hip.so
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 ROCm.0.NO_VMM=1 ROCm.0.PEER_MAX_BATCH_SIZE=128 compiler=cgo(gcc)
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=2.083703264s
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=19.837µs
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" devices="[{DeviceID:{ID:0 Library:ROCm} Name:ROCm0 Description:AMD Radeon Graphics FilterID: Integrated:true PCIID:0000:09:00.0 TotalMemory:16136155136 FreeMemory:15820640256 ComputeMajor:16 ComputeMinor:48 DriverMajor:60342 DriverMinor:13 LibraryPath:[/usr/lib/ollama /usr/lib/ollama/rocm]}]"
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=2.095034998s OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" extra_envs="map[GGML_CUDA_INIT:1 ROCR_VISIBLE_DEVICES:1]"
Nov 13 19:15:41 OMEN ollama[29102]: ggml_cuda_init: rocBLAS initialized on device 0
Nov 13 19:15:41 OMEN ollama[29102]:   Device 0: AMD Radeon RX 6650M, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 0
Nov 13 19:15:41 OMEN ollama[29102]: load_backend: loaded ROCm backend from /usr/lib/ollama/rocm/libggml-hip.so
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.634-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 ROCm.0.NO_VMM=1 ROCm.0.PEER_MAX_BATCH_SIZE=128 compiler=cgo(gcc)
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.634-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=2.217840125s
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=14.858µs
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" devices="[{DeviceID:{ID:0 Library:ROCm} Name:ROCm0 Description:AMD Radeon RX 6650M FilterID: Integrated:false PCIID:0000:03:00.0 TotalMemory:8573157376 FreeMemory:8153726976 ComputeMajor:16 ComputeMinor:48 DriverMajor:60342 DriverMinor:13 LibraryPath:[/usr/lib/ollama /usr/lib/ollama/rocm]}]"
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=2.229148103s OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" extra_envs="map[GGML_CUDA_INIT:1 ROCR_VISIBLE_DEVICES:0]"
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=TRACE source=runner.go:156 msg="supported GPU library combinations before filtering" supported="map[ROCm:map[/usr/lib/ollama/rocm:map[0:0 1:1]]]"
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=runner.go:175 msg="adjusting filtering IDs" FilterID=0 new_ID=0
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=runner.go:175 msg="adjusting filtering IDs" FilterID=1 new_ID=1
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=runner.go:40 msg="GPU bootstrap discovery took" duration=5.907163911s
Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=INFO source=types.go:42 msg="inference compute" id=0 filter_id=0 library=ROCm compute=gfx1030 name=ROCm0 description="AMD Radeon RX 6650M" libdirs=ollama,rocm driver=60342.13 pci_id=0000:03:00.0 type=discrete total="8.0 GiB" available="8.0 GiB"



<!-- gh-comment-id:3530286925 --> @ganakee commented on GitHub (Nov 14, 2025):

Thanks @dhiltgen, and thanks @rick-github as well for all the help. While the problem might not be fully resolved, this does make it possible to remove `ROCR_VISIBLE_DEVICES`.

I downloaded 0.12.11. (Note: RC1 is now deprecated, so there is a slight chance that this is a 0.12.11 vs. 0.12.11-rc1 fix; I used 0.12.11 since the RC is no longer available.) I installed it, ran `daemon-reload`, and restarted `ollama.service`. I noted that startup was slower: `nvtop` showed a pause before the model loaded, unlike 0.12.3. Perhaps this is some polling?

The `ollama.service` systemd unit file is now:

```
[Unit]
Description=Ollama Service
After=network-online.target

[Service]
ExecStart=/usr/bin/ollama serve
User=ollama
Group=ollama
Restart=always
RestartSec=3
Environment="PATH=$PATH"
Environment="OLLAMA_HOST=0.0.0.0"
#Environment="CUDA_VISIBLE_DEVICES=1" # REMOVED 2025-11-13
Environment="ROCR_VISIBLE_DEVICES=2,3,1,0"
#Environment="ROCR_VISIBLE_DEVICES=1,0"
Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
#Environment="HSA_OVERRIDE_GFX_VERSION=10.3.0"
Environment="OLLAMA_DEBUG=2"
#Environment="OLLAMA_DEBUG=1" #ORIGINAL

[Install]
WantedBy=multi-user.target

#[Service]
#Environment="OLLAMA_HOST=0.0.0.0:11434"
```

## Log of 0.12.11 After Removing ROCR_VISIBLE_DEVICES and OLLAMA_DEBUG

```shell
journalctl -u ollama --since "$(systemctl show ollama --property=ActiveEnterTimestamp --value)" | sed -ne '/server config/,/inference compute/p'
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.724-05:00 level=INFO source=routes.go:1544 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GGML_VK_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION:10.3.0 HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:DEBUG-4 OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/usr/share/ollama/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false OLLAMA_VULKAN:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.727-05:00 level=INFO source=images.go:522 msg="total blobs: 29"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.728-05:00 level=INFO source=images.go:529 msg="total unused blobs removed: 0"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.728-05:00 level=INFO source=routes.go:1597 msg="Listening on [::]:11434 (version 0.12.11)"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.728-05:00 level=DEBUG source=sched.go:120 msg="starting llm scheduler"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.728-05:00 level=INFO source=runner.go:67 msg="discovering available GPUs..."
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.728-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/rocm]" extraEnvs=map[]
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.729-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 40189"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.729-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.738-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.739-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:40189"
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 19:15:35 OMEN ollama[29102]: time=2025-11-13T19:15:35.740-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 19:15:39 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
Nov 13 19:15:39 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
Nov 13 19:15:39 OMEN ollama[29102]: ggml_cuda_init: found 2 ROCm devices:
Nov 13 19:15:39 OMEN ollama[29102]:   Device 0: AMD Radeon RX 6650M, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 0
Nov 13 19:15:39 OMEN ollama[29102]:   Device 1: AMD Radeon Graphics, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 1
Nov 13 19:15:39 OMEN ollama[29102]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.349-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/rocm
Nov 13 19:15:39 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
Nov 13 19:15:39 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
Nov 13 19:15:39 OMEN ollama[29102]: ggml_cuda_init: found 2 ROCm devices:
Nov 13 19:15:39 OMEN ollama[29102]:   Device 0: AMD Radeon RX 6650M, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 0
Nov 13 19:15:39 OMEN ollama[29102]:   Device 1: AMD Radeon Graphics, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 1
Nov 13 19:15:39 OMEN ollama[29102]: load_backend: loaded ROCm backend from /usr/lib/ollama/rocm/libggml-hip.so
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 ROCm.0.NO_VMM=1 ROCm.0.PEER_MAX_BATCH_SIZE=128 ROCm.1.NO_VMM=1 ROCm.1.PEER_MAX_BATCH_SIZE=128 compiler=cgo(gcc)
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=3.631371866s
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.371-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=25.848µs
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.372-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" devices="[{DeviceID:{ID:0 Library:ROCm} Name:ROCm0 Description:AMD Radeon RX 6650M FilterID: Integrated:false PCIID:0000:03:00.0 TotalMemory:8573157376 FreeMemory:8547991552 ComputeMajor:16 ComputeMinor:48 DriverMajor:60342 DriverMinor:13 LibraryPath:[/usr/lib/ollama /usr/lib/ollama/rocm]} {DeviceID:{ID:1 Library:ROCm} Name:ROCm1 Description:AMD Radeon Graphics FilterID: Integrated:true PCIID:0000:09:00.0 TotalMemory:16136155136 FreeMemory:16092319744 ComputeMajor:16 ComputeMinor:48 DriverMajor:60342 DriverMinor:13 LibraryPath:[/usr/lib/ollama /usr/lib/ollama/rocm]}]"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.372-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=3.643493951s OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" extra_envs=map[]
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.372-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" extraEnvs=map[]
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.372-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 34993"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.372-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v12 OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v12
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.382-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.382-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:34993"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.383-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.383-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.383-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.384-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.384-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.384-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.384-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.384-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.384-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 19:15:39 OMEN ollama[29102]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/cuda_v12
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=4.964116ms
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=451ns
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.388-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" devices=[]
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.389-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=16.695638ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v12]" extra_envs=map[]
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.389-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" extraEnvs=map[]
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.389-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 39349"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.389-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v13 OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/cuda_v13
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.398-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.398-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:39349"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.400-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
Nov 13 19:15:39 OMEN ollama[29102]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.405-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/cuda_v13
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=runner.go:1373 msg="dummy
model load took" duration=5.937781ms Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=351ns Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" devices=[] Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=17.396633ms OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/cuda_v13]" extra_envs=map[] Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=runner.go:116 msg="evluating which if any devices to filter out" initial_count=2 Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=runner.go:128 msg="verifying device is supported" library=/usr/lib/ollama/rocm description="AMD Radeon RX 6650M" compute=gfx1030 id=0 pci_id=0000:03:00.0 Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=runner.go:128 msg="verifying device is supported" library=/usr/lib/ollama/rocm description="AMD Radeon Graphics" compute=gfx1030 id=1 pci_id=0000:09:00.0 Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/rocm]" extraEnvs="map[GGML_CUDA_INIT:1 ROCR_VISIBLE_DEVICES:1]" Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=TRACE source=runner.go:421 msg="starting runner for device discovery" libDirs="[/usr/lib/ollama /usr/lib/ollama/rocm]" extraEnvs="map[GGML_CUDA_INIT:1 ROCR_VISIBLE_DEVICES:0]" Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 42143" Nov 13 
19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm GGML_CUDA_INIT=1 ROCR_VISIBLE_DEVICES=0 Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=INFO source=server.go:392 msg="starting runner" cmd="/usr/bin/ollama runner --ollama-engine --port 46073" Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.406-05:00 level=DEBUG source=server.go:393 msg=subprocess PATH=$PATH OLLAMA_HOST=0.0.0.0 HSA_OVERRIDE_GFX_VERSION=10.3.0 OLLAMA_DEBUG=2 LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm OLLAMA_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama/rocm ROCR_VISIBLE_DEVICES=1 GGML_CUDA_INIT=1 Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.416-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine" Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.416-05:00 level=INFO source=runner.go:1398 msg="starting ollama engine" Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:42143" Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=INFO source=runner.go:1433 msg="Server listening on 127.0.0.1:46073" Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture type=string Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32 Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=gguf.go:590 msg=general.architecture 
type=string Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32 Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32 Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0 Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default="" Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default="" Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3 Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32 Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0 Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default="" Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default="" Nov 13 
19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3 Nov 13 19:15:39 OMEN ollama[29102]: time=2025-11-13T19:15:39.417-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: found 1 ROCm devices: Nov 13 19:15:40 OMEN ollama[29102]: Device 0: AMD Radeon RX 6650M, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 0 Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: found 1 ROCm devices: Nov 13 19:15:40 OMEN ollama[29102]: Device 0: AMD Radeon Graphics, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 0 Nov 13 19:15:40 OMEN ollama[29102]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so Nov 13 19:15:40 OMEN ollama[29102]: time=2025-11-13T19:15:40.871-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/rocm Nov 13 19:15:40 OMEN ollama[29102]: load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-haswell.so Nov 13 19:15:40 OMEN ollama[29102]: time=2025-11-13T19:15:40.881-05:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama/rocm Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: found 1 ROCm devices: Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: initializing rocBLAS on device 0 Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no Nov 13 
19:15:40 OMEN ollama[29102]: ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: found 1 ROCm devices: Nov 13 19:15:40 OMEN ollama[29102]: ggml_cuda_init: initializing rocBLAS on device 0 Nov 13 19:15:41 OMEN ollama[29102]: ggml_cuda_init: rocBLAS initialized on device 0 Nov 13 19:15:41 OMEN ollama[29102]: Device 0: AMD Radeon Graphics, gfx1030 (0x1030), VMM: no, Wave Size: 32, ID: 0 Nov 13 19:15:41 OMEN ollama[29102]: load_backend: loaded ROCm backend from /usr/lib/ollama/rocm/libggml-hip.so Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 ROCm.0.NO_VMM=1 ROCm.0.PEER_MAX_BATCH_SIZE=128 compiler=cgo(gcc) Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG 
source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.500-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default="" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" 
key=llama.attention.key_length default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=2.083703264s Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=19.837µs Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" devices="[{DeviceID:{ID:0 Library:ROCm} Name:ROCm0 Description:AMD Radeon Graphics FilterID: Integrated:true PCIID:0000:09:00.0 TotalMemory:16136155136 FreeMemory:15820640256 ComputeMajor:16 ComputeMinor:48 DriverMajor:60342 DriverMinor:13 LibraryPath:[/usr/lib/ollama /usr/lib/ollama/rocm]}]" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.501-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=2.095034998s OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" extra_envs="map[GGML_CUDA_INIT:1 ROCR_VISIBLE_DEVICES:1]" Nov 13 19:15:41 OMEN ollama[29102]: ggml_cuda_init: rocBLAS initialized on device 0 Nov 13 19:15:41 OMEN ollama[29102]: Device 0: AMD Radeon RX 6650M, gfx1030 (0x1030), VMM: no, 
Wave Size: 32, ID: 0 Nov 13 19:15:41 OMEN ollama[29102]: load_backend: loaded ROCm backend from /usr/lib/ollama/rocm/libggml-hip.so Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.634-05:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 ROCm.0.NO_VMM=1 ROCm.0.PEER_MAX_BATCH_SIZE=128 compiler=cgo(gcc) Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.634-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" 
key=tokenizer.ggml.bos_token_id default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default="" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0 Nov 13 19:15:41 OMEN 
ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=runner.go:1373 msg="dummy model load took" duration=2.217840125s Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=runner.go:1378 msg="gathering device infos took" duration=14.858µs Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=TRACE source=runner.go:448 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" devices="[{DeviceID:{ID:0 Library:ROCm} Name:ROCm0 Description:AMD Radeon RX 6650M FilterID: Integrated:false PCIID:0000:03:00.0 TotalMemory:8573157376 FreeMemory:8153726976 ComputeMajor:16 ComputeMinor:48 DriverMajor:60342 DriverMinor:13 LibraryPath:[/usr/lib/ollama /usr/lib/ollama/rocm]}]" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=runner.go:418 msg="bootstrap discovery took" duration=2.229148103s OLLAMA_LIBRARY_PATH="[/usr/lib/ollama /usr/lib/ollama/rocm]" extra_envs="map[GGML_CUDA_INIT:1 ROCR_VISIBLE_DEVICES:0]" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=TRACE source=runner.go:156 msg="supported GPU library combinations before filtering" supported="map[ROCm:map[/usr/lib/ollama/rocm:map[0:0 1:1]]]" Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=runner.go:175 msg="adjusting filtering IDs" FilterID=0 new_ID=0 Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=runner.go:175 msg="adjusting filtering IDs" FilterID=1 new_ID=1 Nov 13 19:15:41 OMEN ollama[29102]: 
time=2025-11-13T19:15:41.635-05:00 level=DEBUG source=runner.go:40 msg="GPU bootstrap discovery took" duration=5.907163911s Nov 13 19:15:41 OMEN ollama[29102]: time=2025-11-13T19:15:41.635-05:00 level=INFO source=types.go:42 msg="inference compute" id=0 filter_id=0 library=ROCm compute=gfx1030 name=ROCm0 description="AMD Radeon RX 6650M" libdirs=ollama,rocm driver=60342.13 pci_id=0000:03:00.0 type=discrete total="8.0 GiB" available="8.0 GiB" ```
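The log above was captured at debug level from the systemd journal (note `OLLAMA_DEBUG=2` in the subprocess environment). For anyone trying to reproduce this, a minimal sketch of gathering equivalent logs, assuming the standard Linux install where Ollama runs as the `ollama` systemd service:

```shell
# Add a drop-in override for the ollama service (opens an editor);
# in the override, set:
#   [Service]
#   Environment="OLLAMA_DEBUG=2"
sudo systemctl edit ollama

# Apply the change and restart the service
sudo systemctl daemon-reload
sudo systemctl restart ollama

# Follow the service log while loading a model
journalctl -u ollama -f
```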

@ChenxiWu-Lab commented on GitHub (Nov 14, 2025):

Hello, I encountered a similar problem while using my R9700. Ollama initially ran inference on the GPU, but then suddenly reverted to CPU. I even thought there was something wrong with my Dify setup or my Ubuntu system, but reinstalling Ubuntu Server 24.04.3 LTS didn't improve the situation.

My graphics card isn't on the official support list, but since it initially worked for GPU inference, it seems to be supported.

Ummmm, hopefully we can both resolve this issue.

<!-- gh-comment-id:3532685182 -->

@dhiltgen commented on GitHub (Nov 14, 2025):

> While the problem might not be resolved, this does help to remove `ROCR_VISIBLE_DEVICES`.

@ganakee from your logs, it looks like Ollama correctly identified 1 compatible GPU, and did so in about 6 seconds. Your `ROCR_VISIBLE_DEVICES` setting was pointing at invalid GPUs, which is likely what led to the problem. Since your other GPU appears to be an iGPU, it most likely isn't gfx1030, so it seems the system is behaving as designed. I'm going to go ahead and close the issue now, but if inference isn't running on your GPU, please share logs showing a model load falling back to CPU and I'll reopen.

@ChenxiWu-Lab your GPU is gfx1201, which is unrelated to this issue. Support for that GPU is currently problematic given ROCm compatibility challenges, and there are other issues tracking that.
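Since the fix here amounts to removing `ROCR_VISIBLE_DEVICES` from the service environment, a hedged sketch of doing that and verifying the discrete GPU is used again (assuming the variable was set via a systemd drop-in, which is one common place for it):

```shell
# Find where ROCR_VISIBLE_DEVICES is being injected (location is an assumption)
grep -r ROCR_VISIBLE_DEVICES /etc/systemd/system/ollama.service.d/ 2>/dev/null

# Edit the override and delete the Environment="ROCR_VISIBLE_DEVICES=..." line
sudo systemctl edit ollama
sudo systemctl daemon-reload
sudo systemctl restart ollama

# Verify: "ollama ps" shows whether a loaded model runs on GPU or CPU,
# and the journal logs an "inference compute" line for each detected GPU
ollama ps
journalctl -u ollama | grep "inference compute"
```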

<!-- gh-comment-id:3533616129 -->

@ganakee commented on GitHub (Nov 14, 2025):

@dhiltgen Great. I will watch it.

My comment that this might not be resolved was only suggesting that something may have changed in the code between 0.12.3 and 0.12.6 regarding how the GPU cards are polled. The config worked for over a year until 0.12.6. I have seen a few bug reports possibly related to GPU polling and just mentioned that the issue (not a complaint or concern) might be an unintended change. No problem. Just trying to be helpful, and I appreciated the help.
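For bisecting between releases as described in the thread, a sketch of reverting to a known-good version, assuming the standard Linux install script (which accepts an `OLLAMA_VERSION` override):

```shell
# Reinstall a specific release, e.g. the last version reported working here
curl -fsSL https://ollama.com/install.sh | OLLAMA_VERSION=0.12.3 sh

# Restart the service and confirm the running version
sudo systemctl restart ollama
ollama -v
```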

<!-- gh-comment-id:3533938648 -->
Reference: github-starred/ollama#55164