[GH-ISSUE #11628] Ollama on Linux Detects GPU but Runs on CPU Only #7682

Closed
opened 2026-04-12 19:47:17 -05:00 by GiteaMirror · 12 comments
Owner

Originally created by @20246688 on GitHub (Aug 1, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11628

Hi,
I installed the Linux version of Ollama on a server, and I copied over the model files that I had previously downloaded using the Windows version. I’ve set the model path and IP address correctly, and the logs do show that the GPU is detected.

However, during inference, the service still seems to fully consume the CPU, while the GPU usage remains almost idle. As a result, the generation is very slow.

Could you please help me understand why this might be happening?
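When chasing this class of problem, the two server-log lines worth isolating first are the detection line (`msg="inference compute"`) and the layer-offload plan (`msg=offload`). A minimal sketch of pulling them out of a saved log; the path and the sample line contents below are illustrative stand-ins shaped like the real lines, not output from an actual run:

```shell
# Write a sample server log (illustrative lines shaped like the real ones),
# then filter the two lines that matter for GPU debugging.
cat > /tmp/server.log <<'EOF'
time=... level=INFO msg="inference compute" library=cuda name="NVIDIA GeForce RTX 5080"
time=... level=INFO msg=offload library=cuda layers.model=33 layers.offload=33
EOF
# "inference compute" shows the GPU was detected; "offload" shows how many
# layers the scheduler actually planned to put on it.
grep -E 'msg="inference compute"|msg=offload' /tmp/server.log
```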


Author
Owner

@t-deux commented on GitHub (Aug 1, 2025):

Same here, but only with some vision models, e.g. Granite3.2, Gemma3, Qwen2.5vl. It does work with mistral-small3.2 in the Unsloth version, though.
I tried forcing num_gpu to 100, but that didn't help this time. And it isn't a VRAM capacity problem: the GPU simply isn't used, even though the logs say everything is on GPU.
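For anyone trying the same workaround: in the Ollama request options the parameter is spelled `num_gpu`, and a large value asks the server to offload as many layers as it can. A hedged sketch (the model name, host, and dry-run style are mine, not from this thread) of the request it would produce:

```shell
# Dry run: print the curl command that would request full GPU offload via
# the num_gpu option. Model name and host are examples.
model=gemma3
cat <<EOF
curl http://127.0.0.1:11434/api/generate -d '{"model":"$model","prompt":"hello","options":{"num_gpu":999}}'
EOF
```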

Author
Owner

@rick-github commented on GitHub (Aug 2, 2025):

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) will help in debugging.

Author
Owner

@dcmoore commented on GitHub (Aug 3, 2025):

I'm also having this issue. I'm running `llama3:8b` with ollama version `0.10.1.r1.g4183bb0574a2`. When I run `ollama ps`, the processor column shows 100% GPU. When I start the ollama server, I see my GPU get recognized: `...msg="inference compute"...library=cuda variant=v12 compute=12.0 driver=12.9 name="NVIDIA GeForce RTX 5080" total="15.4 GiB" available="14.4 GiB"`. When I chat with the model, my CPU utilization shoots up and stays up during generation, while my GPU utilization remains untouched. I can see the GPU offload log as well (which seems to indicate there isn't a problem):

> ...msg=offload library=cuda layers.requested=-1 layers.model=33 layers.offload=33 layers.split="" memory.available="[14.4 GiB]" memory.gpu_overhead="0 B" memory.required.full="5.4 GiB" memory.required.partial="5.4 GiB" memory.required.kv="512.0 MiB" memory.required.allocations="[5.4 GiB]" memory.weights.total="4.1 GiB" memory.weights.repeating="3.7 GiB" memory.weights.nonrepeating="411.0 MiB" memory.graph.full="296.0 MiB" memory.graph.partial="677.5 MiB"

Occasionally I'll see a WARN log `...msg="gpu VRAM usage didn't recover within timeout"...runner.size="5.4 GiB" runner.vram="5.4 GiB" runner.parallel=1`. However, that doesn't happen all of the time.
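Decoding that offload line: `layers.offload=33` out of `layers.model=33` means every layer was planned for the GPU, which is exactly why the plan and the observed CPU load disagree. A small sketch of pulling those key=value fields out of the quoted line (abbreviated here to the fields of interest):

```shell
# Extract selected key=value fields from an offload log line like the one
# quoted above (abbreviated to the fields of interest).
line='msg=offload library=cuda layers.model=33 layers.offload=33 memory.required.full="5.4 GiB"'
for key in library layers.model layers.offload; do
  # split on spaces, then strip the "key=" prefix to get the value
  value=$(printf '%s\n' "$line" | tr ' ' '\n' | sed -n "s/^$key=//p")
  printf '%s=%s\n' "$key" "$value"
done
# prints:
#   library=cuda
#   layers.model=33
#   layers.offload=33
```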

Author
Owner

@rick-github commented on GitHub (Aug 3, 2025):

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) will help in debugging.

Author
Owner

@dcmoore commented on GitHub (Aug 3, 2025):

@rick-github the info I posted was copied from the server logs. Are you asking for the full log? Or was there anything in particular you were looking for?

Author
Owner

@rick-github commented on GitHub (Aug 3, 2025):

Full log.

Author
Owner

@dcmoore commented on GitHub (Aug 3, 2025):

Here's a full log with some machine-specific details (memory addresses, username, etc.) redacted.

OLLAMA_DEBUG=1 ollama serve

time=2025-08-02T22:45:44.863-07:00 level=INFO source=routes.go:1238 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:4096 OLLAMA_DEBUG:DEBUG OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/home/USER-REDACTED/.ollama/models OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:1 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES: http_proxy: https_proxy: no_proxy:]"
time=2025-08-02T22:45:44.863-07:00 level=INFO source=images.go:476 msg="total blobs: 16"
time=2025-08-02T22:45:44.863-07:00 level=INFO source=images.go:483 msg="total unused blobs removed: 0"
time=2025-08-02T22:45:44.863-07:00 level=INFO source=routes.go:1291 msg="Listening on 127.0.0.1:11434 (version 0.10.1.r1.g4183bb0574a2)"
time=2025-08-02T22:45:44.863-07:00 level=DEBUG source=sched.go:106 msg="starting llm scheduler"
time=2025-08-02T22:45:44.863-07:00 level=INFO source=gpu.go:217 msg="looking for compatible GPUs"
time=2025-08-02T22:45:44.864-07:00 level=DEBUG source=gpu.go:98 msg="searching for GPU discovery libraries for NVIDIA"
time=2025-08-02T22:45:44.864-07:00 level=DEBUG source=gpu.go:501 msg="Searching for GPU library" name=libcuda.so

time=2025-08-02T22:45:44.864-07:00 level=DEBUG source=gpu.go:525 msg="gpu library search" globs="[/usr/lib/ollama/libcuda.so* /home/USER-REDACTED/libcuda.so* /usr/local/cuda*/targets/*/lib/libcuda.so* /usr/lib/*-linux-gnu/nvidia/current/libcuda.so* /usr/lib/*-linux-gnu/libcuda.so* /usr/lib/wsl/lib/libcuda.so* /usr/lib/wsl/drivers/*/libcuda.so* /opt/cuda/lib*/libcuda.so* /usr/local/cuda/lib*/libcuda.so* /usr/lib*/libcuda.so* /usr/local/lib*/libcuda.so*]"
time=2025-08-02T22:45:44.878-07:00 level=DEBUG source=gpu.go:558 msg="discovered GPU libraries" paths="[/usr/lib/libcuda.so.575.64.05 /usr/lib32/libcuda.so.575.64.05 /usr/lib64/libcuda.so.575.64.05]"
initializing /usr/lib/libcuda.so.575.64.05
dlsym: cuInit - ADDRESS_REDACTED
dlsym: cuDriverGetVersion - ADDRESS_REDACTED
dlsym: cuDeviceGetCount - ADDRESS_REDACTED
dlsym: cuDeviceGet - ADDRESS_REDACTED
dlsym: cuDeviceGetAttribute - ADDRESS_REDACTED
dlsym: cuDeviceGetUuid - ADDRESS_REDACTED
dlsym: cuDeviceGetName - ADDRESS_REDACTED
dlsym: cuCtxCreate_v3 - ADDRESS_REDACTED
dlsym: cuMemGetInfo_v2 - ADDRESS_REDACTED
dlsym: cuCtxDestroy - ADDRESS_REDACTED
calling cuInit
calling cuDriverGetVersion
raw version 0x2f3a
CUDA driver version: 12.9
calling cuDeviceGetCount
device count 1
time=2025-08-02T22:45:44.996-07:00 level=DEBUG source=gpu.go:125 msg="detected GPUs" count=1 library=/usr/lib/libcuda.so.575.64.05
[GPU-UUID-REDACTED] CUDA totalMem 15817mb
[GPU-UUID-REDACTED] CUDA freeMem 14747mb
[GPU-UUID-REDACTED] Compute Capability 12.0
time=2025-08-02T22:45:45.117-07:00 level=WARN source=amd_linux.go:61 msg="ollama recommends running the https://www.amd.com/en/support/download/linux-drivers.html" error="amdgpu version file missing: /sys/module/amdgpu/version stat /sys/module/amdgpu/version: no such file or directory"
time=2025-08-02T22:45:45.117-07:00 level=DEBUG source=amd_linux.go:101 msg="evaluating amdgpu node /sys/class/kfd/kfd/topology/nodes/0/properties"
time=2025-08-02T22:45:45.117-07:00 level=DEBUG source=amd_linux.go:121 msg="detected CPU /sys/class/kfd/kfd/topology/nodes/0/properties"
time=2025-08-02T22:45:45.117-07:00 level=DEBUG source=amd_linux.go:101 msg="evaluating amdgpu node /sys/class/kfd/kfd/topology/nodes/1/properties"
time=2025-08-02T22:45:45.117-07:00 level=DEBUG source=amd_linux.go:206 msg="mapping amdgpu to drm sysfs nodes" amdgpu=/sys/class/kfd/kfd/topology/nodes/1/properties vendor=4098 device=5710 unique_id=0
time=2025-08-02T22:45:45.117-07:00 level=DEBUG source=amd_linux.go:219 msg="failed to read sysfs node" file=/sys/class/drm/card0-DP-1/device/vendor error="open /sys/class/drm/card0-DP-1/device/vendor: no such file or directory"
time=2025-08-02T22:45:45.117-07:00 level=DEBUG source=amd_linux.go:219 msg="failed to read sysfs node" file=/sys/class/drm/card0-DP-2/device/vendor error="open /sys/class/drm/card0-DP-2/device/vendor: no such file or directory"
time=2025-08-02T22:45:45.117-07:00 level=DEBUG source=amd_linux.go:219 msg="failed to read sysfs node" file=/sys/class/drm/card0-DP-3/device/vendor error="open /sys/class/drm/card0-DP-3/device/vendor: no such file or directory"
time=2025-08-02T22:45:45.117-07:00 level=DEBUG source=amd_linux.go:219 msg="failed to read sysfs node" file=/sys/class/drm/card0-HDMI-A-1/device/vendor error="open /sys/class/drm/card0-HDMI-A-1/device/vendor: no such file or directory"
time=2025-08-02T22:45:45.117-07:00 level=DEBUG source=amd_linux.go:240 msg=matched amdgpu=/sys/class/kfd/kfd/topology/nodes/1/properties drm=/sys/class/drm/card1/device
time=2025-08-02T22:45:45.117-07:00 level=INFO source=amd_linux.go:296 msg="unsupported Radeon iGPU detected skipping" id=0 total="512.0 MiB"
time=2025-08-02T22:45:45.117-07:00 level=INFO source=amd_linux.go:402 msg="no compatible amdgpu devices detected"
releasing cuda driver library
time=2025-08-02T22:45:45.117-07:00 level=INFO source=types.go:130 msg="inference compute" id=GPU-UUID-REDACTED library=cuda variant=v12 compute=12.0 driver=12.9 name="NVIDIA GeForce RTX 5080" total="15.4 GiB" available="14.4 GiB"
[GIN] 2025/08/02 - 22:45:51 | 200 | 26.679µs | 127.0.0.1 | HEAD "/"
time=2025-08-02T22:45:51.426-07:00 level=DEBUG source=ggml.go:206 msg="key with type not found" key=general.alignment default=32
[GIN] 2025/08/02 - 22:45:51 | 200 | 35.591855ms | 127.0.0.1 | POST "/api/show"
time=2025-08-02T22:45:51.450-07:00 level=DEBUG source=gpu.go:391 msg="updating system memory data" before.total="61.9 GiB" before.free="57.3 GiB" before.free_swap="61.9 GiB" now.total="61.9 GiB" now.free="57.2 GiB" now.free_swap="61.9 GiB"
initializing /usr/lib/libcuda.so.575.64.05
dlsym: cuInit - ADDRESS_REDACTED
dlsym: cuDriverGetVersion - ADDRESS_REDACTED
dlsym: cuDeviceGetCount - ADDRESS_REDACTED
dlsym: cuDeviceGet - ADDRESS_REDACTED
dlsym: cuDeviceGetAttribute - ADDRESS_REDACTED
dlsym: cuDeviceGetUuid - ADDRESS_REDACTED
dlsym: cuDeviceGetName - ADDRESS_REDACTED
dlsym: cuCtxCreate_v3 - ADDRESS_REDACTED
dlsym: cuMemGetInfo_v2 - ADDRESS_REDACTED
dlsym: cuCtxDestroy - ADDRESS_REDACTED
calling cuInit
calling cuDriverGetVersion
raw version 0x2f3a
CUDA driver version: 12.9
calling cuDeviceGetCount
device count 1
time=2025-08-02T22:45:51.569-07:00 level=DEBUG source=gpu.go:441 msg="updating cuda memory data" gpu=GPU-UUID-REDACTED name="NVIDIA GeForce RTX 5080" overhead="0 B" before.total="15.4 GiB" before.free="14.4 GiB" now.total="15.4 GiB" now.free="14.4 GiB" now.used="1.0 GiB"
releasing cuda driver library
time=2025-08-02T22:45:51.569-07:00 level=DEBUG source=sched.go:183 msg="updating default concurrency" OLLAMA_MAX_LOADED_MODELS=3 gpu_count=1
time=2025-08-02T22:45:51.579-07:00 level=DEBUG source=ggml.go:206 msg="key with type not found" key=general.alignment default=32
time=2025-08-02T22:45:51.604-07:00 level=DEBUG source=sched.go:226 msg="loading first model" model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED
time=2025-08-02T22:45:51.604-07:00 level=DEBUG source=memory.go:111 msg=evaluating library=cuda gpu_count=1 available="[14.4 GiB]"
time=2025-08-02T22:45:51.605-07:00 level=DEBUG source=ggml.go:206 msg="key with type not found" key=llama.vision.block_count default=0
time=2025-08-02T22:45:51.605-07:00 level=DEBUG source=ggml.go:206 msg="key with type not found" key=llama.attention.key_length default=128
time=2025-08-02T22:45:51.605-07:00 level=DEBUG source=ggml.go:206 msg="key with type not found" key=llama.attention.value_length default=128
time=2025-08-02T22:45:51.605-07:00 level=INFO source=sched.go:786 msg="new model will fit in available VRAM in single GPU, loading" model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED gpu=GPU-UUID-REDACTED parallel=1 available=15464136704 required="5.4 GiB"
time=2025-08-02T22:45:51.605-07:00 level=DEBUG source=gpu.go:391 msg="updating system memory data" before.total="61.9 GiB" before.free="57.2 GiB" before.free_swap="61.9 GiB" now.total="61.9 GiB" now.free="57.1 GiB" now.free_swap="61.9 GiB"
initializing /usr/lib/libcuda.so.575.64.05
dlsym: cuInit - ADDRESS_REDACTED
dlsym: cuDriverGetVersion - ADDRESS_REDACTED
dlsym: cuDeviceGetCount - ADDRESS_REDACTED
dlsym: cuDeviceGet - ADDRESS_REDACTED
dlsym: cuDeviceGetAttribute - ADDRESS_REDACTED
dlsym: cuDeviceGetUuid - ADDRESS_REDACTED
dlsym: cuDeviceGetName - ADDRESS_REDACTED
dlsym: cuCtxCreate_v3 - ADDRESS_REDACTED
dlsym: cuMemGetInfo_v2 - ADDRESS_REDACTED
dlsym: cuCtxDestroy - ADDRESS_REDACTED
calling cuInit
calling cuDriverGetVersion
raw version 0x2f3a
CUDA driver version: 12.9
calling cuDeviceGetCount
device count 1
time=2025-08-02T22:45:51.714-07:00 level=DEBUG source=gpu.go:441 msg="updating cuda memory data" gpu=GPU-UUID-REDACTED name="NVIDIA GeForce RTX 5080" overhead="0 B" before.total="15.4 GiB" before.free="14.4 GiB" now.total="15.4 GiB" now.free="14.4 GiB" now.used="1.0 GiB"
releasing cuda driver library
time=2025-08-02T22:45:51.714-07:00 level=INFO source=server.go:135 msg="system memory" total="61.9 GiB" free="57.1 GiB" free_swap="61.9 GiB"
time=2025-08-02T22:45:51.714-07:00 level=DEBUG source=memory.go:111 msg=evaluating library=cuda gpu_count=1 available="[14.4 GiB]"
time=2025-08-02T22:45:51.714-07:00 level=DEBUG source=ggml.go:206 msg="key with type not found" key=llama.vision.block_count default=0
time=2025-08-02T22:45:51.714-07:00 level=DEBUG source=ggml.go:206 msg="key with type not found" key=llama.attention.key_length default=128
time=2025-08-02T22:45:51.714-07:00 level=DEBUG source=ggml.go:206 msg="key with type not found" key=llama.attention.value_length default=128
time=2025-08-02T22:45:51.714-07:00 level=INFO source=server.go:175 msg=offload library=cuda layers.requested=-1 layers.model=33 layers.offload=33 layers.split="" memory.available="[14.4 GiB]" memory.gpu_overhead="0 B" memory.required.full="5.4 GiB" memory.required.partial="5.4 GiB" memory.required.kv="512.0 MiB" memory.required.allocations="[5.4 GiB]" memory.weights.total="4.1 GiB" memory.weights.repeating="3.7 GiB" memory.weights.nonrepeating="411.0 MiB" memory.graph.full="296.0 MiB" memory.graph.partial="677.5 MiB"
time=2025-08-02T22:45:51.714-07:00 level=DEBUG source=server.go:291 msg="compatible gpu libraries" compatible=[]
llama_model_loader: loaded meta data with 22 key-value pairs and 291 tensors from /home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED (version GGUF V3 (latest))
llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
llama_model_loader: - kv 0: general.architecture str = llama
llama_model_loader: - kv 1: general.name str = Meta-Llama-3-8B-Instruct
llama_model_loader: - kv 2: llama.block_count u32 = 32
llama_model_loader: - kv 3: llama.context_length u32 = 8192
llama_model_loader: - kv 4: llama.embedding_length u32 = 4096
llama_model_loader: - kv 5: llama.feed_forward_length u32 = 14336
llama_model_loader: - kv 6: llama.attention.head_count u32 = 32
llama_model_loader: - kv 7: llama.attention.head_count_kv u32 = 8
llama_model_loader: - kv 8: llama.rope.freq_base f32 = 500000.000000
llama_model_loader: - kv 9: llama.attention.layer_norm_rms_epsilon f32 = 0.000010
llama_model_loader: - kv 10: general.file_type u32 = 2
llama_model_loader: - kv 11: llama.vocab_size u32 = 128256
llama_model_loader: - kv 12: llama.rope.dimension_count u32 = 128
llama_model_loader: - kv 13: tokenizer.ggml.model str = gpt2
llama_model_loader: - kv 14: tokenizer.ggml.pre str = llama-bpe
llama_model_loader: - kv 15: tokenizer.ggml.tokens arr[str,128256] = ["!", """, "#", "$", "%", "&", "'", ...
llama_model_loader: - kv 16: tokenizer.ggml.token_type arr[i32,128256] = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, ...
llama_model_loader: - kv 17: tokenizer.ggml.merges arr[str,280147] = ["Ġ Ġ", "Ġ ĠĠĠ", "ĠĠ ĠĠ", "...
llama_model_loader: - kv 18: tokenizer.ggml.bos_token_id u32 = 128000
llama_model_loader: - kv 19: tokenizer.ggml.eos_token_id u32 = 128009
llama_model_loader: - kv 20: tokenizer.chat_template str = {% set loop_messages = messages %}{% ...
llama_model_loader: - kv 21: general.quantization_version u32 = 2
llama_model_loader: - type f32: 65 tensors
llama_model_loader: - type q4_0: 225 tensors
llama_model_loader: - type q6_K: 1 tensors
print_info: file format = GGUF V3 (latest)
print_info: file type = Q4_0
print_info: file size = 4.33 GiB (4.64 BPW)
init_tokenizer: initializing tokenizer for type 2
load: control token: 128255 '<|reserved_special_token_250|>' is not marked as EOG
load: control token: 128254 '<|reserved_special_token_249|>' is not marked as EOG
load: control token: 128253 '<|reserved_special_token_248|>' is not marked as EOG
load: control token: 128251 '<|reserved_special_token_246|>' is not marked as EOG
load: control token: 128246 '<|reserved_special_token_241|>' is not marked as EOG
load: control token: 128243 '<|reserved_special_token_238|>' is not marked as EOG
load: control token: 128240 '<|reserved_special_token_235|>' is not marked as EOG
load: control token: 128239 '<|reserved_special_token_234|>' is not marked as EOG
load: control token: 128238 '<|reserved_special_token_233|>' is not marked as EOG
load: control token: 128237 '<|reserved_special_token_232|>' is not marked as EOG
load: control token: 128232 '<|reserved_special_token_227|>' is not marked as EOG
load: control token: 128228 '<|reserved_special_token_223|>' is not marked as EOG
load: control token: 128227 '<|reserved_special_token_222|>' is not marked as EOG
load: control token: 128225 '<|reserved_special_token_220|>' is not marked as EOG
load: control token: 128222 '<|reserved_special_token_217|>' is not marked as EOG
load: control token: 128215 '<|reserved_special_token_210|>' is not marked as EOG
load: control token: 128211 '<|reserved_special_token_206|>' is not marked as EOG
load: control token: 128210 '<|reserved_special_token_205|>' is not marked as EOG
load: control token: 128204 '<|reserved_special_token_199|>' is not marked as EOG
load: control token: 128203 '<|reserved_special_token_198|>' is not marked as EOG
load: control token: 128201 '<|reserved_special_token_196|>' is not marked as EOG
load: control token: 128197 '<|reserved_special_token_192|>' is not marked as EOG
load: control token: 128196 '<|reserved_special_token_191|>' is not marked as EOG
load: control token: 128195 '<|reserved_special_token_190|>' is not marked as EOG
load: control token: 128193 '<|reserved_special_token_188|>' is not marked as EOG
load: control token: 128191 '<|reserved_special_token_186|>' is not marked as EOG
load: control token: 128190 '<|reserved_special_token_185|>' is not marked as EOG
load: control token: 128185 '<|reserved_special_token_180|>' is not marked as EOG
load: control token: 128184 '<|reserved_special_token_179|>' is not marked as EOG
load: control token: 128182 '<|reserved_special_token_177|>' is not marked as EOG
load: control token: 128181 '<|reserved_special_token_176|>' is not marked as EOG
load: control token: 128177 '<|reserved_special_token_172|>' is not marked as EOG
load: control token: 128176 '<|reserved_special_token_171|>' is not marked as EOG
load: control token: 128175 '<|reserved_special_token_170|>' is not marked as EOG
load: control token: 128174 '<|reserved_special_token_169|>' is not marked as EOG
load: control token: 128173 '<|reserved_special_token_168|>' is not marked as EOG
load: control token: 128172 '<|reserved_special_token_167|>' is not marked as EOG
load: control token: 128168 '<|reserved_special_token_163|>' is not marked as EOG
load: control token: 128167 '<|reserved_special_token_162|>' is not marked as EOG
load: control token: 128166 '<|reserved_special_token_161|>' is not marked as EOG
load: control token: 128165 '<|reserved_special_token_160|>' is not marked as EOG
load: control token: 128162 '<|reserved_special_token_157|>' is not marked as EOG
load: control token: 128159 '<|reserved_special_token_154|>' is not marked as EOG
load: control token: 128155 '<|reserved_special_token_150|>' is not marked as EOG
load: control token: 128153 '<|reserved_special_token_148|>' is not marked as EOG
load: control token: 128152 '<|reserved_special_token_147|>' is not marked as EOG
load: control token: 128151 '<|reserved_special_token_146|>' is not marked as EOG
load: control token: 128148 '<|reserved_special_token_143|>' is not marked as EOG
load: control token: 128146 '<|reserved_special_token_141|>' is not marked as EOG
load: control token: 128144 '<|reserved_special_token_139|>' is not marked as EOG
load: control token: 128143 '<|reserved_special_token_138|>' is not marked as EOG
load: control token: 128141 '<|reserved_special_token_136|>' is not marked as EOG
load: control token: 128139 '<|reserved_special_token_134|>' is not marked as EOG
load: control token: 128138 '<|reserved_special_token_133|>' is not marked as EOG
load: control token: 128135 '<|reserved_special_token_130|>' is not marked as EOG
load: control token: 128133 '<|reserved_special_token_128|>' is not marked as EOG
load: control token: 128132 '<|reserved_special_token_127|>' is not marked as EOG
load: control token: 128131 '<|reserved_special_token_126|>' is not marked as EOG
load: control token: 128130 '<|reserved_special_token_125|>' is not marked as EOG
load: control token: 128128 '<|reserved_special_token_123|>' is not marked as EOG
load: control token: 128125 '<|reserved_special_token_120|>' is not marked as EOG
load: control token: 128121 '<|reserved_special_token_116|>' is not marked as EOG
load: control token: 128120 '<|reserved_special_token_115|>' is not marked as EOG
load: control token: 128119 '<|reserved_special_token_114|>' is not marked as EOG
load: control token: 128116 '<|reserved_special_token_111|>' is not marked as EOG
load: control token: 128112 '<|reserved_special_token_107|>' is not marked as EOG
load: control token: 128109 '<|reserved_special_token_104|>' is not marked as EOG
load: control token: 128107 '<|reserved_special_token_102|>' is not marked as EOG
load: control token: 128106 '<|reserved_special_token_101|>' is not marked as EOG
load: control token: 128105 '<|reserved_special_token_100|>' is not marked as EOG
load: control token: 128103 '<|reserved_special_token_98|>' is not marked as EOG
load: control token: 128100 '<|reserved_special_token_95|>' is not marked as EOG
load: control token: 128099 '<|reserved_special_token_94|>' is not marked as EOG
load: control token: 128098 '<|reserved_special_token_93|>' is not marked as EOG
load: control token: 128094 '<|reserved_special_token_89|>' is not marked as EOG
load: control token: 128088 '<|reserved_special_token_83|>' is not marked as EOG
load: control token: 128087 '<|reserved_special_token_82|>' is not marked as EOG
load: control token: 128086 '<|reserved_special_token_81|>' is not marked as EOG
load: control token: 128084 '<|reserved_special_token_79|>' is not marked as EOG
load: control token: 128082 '<|reserved_special_token_77|>' is not marked as EOG
load: control token: 128078 '<|reserved_special_token_73|>' is not marked as EOG
load: control token: 128075 '<|reserved_special_token_70|>' is not marked as EOG
load: control token: 128073 '<|reserved_special_token_68|>' is not marked as EOG
load: control token: 128072 '<|reserved_special_token_67|>' is not marked as EOG
load: control token: 128070 '<|reserved_special_token_65|>' is not marked as EOG
load: control token: 128065 '<|reserved_special_token_60|>' is not marked as EOG
load: control token: 128064 '<|reserved_special_token_59|>' is not marked as EOG
load: control token: 128062 '<|reserved_special_token_57|>' is not marked as EOG
load: control token: 128060 '<|reserved_special_token_55|>' is not marked as EOG
load: control token: 128059 '<|reserved_special_token_54|>' is not marked as EOG
load: control token: 128057 '<|reserved_special_token_52|>' is not marked as EOG
load: control token: 128056 '<|reserved_special_token_51|>' is not marked as EOG
load: control token: 128054 '<|reserved_special_token_49|>' is not marked as EOG
load: control token: 128051 '<|reserved_special_token_46|>' is not marked as EOG
load: control token: 128043 '<|reserved_special_token_38|>' is not marked as EOG
load: control token: 128042 '<|reserved_special_token_37|>' is not marked as EOG
load: control token: 128041 '<|reserved_special_token_36|>' is not marked as EOG
load: control token: 128040 '<|reserved_special_token_35|>' is not marked as EOG
load: control token: 128035 '<|reserved_special_token_30|>' is not marked as EOG
load: control token: 128033 '<|reserved_special_token_28|>' is not marked as EOG
load: control token: 128032 '<|reserved_special_token_27|>' is not marked as EOG
load: control token: 128029 '<|reserved_special_token_24|>' is not marked as EOG
load: control token: 128025 '<|reserved_special_token_20|>' is not marked as EOG
load: control token: 128024 '<|reserved_special_token_19|>' is not marked as EOG
load: control token: 128021 '<|reserved_special_token_16|>' is not marked as EOG
load: control token: 128020 '<|reserved_special_token_15|>' is not marked as EOG
load: control token: 128019 '<|reserved_special_token_14|>' is not marked as EOG
load: control token: 128018 '<|reserved_special_token_13|>' is not marked as EOG
load: control token: 128015 '<|reserved_special_token_10|>' is not marked as EOG
load: control token: 128013 '<|reserved_special_token_8|>' is not marked as EOG
load: control token: 128012 '<|reserved_special_token_7|>' is not marked as EOG
load: control token: 128010 '<|reserved_special_token_5|>' is not marked as EOG
load: control token: 128005 '<|reserved_special_token_3|>' is not marked as EOG
load: control token: 128004 '<|reserved_special_token_2|>' is not marked as EOG
load: control token: 128002 '<|reserved_special_token_0|>' is not marked as EOG
load: control token: 128249 '<|reserved_special_token_244|>' is not marked as EOG
load: control token: 128187 '<|reserved_special_token_182|>' is not marked as EOG
load: control token: 128180 '<|reserved_special_token_175|>' is not marked as EOG
load: control token: 128134 '<|reserved_special_token_129|>' is not marked as EOG
load: control token: 128179 '<|reserved_special_token_174|>' is not marked as EOG
load: control token: 128037 '<|reserved_special_token_32|>' is not marked as EOG
load: control token: 128045 '<|reserved_special_token_40|>' is not marked as EOG
load: control token: 128089 '<|reserved_special_token_84|>' is not marked as EOG
load: control token: 128212 '<|reserved_special_token_207|>' is not marked as EOG
load: control token: 128104 '<|reserved_special_token_99|>' is not marked as EOG
load: control token: 128205 '<|reserved_special_token_200|>' is not marked as EOG
load: control token: 128142 '<|reserved_special_token_137|>' is not marked as EOG
load: control token: 128028 '<|reserved_special_token_23|>' is not marked as EOG
load: control token: 128126 '<|reserved_special_token_121|>' is not marked as EOG
load: control token: 128198 '<|reserved_special_token_193|>' is not marked as EOG
load: control token: 128071 '<|reserved_special_token_66|>' is not marked as EOG
load: control token: 128092 '<|reserved_special_token_87|>' is not marked as EOG
load: control token: 128183 '<|reserved_special_token_178|>' is not marked as EOG
load: control token: 128140 '<|reserved_special_token_135|>' is not marked as EOG
load: control token: 128226 '<|reserved_special_token_221|>' is not marked as EOG
load: control token: 128007 '<|end_header_id|>' is not marked as EOG
load: control token: 128052 '<|reserved_special_token_47|>' is not marked as EOG
load: control token: 128053 '<|reserved_special_token_48|>' is not marked as EOG
load: control token: 128058 '<|reserved_special_token_53|>' is not marked as EOG
load: control token: 128150 '<|reserved_special_token_145|>' is not marked as EOG
load: control token: 128149 '<|reserved_special_token_144|>' is not marked as EOG
load: control token: 128209 '<|reserved_special_token_204|>' is not marked as EOG
load: control token: 128169 '<|reserved_special_token_164|>' is not marked as EOG
load: control token: 128157 '<|reserved_special_token_152|>' is not marked as EOG
load: control token: 128038 '<|reserved_special_token_33|>' is not marked as EOG
load: control token: 128178 '<|reserved_special_token_173|>' is not marked as EOG
load: control token: 128091 '<|reserved_special_token_86|>' is not marked as EOG
load: control token: 128115 '<|reserved_special_token_110|>' is not marked as EOG
load: control token: 128233 '<|reserved_special_token_228|>' is not marked as EOG
load: control token: 128145 '<|reserved_special_token_140|>' is not marked as EOG
load: control token: 128039 '<|reserved_special_token_34|>' is not marked as EOG
load: control token: 128136 '<|reserved_special_token_131|>' is not marked as EOG
load: control token: 128170 '<|reserved_special_token_165|>' is not marked as EOG
load: control token: 128236 '<|reserved_special_token_231|>' is not marked as EOG
load: control token: 128154 '<|reserved_special_token_149|>' is not marked as EOG
load: control token: 128049 '<|reserved_special_token_44|>' is not marked as EOG
load: control token: 128023 '<|reserved_special_token_18|>' is not marked as EOG
load: control token: 128003 '<|reserved_special_token_1|>' is not marked as EOG
load: control token: 128016 '<|reserved_special_token_11|>' is not marked as EOG
load: control token: 128113 '<|reserved_special_token_108|>' is not marked as EOG
load: control token: 128158 '<|reserved_special_token_153|>' is not marked as EOG
load: control token: 128223 '<|reserved_special_token_218|>' is not marked as EOG
load: control token: 128156 '<|reserved_special_token_151|>' is not marked as EOG
load: control token: 128008 '<|reserved_special_token_4|>' is not marked as EOG
load: control token: 128085 '<|reserved_special_token_80|>' is not marked as EOG
load: control token: 128160 '<|reserved_special_token_155|>' is not marked as EOG
load: control token: 128001 '<|end_of_text|>' is not marked as EOG
load: control token: 128110 '<|reserved_special_token_105|>' is not marked as EOG
load: control token: 128247 '<|reserved_special_token_242|>' is not marked as EOG
load: control token: 128122 '<|reserved_special_token_117|>' is not marked as EOG
load: control token: 128050 '<|reserved_special_token_45|>' is not marked as EOG
load: control token: 128221 '<|reserved_special_token_216|>' is not marked as EOG
load: control token: 128244 '<|reserved_special_token_239|>' is not marked as EOG
load: control token: 128248 '<|reserved_special_token_243|>' is not marked as EOG
load: control token: 128213 '<|reserved_special_token_208|>' is not marked as EOG
load: control token: 128006 '<|start_header_id|>' is not marked as EOG
load: control token: 128208 '<|reserved_special_token_203|>' is not marked as EOG
load: control token: 128074 '<|reserved_special_token_69|>' is not marked as EOG
load: control token: 128234 '<|reserved_special_token_229|>' is not marked as EOG
load: control token: 128083 '<|reserved_special_token_78|>' is not marked as EOG
load: control token: 128224 '<|reserved_special_token_219|>' is not marked as EOG
load: control token: 128055 '<|reserved_special_token_50|>' is not marked as EOG
load: control token: 128097 '<|reserved_special_token_92|>' is not marked as EOG
load: control token: 128206 '<|reserved_special_token_201|>' is not marked as EOG
load: control token: 128081 '<|reserved_special_token_76|>' is not marked as EOG
load: control token: 128068 '<|reserved_special_token_63|>' is not marked as EOG
load: control token: 128067 '<|reserved_special_token_62|>' is not marked as EOG
load: control token: 128046 '<|reserved_special_token_41|>' is not marked as EOG
load: control token: 128194 '<|reserved_special_token_189|>' is not marked as EOG
load: control token: 128069 '<|reserved_special_token_64|>' is not marked as EOG
load: control token: 128000 '<|begin_of_text|>' is not marked as EOG
load: control token: 128220 '<|reserved_special_token_215|>' is not marked as EOG
load: control token: 128214 '<|reserved_special_token_209|>' is not marked as EOG
load: control token: 128108 '<|reserved_special_token_103|>' is not marked as EOG
load: control token: 128200 '<|reserved_special_token_195|>' is not marked as EOG
load: control token: 128048 '<|reserved_special_token_43|>' is not marked as EOG
load: control token: 128027 '<|reserved_special_token_22|>' is not marked as EOG
load: control token: 128114 '<|reserved_special_token_109|>' is not marked as EOG
load: control token: 128235 '<|reserved_special_token_230|>' is not marked as EOG
load: control token: 128252 '<|reserved_special_token_247|>' is not marked as EOG
load: control token: 128199 '<|reserved_special_token_194|>' is not marked as EOG
load: control token: 128129 '<|reserved_special_token_124|>' is not marked as EOG
load: control token: 128245 '<|reserved_special_token_240|>' is not marked as EOG
load: control token: 128164 '<|reserved_special_token_159|>' is not marked as EOG
load: control token: 128124 '<|reserved_special_token_119|>' is not marked as EOG
load: control token: 128102 '<|reserved_special_token_97|>' is not marked as EOG
load: control token: 128036 '<|reserved_special_token_31|>' is not marked as EOG
load: control token: 128229 '<|reserved_special_token_224|>' is not marked as EOG
load: control token: 128163 '<|reserved_special_token_158|>' is not marked as EOG
load: control token: 128127 '<|reserved_special_token_122|>' is not marked as EOG
load: control token: 128111 '<|reserved_special_token_106|>' is not marked as EOG
load: control token: 128231 '<|reserved_special_token_226|>' is not marked as EOG
load: control token: 128188 '<|reserved_special_token_183|>' is not marked as EOG
load: control token: 128061 '<|reserved_special_token_56|>' is not marked as EOG
load: control token: 128137 '<|reserved_special_token_132|>' is not marked as EOG
load: control token: 128093 '<|reserved_special_token_88|>' is not marked as EOG
load: control token: 128095 '<|reserved_special_token_90|>' is not marked as EOG
load: control token: 128189 '<|reserved_special_token_184|>' is not marked as EOG
load: control token: 128090 '<|reserved_special_token_85|>' is not marked as EOG
load: control token: 128147 '<|reserved_special_token_142|>' is not marked as EOG
load: control token: 128219 '<|reserved_special_token_214|>' is not marked as EOG
load: control token: 128230 '<|reserved_special_token_225|>' is not marked as EOG
load: control token: 128217 '<|reserved_special_token_212|>' is not marked as EOG
load: control token: 128031 '<|reserved_special_token_26|>' is not marked as EOG
load: control token: 128030 '<|reserved_special_token_25|>' is not marked as EOG
load: control token: 128250 '<|reserved_special_token_245|>' is not marked as EOG
load: control token: 128192 '<|reserved_special_token_187|>' is not marked as EOG
load: control token: 128096 '<|reserved_special_token_91|>' is not marked as EOG
load: control token: 128186 '<|reserved_special_token_181|>' is not marked as EOG
load: control token: 128207 '<|reserved_special_token_202|>' is not marked as EOG
load: control token: 128171 '<|reserved_special_token_166|>' is not marked as EOG
load: control token: 128080 '<|reserved_special_token_75|>' is not marked as EOG
load: control token: 128077 '<|reserved_special_token_72|>' is not marked as EOG
load: control token: 128101 '<|reserved_special_token_96|>' is not marked as EOG
load: control token: 128079 '<|reserved_special_token_74|>' is not marked as EOG
load: control token: 128216 '<|reserved_special_token_211|>' is not marked as EOG
load: control token: 128014 '<|reserved_special_token_9|>' is not marked as EOG
load: control token: 128047 '<|reserved_special_token_42|>' is not marked as EOG
load: control token: 128202 '<|reserved_special_token_197|>' is not marked as EOG
load: control token: 128044 '<|reserved_special_token_39|>' is not marked as EOG
load: control token: 128161 '<|reserved_special_token_156|>' is not marked as EOG
load: control token: 128017 '<|reserved_special_token_12|>' is not marked as EOG
load: control token: 128066 '<|reserved_special_token_61|>' is not marked as EOG
load: control token: 128242 '<|reserved_special_token_237|>' is not marked as EOG
load: control token: 128118 '<|reserved_special_token_113|>' is not marked as EOG
load: control token: 128076 '<|reserved_special_token_71|>' is not marked as EOG
load: control token: 128034 '<|reserved_special_token_29|>' is not marked as EOG
load: control token: 128241 '<|reserved_special_token_236|>' is not marked as EOG
load: control token: 128026 '<|reserved_special_token_21|>' is not marked as EOG
load: control token: 128218 '<|reserved_special_token_213|>' is not marked as EOG
load: control token: 128063 '<|reserved_special_token_58|>' is not marked as EOG
load: control token: 128117 '<|reserved_special_token_112|>' is not marked as EOG
load: control token: 128011 '<|reserved_special_token_6|>' is not marked as EOG
load: control token: 128022 '<|reserved_special_token_17|>' is not marked as EOG
load: control token: 128123 '<|reserved_special_token_118|>' is not marked as EOG
load: special tokens cache size = 256
load: token to piece cache size = 0.8000 MB
print_info: arch = llama
print_info: vocab_only = 1
print_info: model type = ?B
print_info: model params = 8.03 B
print_info: general.name = Meta-Llama-3-8B-Instruct
print_info: vocab type = BPE
print_info: n_vocab = 128256
print_info: n_merges = 280147
print_info: BOS token = 128000 '<|begin_of_text|>'
print_info: EOS token = 128009 '<|eot_id|>'
print_info: EOT token = 128009 '<|eot_id|>'
print_info: LF token = 198 'Ċ'
print_info: EOG token = 128009 '<|eot_id|>'
print_info: max token length = 256
llama_model_load: vocab only - skipping tensors
time=2025-08-02T22:45:51.838-07:00 level=INFO source=server.go:438 msg="starting llama server" cmd="/usr/bin/ollama runner --model /home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED --ctx-size 4096 --batch-size 512 --n-gpu-layers 33 --threads 8 --parallel 1 --port 42175"
time=2025-08-02T22:45:51.838-07:00 level=DEBUG source=server.go:439 msg=subprocess OLLAMA_DEBUG=1 PATH=/usr/local/sbin:/usr/local/bin:/usr/bin:/usr/lib/jvm/default/bin:/usr/bin/site_perl:/usr/bin/vendor_perl:/usr/bin/core_perl OLLAMA_MAX_LOADED_MODELS=3 OLLAMA_LIBRARY_PATH=/usr/lib/ollama LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama CUDA_VISIBLE_DEVICES=GPU-UUID-REDACTED
time=2025-08-02T22:45:51.838-07:00 level=INFO source=sched.go:481 msg="loaded runners" count=1
time=2025-08-02T22:45:51.838-07:00 level=INFO source=server.go:598 msg="waiting for llama runner to start responding"
time=2025-08-02T22:45:51.838-07:00 level=INFO source=server.go:632 msg="waiting for server to become available" status="llm server not responding"
time=2025-08-02T22:45:51.844-07:00 level=INFO source=runner.go:815 msg="starting go runner"
time=2025-08-02T22:45:51.845-07:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-icelake.so
time=2025-08-02T22:45:51.847-07:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.AVX512=1 CPU.0.AVX512_VBMI=1 CPU.0.AVX512_VNNI=1 CPU.0.AVX512_BF16=1 CPU.0.LLAMAFILE=1 CPU.1.SSE3=1 CPU.1.SSSE3=1 CPU.1.AVX=1 CPU.1.AVX2=1 CPU.1.F16C=1 CPU.1.FMA=1 CPU.1.BMI2=1 CPU.1.AVX512=1 CPU.1.AVX512_VBMI=1 CPU.1.AVX512_VNNI=1 CPU.1.AVX512_BF16=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
time=2025-08-02T22:45:51.848-07:00 level=INFO source=runner.go:874 msg="Server listening on 127.0.0.1:42175"
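Note the backend-load lines just above: before the server starts listening, only the CPU backend (`libggml-cpu-icelake.so`) is loaded from `/usr/lib/ollama`; no CUDA backend library ever appears, even though `CUDA_VISIBLE_DEVICES` is set in the subprocess environment. That matches the symptom (GPU detected, inference on CPU). A minimal sketch of that check on the library names (the helper name and the set of GPU backend tags are illustrative, not part of Ollama):

```python
def gpu_backend_present(lib_names):
    """Heuristic: does any ggml backend library name look GPU-capable?

    `lib_names` would come from listing the OLLAMA_LIBRARY_PATH directory
    (here /usr/lib/ollama, per the subprocess environment in the log).
    """
    gpu_tags = ("cuda", "hip", "rocm", "vulkan")
    return any(tag in name for name in lib_names for tag in gpu_tags)

# The debug log above shows only this backend being loaded:
print(gpu_backend_present(["libggml-cpu-icelake.so"]))                     # False
print(gpu_backend_present(["libggml-cpu-icelake.so", "libggml-cuda.so"]))  # True
```

If listing `/usr/lib/ollama` on the affected server shows no `libggml-cuda*.so` (or equivalent GPU backend), the install is missing the GPU runner libraries, which would explain CPU-only inference despite GPU detection.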
llama_model_loader: loaded meta data with 22 key-value pairs and 291 tensors from /home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED (version GGUF V3 (latest))
llama_model_loader: Dumping metadata keys/values. Note: KV overrides do not apply in this output.
llama_model_loader: - kv 0: general.architecture str = llama
llama_model_loader: - kv 1: general.name str = Meta-Llama-3-8B-Instruct
llama_model_loader: - kv 2: llama.block_count u32 = 32
llama_model_loader: - kv 3: llama.context_length u32 = 8192
llama_model_loader: - kv 4: llama.embedding_length u32 = 4096
llama_model_loader: - kv 5: llama.feed_forward_length u32 = 14336
llama_model_loader: - kv 6: llama.attention.head_count u32 = 32
llama_model_loader: - kv 7: llama.attention.head_count_kv u32 = 8
llama_model_loader: - kv 8: llama.rope.freq_base f32 = 500000.000000
llama_model_loader: - kv 9: llama.attention.layer_norm_rms_epsilon f32 = 0.000010
llama_model_loader: - kv 10: general.file_type u32 = 2
llama_model_loader: - kv 11: llama.vocab_size u32 = 128256
llama_model_loader: - kv 12: llama.rope.dimension_count u32 = 128
llama_model_loader: - kv 13: tokenizer.ggml.model str = gpt2
llama_model_loader: - kv 14: tokenizer.ggml.pre str = llama-bpe
llama_model_loader: - kv 15: tokenizer.ggml.tokens arr[str,128256] = ["!", "\"", "#", "$", "%", "&", "'", ...
llama_model_loader: - kv 16: tokenizer.ggml.token_type arr[i32,128256] = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, ...
llama_model_loader: - kv 17: tokenizer.ggml.merges arr[str,280147] = ["Ġ Ġ", "Ġ ĠĠĠ", "ĠĠ ĠĠ", "...
llama_model_loader: - kv 18: tokenizer.ggml.bos_token_id u32 = 128000
llama_model_loader: - kv 19: tokenizer.ggml.eos_token_id u32 = 128009
llama_model_loader: - kv 20: tokenizer.chat_template str = {% set loop_messages = messages %}{% ...
llama_model_loader: - kv 21: general.quantization_version u32 = 2
llama_model_loader: - type f32: 65 tensors
llama_model_loader: - type q4_0: 225 tensors
llama_model_loader: - type q6_K: 1 tensors
print_info: file format = GGUF V3 (latest)
print_info: file type = Q4_0
print_info: file size = 4.33 GiB (4.64 BPW)
init_tokenizer: initializing tokenizer for type 2
load: special tokens cache size = 256
load: token to piece cache size = 0.8000 MB
print_info: arch = llama
print_info: vocab_only = 0
print_info: n_ctx_train = 8192
print_info: n_embd = 4096
print_info: n_layer = 32
print_info: n_head = 32
print_info: n_head_kv = 8
print_info: n_rot = 128
print_info: n_swa = 0
print_info: n_swa_pattern = 1
print_info: n_embd_head_k = 128
print_info: n_embd_head_v = 128
print_info: n_gqa = 4
print_info: n_embd_k_gqa = 1024
print_info: n_embd_v_gqa = 1024
print_info: f_norm_eps = 0.0e+00
print_info: f_norm_rms_eps = 1.0e-05
print_info: f_clamp_kqv = 0.0e+00
print_info: f_max_alibi_bias = 0.0e+00
print_info: f_logit_scale = 0.0e+00
print_info: f_attn_scale = 0.0e+00
print_info: n_ff = 14336
print_info: n_expert = 0
print_info: n_expert_used = 0
print_info: causal attn = 1
print_info: pooling type = 0
print_info: rope type = 0
print_info: rope scaling = linear
print_info: freq_base_train = 500000.0
print_info: freq_scale_train = 1
print_info: n_ctx_orig_yarn = 8192
print_info: rope_finetuned = unknown
print_info: ssm_d_conv = 0
print_info: ssm_d_inner = 0
print_info: ssm_d_state = 0
print_info: ssm_dt_rank = 0
print_info: ssm_dt_b_c_rms = 0
print_info: model type = 8B
print_info: model params = 8.03 B
print_info: general.name = Meta-Llama-3-8B-Instruct
print_info: vocab type = BPE
print_info: n_vocab = 128256
print_info: n_merges = 280147
print_info: BOS token = 128000 '<|begin_of_text|>'
print_info: EOS token = 128009 '<|eot_id|>'
print_info: EOT token = 128009 '<|eot_id|>'
print_info: LF token = 198 'Ċ'
print_info: EOG token = 128009 '<|eot_id|>'
print_info: max token length = 256
load_tensors: loading model tensors, this can take a while... (mmap = true)
load_tensors: layer 0 assigned to device CPU, is_swa = 0
load_tensors: layer 1 assigned to device CPU, is_swa = 0
load_tensors: layer 2 assigned to device CPU, is_swa = 0
load_tensors: layer 3 assigned to device CPU, is_swa = 0
load_tensors: layer 4 assigned to device CPU, is_swa = 0
load_tensors: layer 5 assigned to device CPU, is_swa = 0
load_tensors: layer 6 assigned to device CPU, is_swa = 0
load_tensors: layer 7 assigned to device CPU, is_swa = 0
load_tensors: layer 8 assigned to device CPU, is_swa = 0
load_tensors: layer 9 assigned to device CPU, is_swa = 0
load_tensors: layer 10 assigned to device CPU, is_swa = 0
load_tensors: layer 11 assigned to device CPU, is_swa = 0
load_tensors: layer 12 assigned to device CPU, is_swa = 0
load_tensors: layer 13 assigned to device CPU, is_swa = 0
load_tensors: layer 14 assigned to device CPU, is_swa = 0
load_tensors: layer 15 assigned to device CPU, is_swa = 0
load_tensors: layer 16 assigned to device CPU, is_swa = 0
load_tensors: layer 17 assigned to device CPU, is_swa = 0
load_tensors: layer 18 assigned to device CPU, is_swa = 0
load_tensors: layer 19 assigned to device CPU, is_swa = 0
load_tensors: layer 20 assigned to device CPU, is_swa = 0
load_tensors: layer 21 assigned to device CPU, is_swa = 0
load_tensors: layer 22 assigned to device CPU, is_swa = 0
load_tensors: layer 23 assigned to device CPU, is_swa = 0
load_tensors: layer 24 assigned to device CPU, is_swa = 0
load_tensors: layer 25 assigned to device CPU, is_swa = 0
load_tensors: layer 26 assigned to device CPU, is_swa = 0
load_tensors: layer 27 assigned to device CPU, is_swa = 0
load_tensors: layer 28 assigned to device CPU, is_swa = 0
load_tensors: layer 29 assigned to device CPU, is_swa = 0
load_tensors: layer 30 assigned to device CPU, is_swa = 0
load_tensors: layer 31 assigned to device CPU, is_swa = 0
load_tensors: layer 32 assigned to device CPU, is_swa = 0
time=2025-08-02T22:45:52.090-07:00 level=INFO source=server.go:632 msg="waiting for server to become available" status="llm server loading model"
load_tensors: CPU_Mapped model buffer size = 4437.80 MiB
llama_context: constructing llama_context
llama_context: n_seq_max = 1
llama_context: n_ctx = 4096
llama_context: n_ctx_per_seq = 4096
llama_context: n_batch = 512
llama_context: n_ubatch = 512
llama_context: causal_attn = 1
llama_context: flash_attn = 0
llama_context: freq_base = 500000.0
llama_context: freq_scale = 1
llama_context: n_ctx_per_seq (4096) < n_ctx_train (8192) -- the full capacity of the model will not be utilized
set_abort_callback: call
llama_context: CPU output buffer size = 0.50 MiB
create_memory: n_ctx = 4096 (padded)
llama_kv_cache_unified: kv_size = 4096, type_k = 'f16', type_v = 'f16', n_layer = 32, can_shift = 1, padding = 32
llama_kv_cache_unified: layer 0: dev = CPU
llama_kv_cache_unified: layer 1: dev = CPU
llama_kv_cache_unified: layer 2: dev = CPU
llama_kv_cache_unified: layer 3: dev = CPU
llama_kv_cache_unified: layer 4: dev = CPU
llama_kv_cache_unified: layer 5: dev = CPU
llama_kv_cache_unified: layer 6: dev = CPU
llama_kv_cache_unified: layer 7: dev = CPU
llama_kv_cache_unified: layer 8: dev = CPU
llama_kv_cache_unified: layer 9: dev = CPU
llama_kv_cache_unified: layer 10: dev = CPU
llama_kv_cache_unified: layer 11: dev = CPU
llama_kv_cache_unified: layer 12: dev = CPU
llama_kv_cache_unified: layer 13: dev = CPU
llama_kv_cache_unified: layer 14: dev = CPU
llama_kv_cache_unified: layer 15: dev = CPU
llama_kv_cache_unified: layer 16: dev = CPU
llama_kv_cache_unified: layer 17: dev = CPU
llama_kv_cache_unified: layer 18: dev = CPU
llama_kv_cache_unified: layer 19: dev = CPU
llama_kv_cache_unified: layer 20: dev = CPU
llama_kv_cache_unified: layer 21: dev = CPU
llama_kv_cache_unified: layer 22: dev = CPU
llama_kv_cache_unified: layer 23: dev = CPU
llama_kv_cache_unified: layer 24: dev = CPU
llama_kv_cache_unified: layer 25: dev = CPU
llama_kv_cache_unified: layer 26: dev = CPU
llama_kv_cache_unified: layer 27: dev = CPU
llama_kv_cache_unified: layer 28: dev = CPU
llama_kv_cache_unified: layer 29: dev = CPU
llama_kv_cache_unified: layer 30: dev = CPU
llama_kv_cache_unified: layer 31: dev = CPU
llama_kv_cache_unified: CPU KV buffer size = 512.00 MiB
llama_kv_cache_unified: KV self size = 512.00 MiB, K (f16): 256.00 MiB, V (f16): 256.00 MiB
llama_context: enumerating backends
llama_context: backend_ptrs.size() = 1
llama_context: max_nodes = 65536
llama_context: worst-case: n_tokens = 512, n_seqs = 1, n_outputs = 0
llama_context: reserving graph for n_tokens = 512, n_seqs = 1
llama_context: reserving graph for n_tokens = 1, n_seqs = 1
llama_context: reserving graph for n_tokens = 512, n_seqs = 1
llama_context: CPU compute buffer size = 296.01 MiB
llama_context: graph nodes = 1094
llama_context: graph splits = 1
time=2025-08-02T22:45:52.341-07:00 level=INFO source=server.go:637 msg="llama runner started in 0.50 seconds"
time=2025-08-02T22:45:52.341-07:00 level=DEBUG source=sched.go:493 msg="finished setting up" runner.name=registry.ollama.ai/library/llama3:8b runner.inference=cuda runner.devices=1 runner.size="5.4 GiB" runner.vram="5.4 GiB" runner.parallel=1 runner.pid=REDACTED runner.model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED runner.num_ctx=4096
[GIN] 2025/08/02 - 22:45:52 | 200 | 913.876018ms | 127.0.0.1 | POST "/api/generate"
time=2025-08-02T22:45:52.341-07:00 level=DEBUG source=sched.go:501 msg="context for request finished"
time=2025-08-02T22:45:52.341-07:00 level=DEBUG source=sched.go:341 msg="runner with non-zero duration has gone idle, adding timer" runner.name=registry.ollama.ai/library/llama3:8b runner.inference=cuda runner.devices=1 runner.size="5.4 GiB" runner.vram="5.4 GiB" runner.parallel=1 runner.pid=REDACTED runner.model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED runner.num_ctx=4096 duration=5m0s
time=2025-08-02T22:45:52.341-07:00 level=DEBUG source=sched.go:359 msg="after processing request finished event" runner.name=registry.ollama.ai/library/llama3:8b runner.inference=cuda runner.devices=1 runner.size="5.4 GiB" runner.vram="5.4 GiB" runner.parallel=1 runner.pid=REDACTED runner.model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED runner.num_ctx=4096 refCount=0
time=2025-08-02T22:46:01.318-07:00 level=DEBUG source=sched.go:613 msg="evaluating already loaded" model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED
time=2025-08-02T22:46:01.318-07:00 level=DEBUG source=server.go:736 msg="completion request" images=0 prompt=106 format=""
time=2025-08-02T22:46:01.318-07:00 level=DEBUG source=cache.go:104 msg="loading cache slot" id=0 cache=0 prompt=11 used=0 remaining=11
[GIN] 2025/08/02 - 22:46:04 | 200 | 3.625428917s | 127.0.0.1 | POST "/api/chat"
time=2025-08-02T22:46:04.918-07:00 level=DEBUG source=sched.go:432 msg="context for request finished" runner.name=registry.ollama.ai/library/llama3:8b runner.inference=cuda runner.devices=1 runner.size="5.4 GiB" runner.vram="5.4 GiB" runner.parallel=1 runner.pid=REDACTED runner.model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED runner.num_ctx=4096
time=2025-08-02T22:46:04.918-07:00 level=DEBUG source=sched.go:341 msg="runner with non-zero duration has gone idle, adding timer" runner.name=registry.ollama.ai/library/llama3:8b runner.inference=cuda runner.devices=1 runner.size="5.4 GiB" runner.vram="5.4 GiB" runner.parallel=1 runner.pid=REDACTED runner.model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED runner.num_ctx=4096 duration=5m0s
time=2025-08-02T22:46:04.918-07:00 level=DEBUG source=sched.go:359 msg="after processing request finished event" runner.name=registry.ollama.ai/library/llama3:8b runner.inference=cuda runner.devices=1 runner.size="5.4 GiB" runner.vram="5.4 GiB" runner.parallel=1 runner.pid=REDACTED runner.model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED runner.num_ctx=4096 refCount=0

<!-- gh-comment-id:3147313328 --> @dcmoore commented on GitHub (Aug 3, 2025): Here's a full log with some machine-specific things (memory addresses, username, etc.) redacted. `OLLAMA_DEBUG=1 ollama serve`
load: control token: 128169 '<|reserved_special_token_164|>' is not marked as EOG load: control token: 128157 '<|reserved_special_token_152|>' is not marked as EOG load: control token: 128038 '<|reserved_special_token_33|>' is not marked as EOG load: control token: 128178 '<|reserved_special_token_173|>' is not marked as EOG load: control token: 128091 '<|reserved_special_token_86|>' is not marked as EOG load: control token: 128115 '<|reserved_special_token_110|>' is not marked as EOG load: control token: 128233 '<|reserved_special_token_228|>' is not marked as EOG load: control token: 128145 '<|reserved_special_token_140|>' is not marked as EOG load: control token: 128039 '<|reserved_special_token_34|>' is not marked as EOG load: control token: 128136 '<|reserved_special_token_131|>' is not marked as EOG load: control token: 128170 '<|reserved_special_token_165|>' is not marked as EOG load: control token: 128236 '<|reserved_special_token_231|>' is not marked as EOG load: control token: 128154 '<|reserved_special_token_149|>' is not marked as EOG load: control token: 128049 '<|reserved_special_token_44|>' is not marked as EOG load: control token: 128023 '<|reserved_special_token_18|>' is not marked as EOG load: control token: 128003 '<|reserved_special_token_1|>' is not marked as EOG load: control token: 128016 '<|reserved_special_token_11|>' is not marked as EOG load: control token: 128113 '<|reserved_special_token_108|>' is not marked as EOG load: control token: 128158 '<|reserved_special_token_153|>' is not marked as EOG load: control token: 128223 '<|reserved_special_token_218|>' is not marked as EOG load: control token: 128156 '<|reserved_special_token_151|>' is not marked as EOG load: control token: 128008 '<|reserved_special_token_4|>' is not marked as EOG load: control token: 128085 '<|reserved_special_token_80|>' is not marked as EOG load: control token: 128160 '<|reserved_special_token_155|>' is not marked as EOG load: control token: 128001 
'<|end_of_text|>' is not marked as EOG load: control token: 128110 '<|reserved_special_token_105|>' is not marked as EOG load: control token: 128247 '<|reserved_special_token_242|>' is not marked as EOG load: control token: 128122 '<|reserved_special_token_117|>' is not marked as EOG load: control token: 128050 '<|reserved_special_token_45|>' is not marked as EOG load: control token: 128221 '<|reserved_special_token_216|>' is not marked as EOG load: control token: 128244 '<|reserved_special_token_239|>' is not marked as EOG load: control token: 128248 '<|reserved_special_token_243|>' is not marked as EOG load: control token: 128213 '<|reserved_special_token_208|>' is not marked as EOG load: control token: 128006 '<|start_header_id|>' is not marked as EOG load: control token: 128208 '<|reserved_special_token_203|>' is not marked as EOG load: control token: 128074 '<|reserved_special_token_69|>' is not marked as EOG load: control token: 128234 '<|reserved_special_token_229|>' is not marked as EOG load: control token: 128083 '<|reserved_special_token_78|>' is not marked as EOG load: control token: 128224 '<|reserved_special_token_219|>' is not marked as EOG load: control token: 128055 '<|reserved_special_token_50|>' is not marked as EOG load: control token: 128097 '<|reserved_special_token_92|>' is not marked as EOG load: control token: 128206 '<|reserved_special_token_201|>' is not marked as EOG load: control token: 128081 '<|reserved_special_token_76|>' is not marked as EOG load: control token: 128068 '<|reserved_special_token_63|>' is not marked as EOG load: control token: 128067 '<|reserved_special_token_62|>' is not marked as EOG load: control token: 128046 '<|reserved_special_token_41|>' is not marked as EOG load: control token: 128194 '<|reserved_special_token_189|>' is not marked as EOG load: control token: 128069 '<|reserved_special_token_64|>' is not marked as EOG load: control token: 128000 '<|begin_of_text|>' is not marked as EOG load: control token: 
128220 '<|reserved_special_token_215|>' is not marked as EOG load: control token: 128214 '<|reserved_special_token_209|>' is not marked as EOG load: control token: 128108 '<|reserved_special_token_103|>' is not marked as EOG load: control token: 128200 '<|reserved_special_token_195|>' is not marked as EOG load: control token: 128048 '<|reserved_special_token_43|>' is not marked as EOG load: control token: 128027 '<|reserved_special_token_22|>' is not marked as EOG load: control token: 128114 '<|reserved_special_token_109|>' is not marked as EOG load: control token: 128235 '<|reserved_special_token_230|>' is not marked as EOG load: control token: 128252 '<|reserved_special_token_247|>' is not marked as EOG load: control token: 128199 '<|reserved_special_token_194|>' is not marked as EOG load: control token: 128129 '<|reserved_special_token_124|>' is not marked as EOG load: control token: 128245 '<|reserved_special_token_240|>' is not marked as EOG load: control token: 128164 '<|reserved_special_token_159|>' is not marked as EOG load: control token: 128124 '<|reserved_special_token_119|>' is not marked as EOG load: control token: 128102 '<|reserved_special_token_97|>' is not marked as EOG load: control token: 128036 '<|reserved_special_token_31|>' is not marked as EOG load: control token: 128229 '<|reserved_special_token_224|>' is not marked as EOG load: control token: 128163 '<|reserved_special_token_158|>' is not marked as EOG load: control token: 128127 '<|reserved_special_token_122|>' is not marked as EOG load: control token: 128111 '<|reserved_special_token_106|>' is not marked as EOG load: control token: 128231 '<|reserved_special_token_226|>' is not marked as EOG load: control token: 128188 '<|reserved_special_token_183|>' is not marked as EOG load: control token: 128061 '<|reserved_special_token_56|>' is not marked as EOG load: control token: 128137 '<|reserved_special_token_132|>' is not marked as EOG load: control token: 128093 
'<|reserved_special_token_88|>' is not marked as EOG load: control token: 128095 '<|reserved_special_token_90|>' is not marked as EOG load: control token: 128189 '<|reserved_special_token_184|>' is not marked as EOG load: control token: 128090 '<|reserved_special_token_85|>' is not marked as EOG load: control token: 128147 '<|reserved_special_token_142|>' is not marked as EOG load: control token: 128219 '<|reserved_special_token_214|>' is not marked as EOG load: control token: 128230 '<|reserved_special_token_225|>' is not marked as EOG load: control token: 128217 '<|reserved_special_token_212|>' is not marked as EOG load: control token: 128031 '<|reserved_special_token_26|>' is not marked as EOG load: control token: 128030 '<|reserved_special_token_25|>' is not marked as EOG load: control token: 128250 '<|reserved_special_token_245|>' is not marked as EOG load: control token: 128192 '<|reserved_special_token_187|>' is not marked as EOG load: control token: 128096 '<|reserved_special_token_91|>' is not marked as EOG load: control token: 128186 '<|reserved_special_token_181|>' is not marked as EOG load: control token: 128207 '<|reserved_special_token_202|>' is not marked as EOG load: control token: 128171 '<|reserved_special_token_166|>' is not marked as EOG load: control token: 128080 '<|reserved_special_token_75|>' is not marked as EOG load: control token: 128077 '<|reserved_special_token_72|>' is not marked as EOG load: control token: 128101 '<|reserved_special_token_96|>' is not marked as EOG load: control token: 128079 '<|reserved_special_token_74|>' is not marked as EOG load: control token: 128216 '<|reserved_special_token_211|>' is not marked as EOG load: control token: 128014 '<|reserved_special_token_9|>' is not marked as EOG load: control token: 128047 '<|reserved_special_token_42|>' is not marked as EOG load: control token: 128202 '<|reserved_special_token_197|>' is not marked as EOG load: control token: 128044 '<|reserved_special_token_39|>' is not 
marked as EOG load: control token: 128161 '<|reserved_special_token_156|>' is not marked as EOG load: control token: 128017 '<|reserved_special_token_12|>' is not marked as EOG load: control token: 128066 '<|reserved_special_token_61|>' is not marked as EOG load: control token: 128242 '<|reserved_special_token_237|>' is not marked as EOG load: control token: 128118 '<|reserved_special_token_113|>' is not marked as EOG load: control token: 128076 '<|reserved_special_token_71|>' is not marked as EOG load: control token: 128034 '<|reserved_special_token_29|>' is not marked as EOG load: control token: 128241 '<|reserved_special_token_236|>' is not marked as EOG load: control token: 128026 '<|reserved_special_token_21|>' is not marked as EOG load: control token: 128218 '<|reserved_special_token_213|>' is not marked as EOG load: control token: 128063 '<|reserved_special_token_58|>' is not marked as EOG load: control token: 128117 '<|reserved_special_token_112|>' is not marked as EOG load: control token: 128011 '<|reserved_special_token_6|>' is not marked as EOG load: control token: 128022 '<|reserved_special_token_17|>' is not marked as EOG load: control token: 128123 '<|reserved_special_token_118|>' is not marked as EOG
load: special tokens cache size = 256
load: token to piece cache size = 0.8000 MB
print_info: arch = llama
print_info: vocab_only = 1
print_info: model type = ?B
print_info: model params = 8.03 B
print_info: general.name = Meta-Llama-3-8B-Instruct
print_info: vocab type = BPE
print_info: n_vocab = 128256
print_info: n_merges = 280147
print_info: BOS token = 128000 '<|begin_of_text|>'
print_info: EOS token = 128009 '<|eot_id|>'
print_info: EOT token = 128009 '<|eot_id|>'
print_info: LF token = 198 'Ċ'
print_info: EOG token = 128009 '<|eot_id|>'
print_info: max token length = 256
llama_model_load: vocab only - skipping tensors
time=2025-08-02T22:45:51.838-07:00 level=INFO source=server.go:438 msg="starting llama server" cmd="/usr/bin/ollama runner --model
/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED --ctx-size 4096 --batch-size 512 --n-gpu-layers 33 --threads 8 --parallel 1 --port 42175"
time=2025-08-02T22:45:51.838-07:00 level=DEBUG source=server.go:439 msg=subprocess OLLAMA_DEBUG=1 PATH=/usr/local/sbin:/usr/local/bin:/usr/bin:/usr/lib/jvm/default/bin:/usr/bin/site_perl:/usr/bin/vendor_perl:/usr/bin/core_perl OLLAMA_MAX_LOADED_MODELS=3 OLLAMA_LIBRARY_PATH=/usr/lib/ollama LD_LIBRARY_PATH=/usr/lib/ollama:/usr/lib/ollama CUDA_VISIBLE_DEVICES=GPU-UUID-REDACTED
time=2025-08-02T22:45:51.838-07:00 level=INFO source=sched.go:481 msg="loaded runners" count=1
time=2025-08-02T22:45:51.838-07:00 level=INFO source=server.go:598 msg="waiting for llama runner to start responding"
time=2025-08-02T22:45:51.838-07:00 level=INFO source=server.go:632 msg="waiting for server to become available" status="llm server not responding"
time=2025-08-02T22:45:51.844-07:00 level=INFO source=runner.go:815 msg="starting go runner"
time=2025-08-02T22:45:51.845-07:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-icelake.so
time=2025-08-02T22:45:51.847-07:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.AVX512=1 CPU.0.AVX512_VBMI=1 CPU.0.AVX512_VNNI=1 CPU.0.AVX512_BF16=1 CPU.0.LLAMAFILE=1 CPU.1.SSE3=1 CPU.1.SSSE3=1 CPU.1.AVX=1 CPU.1.AVX2=1 CPU.1.F16C=1 CPU.1.FMA=1 CPU.1.BMI2=1 CPU.1.AVX512=1 CPU.1.AVX512_VBMI=1 CPU.1.AVX512_VNNI=1 CPU.1.AVX512_BF16=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
time=2025-08-02T22:45:51.848-07:00 level=INFO source=runner.go:874 msg="Server listening on 127.0.0.1:42175"
llama_model_loader: loaded meta data with 22 key-value pairs and 291 tensors from /home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED (version GGUF V3 (latest))
llama_model_loader: Dumping metadata keys/values.
Note: KV overrides do not apply in this output.
llama_model_loader: - kv 0: general.architecture str = llama
llama_model_loader: - kv 1: general.name str = Meta-Llama-3-8B-Instruct
llama_model_loader: - kv 2: llama.block_count u32 = 32
llama_model_loader: - kv 3: llama.context_length u32 = 8192
llama_model_loader: - kv 4: llama.embedding_length u32 = 4096
llama_model_loader: - kv 5: llama.feed_forward_length u32 = 14336
llama_model_loader: - kv 6: llama.attention.head_count u32 = 32
llama_model_loader: - kv 7: llama.attention.head_count_kv u32 = 8
llama_model_loader: - kv 8: llama.rope.freq_base f32 = 500000.000000
llama_model_loader: - kv 9: llama.attention.layer_norm_rms_epsilon f32 = 0.000010
llama_model_loader: - kv 10: general.file_type u32 = 2
llama_model_loader: - kv 11: llama.vocab_size u32 = 128256
llama_model_loader: - kv 12: llama.rope.dimension_count u32 = 128
llama_model_loader: - kv 13: tokenizer.ggml.model str = gpt2
llama_model_loader: - kv 14: tokenizer.ggml.pre str = llama-bpe
llama_model_loader: - kv 15: tokenizer.ggml.tokens arr[str,128256] = ["!", "\"", "#", "$", "%", "&", "'", ...
llama_model_loader: - kv 16: tokenizer.ggml.token_type arr[i32,128256] = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, ...
llama_model_loader: - kv 17: tokenizer.ggml.merges arr[str,280147] = ["Ġ Ġ", "Ġ ĠĠĠ", "ĠĠ ĠĠ", "...
llama_model_loader: - kv 18: tokenizer.ggml.bos_token_id u32 = 128000
llama_model_loader: - kv 19: tokenizer.ggml.eos_token_id u32 = 128009
llama_model_loader: - kv 20: tokenizer.chat_template str = {% set loop_messages = messages %}{% ...
llama_model_loader: - kv 21: general.quantization_version u32 = 2
llama_model_loader: - type f32: 65 tensors
llama_model_loader: - type q4_0: 225 tensors
llama_model_loader: - type q6_K: 1 tensors
print_info: file format = GGUF V3 (latest)
print_info: file type = Q4_0
print_info: file size = 4.33 GiB (4.64 BPW)
init_tokenizer: initializing tokenizer for type 2
load: control token: 128255 '<|reserved_special_token_250|>' is not marked as EOG load: control token: 128254 '<|reserved_special_token_249|>' is not marked as EOG load: control token: 128253 '<|reserved_special_token_248|>' is not marked as EOG load: control token: 128251 '<|reserved_special_token_246|>' is not marked as EOG load: control token: 128246 '<|reserved_special_token_241|>' is not marked as EOG load: control token: 128243 '<|reserved_special_token_238|>' is not marked as EOG load: control token: 128240 '<|reserved_special_token_235|>' is not marked as EOG load: control token: 128239 '<|reserved_special_token_234|>' is not marked as EOG load: control token: 128238 '<|reserved_special_token_233|>' is not marked as EOG load: control token: 128237 '<|reserved_special_token_232|>' is not marked as EOG load: control token: 128232 '<|reserved_special_token_227|>' is not marked as EOG load: control token: 128228 '<|reserved_special_token_223|>' is not marked as EOG load: control token: 128227 '<|reserved_special_token_222|>' is not marked as EOG load: control token: 128225 '<|reserved_special_token_220|>' is not marked as EOG load: control token: 128222 '<|reserved_special_token_217|>' is not marked as EOG load: control token: 128215 '<|reserved_special_token_210|>' is not marked as EOG load: control token: 128211 '<|reserved_special_token_206|>' is not marked as EOG load: control token: 128210 '<|reserved_special_token_205|>' is not marked as EOG load: control token: 128204 '<|reserved_special_token_199|>' is not marked as EOG load: control token: 128203 '<|reserved_special_token_198|>' is not marked as
EOG load: control token: 128201 '<|reserved_special_token_196|>' is not marked as EOG load: control token: 128197 '<|reserved_special_token_192|>' is not marked as EOG load: control token: 128196 '<|reserved_special_token_191|>' is not marked as EOG load: control token: 128195 '<|reserved_special_token_190|>' is not marked as EOG load: control token: 128193 '<|reserved_special_token_188|>' is not marked as EOG load: control token: 128191 '<|reserved_special_token_186|>' is not marked as EOG load: control token: 128190 '<|reserved_special_token_185|>' is not marked as EOG load: control token: 128185 '<|reserved_special_token_180|>' is not marked as EOG load: control token: 128184 '<|reserved_special_token_179|>' is not marked as EOG load: control token: 128182 '<|reserved_special_token_177|>' is not marked as EOG load: control token: 128181 '<|reserved_special_token_176|>' is not marked as EOG load: control token: 128177 '<|reserved_special_token_172|>' is not marked as EOG load: control token: 128176 '<|reserved_special_token_171|>' is not marked as EOG load: control token: 128175 '<|reserved_special_token_170|>' is not marked as EOG load: control token: 128174 '<|reserved_special_token_169|>' is not marked as EOG load: control token: 128173 '<|reserved_special_token_168|>' is not marked as EOG load: control token: 128172 '<|reserved_special_token_167|>' is not marked as EOG load: control token: 128168 '<|reserved_special_token_163|>' is not marked as EOG load: control token: 128167 '<|reserved_special_token_162|>' is not marked as EOG load: control token: 128166 '<|reserved_special_token_161|>' is not marked as EOG load: control token: 128165 '<|reserved_special_token_160|>' is not marked as EOG load: control token: 128162 '<|reserved_special_token_157|>' is not marked as EOG load: control token: 128159 '<|reserved_special_token_154|>' is not marked as EOG load: control token: 128155 '<|reserved_special_token_150|>' is not marked as EOG load: control token: 128153 
'<|reserved_special_token_148|>' is not marked as EOG load: control token: 128152 '<|reserved_special_token_147|>' is not marked as EOG load: control token: 128151 '<|reserved_special_token_146|>' is not marked as EOG load: control token: 128148 '<|reserved_special_token_143|>' is not marked as EOG load: control token: 128146 '<|reserved_special_token_141|>' is not marked as EOG load: control token: 128144 '<|reserved_special_token_139|>' is not marked as EOG load: control token: 128143 '<|reserved_special_token_138|>' is not marked as EOG load: control token: 128141 '<|reserved_special_token_136|>' is not marked as EOG load: control token: 128139 '<|reserved_special_token_134|>' is not marked as EOG load: control token: 128138 '<|reserved_special_token_133|>' is not marked as EOG load: control token: 128135 '<|reserved_special_token_130|>' is not marked as EOG load: control token: 128133 '<|reserved_special_token_128|>' is not marked as EOG load: control token: 128132 '<|reserved_special_token_127|>' is not marked as EOG load: control token: 128131 '<|reserved_special_token_126|>' is not marked as EOG load: control token: 128130 '<|reserved_special_token_125|>' is not marked as EOG load: control token: 128128 '<|reserved_special_token_123|>' is not marked as EOG load: control token: 128125 '<|reserved_special_token_120|>' is not marked as EOG load: control token: 128121 '<|reserved_special_token_116|>' is not marked as EOG load: control token: 128120 '<|reserved_special_token_115|>' is not marked as EOG load: control token: 128119 '<|reserved_special_token_114|>' is not marked as EOG load: control token: 128116 '<|reserved_special_token_111|>' is not marked as EOG load: control token: 128112 '<|reserved_special_token_107|>' is not marked as EOG load: control token: 128109 '<|reserved_special_token_104|>' is not marked as EOG load: control token: 128107 '<|reserved_special_token_102|>' is not marked as EOG load: control token: 128106 
'<|reserved_special_token_101|>' is not marked as EOG load: control token: 128105 '<|reserved_special_token_100|>' is not marked as EOG load: control token: 128103 '<|reserved_special_token_98|>' is not marked as EOG load: control token: 128100 '<|reserved_special_token_95|>' is not marked as EOG load: control token: 128099 '<|reserved_special_token_94|>' is not marked as EOG load: control token: 128098 '<|reserved_special_token_93|>' is not marked as EOG load: control token: 128094 '<|reserved_special_token_89|>' is not marked as EOG load: control token: 128088 '<|reserved_special_token_83|>' is not marked as EOG load: control token: 128087 '<|reserved_special_token_82|>' is not marked as EOG load: control token: 128086 '<|reserved_special_token_81|>' is not marked as EOG load: control token: 128084 '<|reserved_special_token_79|>' is not marked as EOG load: control token: 128082 '<|reserved_special_token_77|>' is not marked as EOG load: control token: 128078 '<|reserved_special_token_73|>' is not marked as EOG load: control token: 128075 '<|reserved_special_token_70|>' is not marked as EOG load: control token: 128073 '<|reserved_special_token_68|>' is not marked as EOG load: control token: 128072 '<|reserved_special_token_67|>' is not marked as EOG load: control token: 128070 '<|reserved_special_token_65|>' is not marked as EOG load: control token: 128065 '<|reserved_special_token_60|>' is not marked as EOG load: control token: 128064 '<|reserved_special_token_59|>' is not marked as EOG load: control token: 128062 '<|reserved_special_token_57|>' is not marked as EOG load: control token: 128060 '<|reserved_special_token_55|>' is not marked as EOG load: control token: 128059 '<|reserved_special_token_54|>' is not marked as EOG load: control token: 128057 '<|reserved_special_token_52|>' is not marked as EOG load: control token: 128056 '<|reserved_special_token_51|>' is not marked as EOG load: control token: 128054 '<|reserved_special_token_49|>' is not marked as EOG 
load: control token: 128051 '<|reserved_special_token_46|>' is not marked as EOG load: control token: 128043 '<|reserved_special_token_38|>' is not marked as EOG load: control token: 128042 '<|reserved_special_token_37|>' is not marked as EOG load: control token: 128041 '<|reserved_special_token_36|>' is not marked as EOG load: control token: 128040 '<|reserved_special_token_35|>' is not marked as EOG load: control token: 128035 '<|reserved_special_token_30|>' is not marked as EOG load: control token: 128033 '<|reserved_special_token_28|>' is not marked as EOG load: control token: 128032 '<|reserved_special_token_27|>' is not marked as EOG load: control token: 128029 '<|reserved_special_token_24|>' is not marked as EOG load: control token: 128025 '<|reserved_special_token_20|>' is not marked as EOG load: control token: 128024 '<|reserved_special_token_19|>' is not marked as EOG load: control token: 128021 '<|reserved_special_token_16|>' is not marked as EOG load: control token: 128020 '<|reserved_special_token_15|>' is not marked as EOG load: control token: 128019 '<|reserved_special_token_14|>' is not marked as EOG load: control token: 128018 '<|reserved_special_token_13|>' is not marked as EOG load: control token: 128015 '<|reserved_special_token_10|>' is not marked as EOG load: control token: 128013 '<|reserved_special_token_8|>' is not marked as EOG load: control token: 128012 '<|reserved_special_token_7|>' is not marked as EOG load: control token: 128010 '<|reserved_special_token_5|>' is not marked as EOG load: control token: 128005 '<|reserved_special_token_3|>' is not marked as EOG load: control token: 128004 '<|reserved_special_token_2|>' is not marked as EOG load: control token: 128002 '<|reserved_special_token_0|>' is not marked as EOG load: control token: 128249 '<|reserved_special_token_244|>' is not marked as EOG load: control token: 128187 '<|reserved_special_token_182|>' is not marked as EOG load: control token: 128180 
'<|reserved_special_token_175|>' is not marked as EOG load: control token: 128134 '<|reserved_special_token_129|>' is not marked as EOG load: control token: 128179 '<|reserved_special_token_174|>' is not marked as EOG load: control token: 128037 '<|reserved_special_token_32|>' is not marked as EOG load: control token: 128045 '<|reserved_special_token_40|>' is not marked as EOG load: control token: 128089 '<|reserved_special_token_84|>' is not marked as EOG load: control token: 128212 '<|reserved_special_token_207|>' is not marked as EOG load: control token: 128104 '<|reserved_special_token_99|>' is not marked as EOG load: control token: 128205 '<|reserved_special_token_200|>' is not marked as EOG load: control token: 128142 '<|reserved_special_token_137|>' is not marked as EOG load: control token: 128028 '<|reserved_special_token_23|>' is not marked as EOG load: control token: 128126 '<|reserved_special_token_121|>' is not marked as EOG load: control token: 128198 '<|reserved_special_token_193|>' is not marked as EOG load: control token: 128071 '<|reserved_special_token_66|>' is not marked as EOG load: control token: 128092 '<|reserved_special_token_87|>' is not marked as EOG load: control token: 128183 '<|reserved_special_token_178|>' is not marked as EOG load: control token: 128140 '<|reserved_special_token_135|>' is not marked as EOG load: control token: 128226 '<|reserved_special_token_221|>' is not marked as EOG load: control token: 128007 '<|end_header_id|>' is not marked as EOG load: control token: 128052 '<|reserved_special_token_47|>' is not marked as EOG load: control token: 128053 '<|reserved_special_token_48|>' is not marked as EOG load: control token: 128058 '<|reserved_special_token_53|>' is not marked as EOG load: control token: 128150 '<|reserved_special_token_145|>' is not marked as EOG load: control token: 128149 '<|reserved_special_token_144|>' is not marked as EOG load: control token: 128209 '<|reserved_special_token_204|>' is not marked as EOG 
load: control token: 128169 '<|reserved_special_token_164|>' is not marked as EOG load: control token: 128157 '<|reserved_special_token_152|>' is not marked as EOG load: control token: 128038 '<|reserved_special_token_33|>' is not marked as EOG load: control token: 128178 '<|reserved_special_token_173|>' is not marked as EOG load: control token: 128091 '<|reserved_special_token_86|>' is not marked as EOG load: control token: 128115 '<|reserved_special_token_110|>' is not marked as EOG load: control token: 128233 '<|reserved_special_token_228|>' is not marked as EOG load: control token: 128145 '<|reserved_special_token_140|>' is not marked as EOG load: control token: 128039 '<|reserved_special_token_34|>' is not marked as EOG load: control token: 128136 '<|reserved_special_token_131|>' is not marked as EOG load: control token: 128170 '<|reserved_special_token_165|>' is not marked as EOG load: control token: 128236 '<|reserved_special_token_231|>' is not marked as EOG load: control token: 128154 '<|reserved_special_token_149|>' is not marked as EOG load: control token: 128049 '<|reserved_special_token_44|>' is not marked as EOG load: control token: 128023 '<|reserved_special_token_18|>' is not marked as EOG load: control token: 128003 '<|reserved_special_token_1|>' is not marked as EOG load: control token: 128016 '<|reserved_special_token_11|>' is not marked as EOG load: control token: 128113 '<|reserved_special_token_108|>' is not marked as EOG load: control token: 128158 '<|reserved_special_token_153|>' is not marked as EOG load: control token: 128223 '<|reserved_special_token_218|>' is not marked as EOG load: control token: 128156 '<|reserved_special_token_151|>' is not marked as EOG load: control token: 128008 '<|reserved_special_token_4|>' is not marked as EOG load: control token: 128085 '<|reserved_special_token_80|>' is not marked as EOG load: control token: 128160 '<|reserved_special_token_155|>' is not marked as EOG load: control token: 128001 
'<|end_of_text|>' is not marked as EOG
load: control token: 128110 '<|reserved_special_token_105|>' is not marked as EOG
load: control token: 128247 '<|reserved_special_token_242|>' is not marked as EOG
[… identical "is not marked as EOG" lines for the remaining reserved special tokens elided …]
load: control token: 128123 '<|reserved_special_token_118|>' is not marked as EOG
load: special tokens cache size = 256
load: token to piece cache size = 0.8000 MB
print_info: arch             = llama
print_info: vocab_only       = 0
print_info: n_ctx_train      = 8192
print_info: n_embd           = 4096
print_info: n_layer          = 32
print_info: n_head           = 32
print_info: n_head_kv        = 8
print_info: n_rot            = 128
print_info: n_swa            = 0
print_info: n_swa_pattern    = 1
print_info: n_embd_head_k    = 128
print_info: n_embd_head_v    = 128
print_info: n_gqa            = 4
print_info: n_embd_k_gqa     = 1024
print_info: n_embd_v_gqa     = 1024
print_info: f_norm_eps       = 0.0e+00
print_info: f_norm_rms_eps   = 1.0e-05
print_info: f_clamp_kqv      = 0.0e+00
print_info: f_max_alibi_bias = 0.0e+00
print_info: f_logit_scale    = 0.0e+00
print_info: f_attn_scale     = 0.0e+00
print_info: n_ff             = 14336
print_info: n_expert         = 0
print_info: n_expert_used    = 0
print_info: causal attn      = 1
print_info: pooling type     = 0
print_info: rope type        = 0
print_info: rope scaling     = linear
print_info: freq_base_train  = 500000.0
print_info: freq_scale_train = 1
print_info: n_ctx_orig_yarn  = 8192
print_info: rope_finetuned   = unknown
print_info: ssm_d_conv       = 0
print_info: ssm_d_inner      = 0
print_info: ssm_d_state      = 0
print_info: ssm_dt_rank      = 0
print_info: ssm_dt_b_c_rms   = 0
print_info: model type       = 8B
print_info: model params     = 8.03 B
print_info: general.name     = Meta-Llama-3-8B-Instruct
print_info: vocab type       = BPE
print_info: n_vocab          = 128256
print_info: n_merges         = 280147
print_info: BOS token        = 128000 '<|begin_of_text|>'
print_info: EOS token        = 128009 '<|eot_id|>'
print_info: EOT token        = 128009 '<|eot_id|>'
print_info: LF token         = 198 'Ċ'
print_info: EOG token        = 128009 '<|eot_id|>'
print_info: max token length = 256
load_tensors: loading model tensors, this can take a while... (mmap = true)
load_tensors: layer 0 assigned to device CPU, is_swa = 0
[… layers 1 through 31 likewise assigned to device CPU …]
load_tensors: layer 32 assigned to device CPU, is_swa = 0
time=2025-08-02T22:45:52.090-07:00 level=INFO source=server.go:632 msg="waiting for server to become available" status="llm server loading model"
load_tensors: CPU_Mapped model buffer size = 4437.80 MiB
llama_context: constructing llama_context
llama_context: n_seq_max     = 1
llama_context: n_ctx         = 4096
llama_context: n_ctx_per_seq = 4096
llama_context: n_batch       = 512
llama_context: n_ubatch      = 512
llama_context: causal_attn   = 1
llama_context: flash_attn    = 0
llama_context: freq_base     = 500000.0
llama_context: freq_scale    = 1
llama_context: n_ctx_per_seq (4096) < n_ctx_train (8192) -- the full capacity of the model will not be utilized
set_abort_callback: call
llama_context: CPU output buffer size = 0.50 MiB
create_memory: n_ctx = 4096 (padded)
llama_kv_cache_unified: kv_size = 4096, type_k = 'f16', type_v = 'f16', n_layer = 32, can_shift = 1, padding = 32
llama_kv_cache_unified: layer 0: dev = CPU
[… layers 1 through 30 likewise on CPU …]
llama_kv_cache_unified: layer 31: dev = CPU
llama_kv_cache_unified: CPU KV buffer size = 512.00 MiB
llama_kv_cache_unified: KV self size = 512.00 MiB, K (f16): 256.00 MiB, V (f16): 256.00 MiB
llama_context: enumerating backends
llama_context: backend_ptrs.size() = 1
llama_context: max_nodes = 65536
llama_context: worst-case: n_tokens = 512, n_seqs = 1, n_outputs = 0
llama_context: reserving graph for n_tokens = 512, n_seqs = 1
llama_context: reserving graph for n_tokens = 1, n_seqs = 1
llama_context: reserving graph for n_tokens = 512, n_seqs = 1
llama_context: CPU compute buffer size = 296.01 MiB
llama_context: graph nodes  = 1094
llama_context: graph splits = 1
time=2025-08-02T22:45:52.341-07:00 level=INFO source=server.go:637 msg="llama runner started in 0.50 seconds"
time=2025-08-02T22:45:52.341-07:00 level=DEBUG source=sched.go:493 msg="finished setting up"
runner.name=registry.ollama.ai/library/llama3:8b runner.inference=cuda runner.devices=1 runner.size="5.4 GiB" runner.vram="5.4 GiB" runner.parallel=1 runner.pid=REDACTED runner.model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED runner.num_ctx=4096
[GIN] 2025/08/02 - 22:45:52 | 200 | 913.876018ms | 127.0.0.1 | POST "/api/generate"
time=2025-08-02T22:45:52.341-07:00 level=DEBUG source=sched.go:501 msg="context for request finished"
time=2025-08-02T22:45:52.341-07:00 level=DEBUG source=sched.go:341 msg="runner with non-zero duration has gone idle, adding timer" runner.name=registry.ollama.ai/library/llama3:8b runner.inference=cuda runner.devices=1 runner.size="5.4 GiB" runner.vram="5.4 GiB" runner.parallel=1 runner.pid=REDACTED runner.model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED runner.num_ctx=4096 duration=5m0s
time=2025-08-02T22:45:52.341-07:00 level=DEBUG source=sched.go:359 msg="after processing request finished event" runner.name=registry.ollama.ai/library/llama3:8b runner.inference=cuda runner.devices=1 runner.size="5.4 GiB" runner.vram="5.4 GiB" runner.parallel=1 runner.pid=REDACTED runner.model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED runner.num_ctx=4096 refCount=0
time=2025-08-02T22:46:01.318-07:00 level=DEBUG source=sched.go:613 msg="evaluating already loaded" model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED
time=2025-08-02T22:46:01.318-07:00 level=DEBUG source=server.go:736 msg="completion request" images=0 prompt=106 format=""
time=2025-08-02T22:46:01.318-07:00 level=DEBUG source=cache.go:104 msg="loading cache slot" id=0 cache=0 prompt=11 used=0 remaining=11
[GIN] 2025/08/02 - 22:46:04 | 200 | 3.625428917s | 127.0.0.1 | POST "/api/chat"
time=2025-08-02T22:46:04.918-07:00 level=DEBUG source=sched.go:432 msg="context for request finished" runner.name=registry.ollama.ai/library/llama3:8b runner.inference=cuda runner.devices=1 runner.size="5.4 GiB" runner.vram="5.4 GiB" runner.parallel=1 runner.pid=REDACTED runner.model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED runner.num_ctx=4096
time=2025-08-02T22:46:04.918-07:00 level=DEBUG source=sched.go:341 msg="runner with non-zero duration has gone idle, adding timer" runner.name=registry.ollama.ai/library/llama3:8b runner.inference=cuda runner.devices=1 runner.size="5.4 GiB" runner.vram="5.4 GiB" runner.parallel=1 runner.pid=REDACTED runner.model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED runner.num_ctx=4096 duration=5m0s
time=2025-08-02T22:46:04.918-07:00 level=DEBUG source=sched.go:359 msg="after processing request finished event" runner.name=registry.ollama.ai/library/llama3:8b runner.inference=cuda runner.devices=1 runner.size="5.4 GiB" runner.vram="5.4 GiB" runner.parallel=1 runner.pid=REDACTED runner.model=/home/USER-REDACTED/.ollama/models/blobs/sha256-REDACTED runner.num_ctx=4096 refCount=0
@rick-github commented on GitHub (Aug 3, 2025):

```
time=2025-08-02T22:45:51.845-07:00 level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=/usr/lib/ollama
load_backend: loaded CPU backend from /usr/lib/ollama/libggml-cpu-icelake.so
time=2025-08-02T22:45:51.847-07:00 level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.AVX512=1 CPU.0.AVX512_VBMI=1 CPU.0.AVX512_VNNI=1 CPU.0.AVX512_BF16=1 CPU.0.LLAMAFILE=1 CPU.1.SSE3=1 CPU.1.SSSE3=1 CPU.1.AVX=1 CPU.1.AVX2=1 CPU.1.F16C=1 CPU.1.FMA=1 CPU.1.BMI2=1 CPU.1.AVX512=1 CPU.1.AVX512_VBMI=1 CPU.1.AVX512_VNNI=1 CPU.1.AVX512_BF16=1 CPU.1.LLAMAFILE=1 compiler=cgo(gcc)
```

No GPU backends were found; only the CPU backend was loaded. The version string suggests this is a non-official build. Did you build and install the GPU libraries?
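For anyone hitting the same symptom, a quick way to see which backends a runner can load is to list the ggml libraries on disk. This is a minimal sketch, assuming the default Linux layout under `/usr/lib/ollama` and official library names (`libggml-cpu-*`, `libggml-cuda*`); adjust the directory (e.g. via `OLLAMA_LIB_DIR` here, a made-up variable for this script, not one Ollama reads) for other installs.

```shell
#!/bin/sh
# Check whether a CUDA ggml backend library is present in a given
# directory tree (official builds place it in e.g. cuda_v12/).
has_cuda_backend() {
    find "$1" -name 'libggml-cuda*' 2>/dev/null | grep -q .
}

# OLLAMA_LIB_DIR is just this script's override, not an Ollama env var.
dir="${OLLAMA_LIB_DIR:-/usr/lib/ollama}"
find "$dir" -name 'libggml-*' 2>/dev/null   # list all backends found

if has_cuda_backend "$dir"; then
    echo "CUDA backend present in $dir"
else
    echo "CUDA backend missing in $dir: inference will fall back to CPU"
fi
```

If only `libggml-cpu-*.so` files show up, that matches the log above: the server can still enumerate the GPU, but the runner has nothing to load for it.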

@dcmoore commented on GitHub (Aug 3, 2025):

I am running CachyOS, so I installed ollama using [yay](https://github.com/Jguer/yay) from the [AUR](https://aur.archlinux.org/packages/ollama-git).

Any idea why it would fail to find a GPU backend when it is able to detect the GPU hardware?

time=2025-08-02T22:45:45.117-07:00 level=INFO source=types.go:130 msg="inference compute" id=GPU-UUID-REDACTED library=cuda variant=v12 compute=12.0 driver=12.9 name="NVIDIA GeForce RTX 5080" total="15.4 GiB" available="14.4 GiB"

What is a backend defined as in this context? Is it another term for driver? It seems to find the driver just fine: `CUDA driver version: 12.9`.

@rick-github commented on GitHub (Aug 3, 2025):

Install `ollama-cuda`. The server detects the GPU, but the runner needs to load the device-specific backends (CUDA, CPU, ROCm) to actually use the device.
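On Arch-based distros the CUDA backend ships as a separate package, so the fix is roughly `sudo pacman -S ollama-cuda && sudo systemctl restart ollama` (package name per the Arch repos; use `ollama-rocm` for AMD). Afterwards you can confirm from the server log that the backend actually loads and layers are offloaded; a hedged sketch, where the grep patterns match the log format shown earlier in this thread and `ollama.log` is an example path (capture one with `journalctl -u ollama > ollama.log`):

```shell
#!/bin/sh
# Report whether a saved Ollama server log shows the CUDA backend
# loading and how many layers went to GPU vs CPU.
report_offload() {
    log="$1"
    if grep -q 'load_backend: loaded CUDA backend' "$log"; then
        echo "CUDA backend loaded"
    fi
    # With offload working, layers show up as "assigned to device CUDA0"
    # instead of "assigned to device CPU".
    echo "GPU layers: $(grep -c 'assigned to device CUDA' "$log")"
    echo "CPU layers: $(grep -c 'assigned to device CPU' "$log")"
}

# Example usage against a previously captured log:
if [ -f ollama.log ]; then
    report_offload ollama.log
fi
```

In the log posted above, all 33 layers land on CPU and no `load_backend: loaded CUDA backend` line appears, which is exactly the failure mode this script flags.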

@dcmoore commented on GitHub (Aug 3, 2025):

Ahh, dang. Sorry, I missed your previous conversation about that in https://github.com/ollama/ollama/issues/10075. Thanks for the tip. I'll give that a go.

@20246688 commented on GitHub (Aug 4, 2025):

> Install ollama-cuda. The server detects the GPU, but the runner needs to load the device-specific backends (CUDA, CPU, ROCm) to actually use the device.

thanks a lot

Reference: github-starred/ollama#7682