Mirror of https://github.com/ollama/ollama.git (synced 2026-05-07 00:22:43 -05:00)
Open · opened 2026-04-22 17:45:11 -05:00 by GiteaMirror · 4 comments
Reference: github-starred/ollama#34305
Originally created by @Zorgonatis on GitHub (Oct 31, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12895
Originally assigned to: @dhiltgen on GitHub.
What is the issue?
I just moved to 0.12.8 as I was getting the iGPU identity bug.
I have a 5090 and a 7900 XTX. My goal is to use only the 7900 XTX for Ollama while keeping the 5090 free for image gen, gaming, whatever.
https://docs.ollama.com/gpu#gpu-selection
Per the above, I am setting CUDA_VISIBLE_DEVICES to -1. This does indeed stop CUDA discovery, but it seems to cause a runner to time out after 30 seconds and subsequently not try to find other devices:
2025-10-31 23:48:28 - Starting Ollama server...
time=2025-10-31T23:48:28.681Z level=INFO source=routes.go:1524 msg="server config" env="map[CUDA_VISIBLE_DEVICES:-1 GGML_VK_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:16384 OLLAMA_DEBUG:INFO OLLAMA_FLASH_ATTENTION:true OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:1h0m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:C:\AI-Models\llms OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:4 OLLAMA_ORIGINS:[* http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://*] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES:1]"
time=2025-10-31T23:48:28.682Z level=INFO source=images.go:522 msg="total blobs: 4"
time=2025-10-31T23:48:28.682Z level=INFO source=images.go:529 msg="total unused blobs removed: 0"
time=2025-10-31T23:48:28.683Z level=INFO source=routes.go:1577 msg="Listening on [::]:11434 (version 0.12.8)"
time=2025-10-31T23:48:28.684Z level=INFO source=runner.go:76 msg="discovering available GPUs..."
time=2025-10-31T23:48:28.695Z level=INFO source=server.go:400 msg="starting runner" cmd="C:\Users\mswil\AppData\Local\Programs\Ollama\ollama.exe runner --ollama-engine --port 61114"
time=2025-10-31T23:48:28.814Z level=INFO source=server.go:400 msg="starting runner" cmd="C:\Users\mswil\AppData\Local\Programs\Ollama\ollama.exe runner --ollama-engine --port 61118"
time=2025-10-31T23:48:58.814Z level=INFO source=runner.go:498 msg="failure during GPU discovery" OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13]" extra_envs=map[] error="failed to finish discovery before timeout"
time=2025-10-31T23:48:58.822Z level=INFO source=server.go:400 msg="starting runner" cmd="C:\Users\mswil\AppData\Local\Programs\Ollama\ollama.exe runner --ollama-engine --port 61123"
time=2025-10-31T23:48:58.893Z level=INFO source=types.go:60 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" type="" total="93.6 GiB" available="62.6 GiB"
time=2025-10-31T23:48:58.893Z level=INFO source=routes.go:1618 msg="entering low vram mode" "total vram"="0 B" threshold="20.0 GiB"
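For clarity, the setup described above amounts to the following environment. This is a POSIX-style sketch, not the reporter's exact steps: on Windows these would be set as user/system environment variables before launching Ollama, and the ROCm device index is taken from the `ROCR_VISIBLE_DEVICES:1` entry in the server-config line of the log, so it may differ on other systems.

```shell
# Disable CUDA device discovery so the 5090 stays free for other workloads
# (-1 is not a valid device index, so the CUDA runtime reports no devices)
export CUDA_VISIBLE_DEVICES=-1

# Expose only the 7900 XTX to the ROCm runtime
# (index 1 matches the log above; device indices are system-specific)
export ROCR_VISIBLE_DEVICES=1

ollama serve
```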
OS
No response
GPU
No response
CPU
No response
Ollama version
No response
@dhiltgen commented on GitHub (Nov 4, 2025):
Can you set OLLAMA_DEBUG=2 and share your startup log so we can see more details on what's going wrong?
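For anyone following along, capturing that startup log could look like the sketch below (POSIX-style; on Windows, set `OLLAMA_DEBUG=2` in the environment and relaunch the app instead — the log file name here is arbitrary):

```shell
# Raise logging to debug/trace level, then restart the server
# and capture its stderr output to a file for sharing
export OLLAMA_DEBUG=2
ollama serve 2> ollama-debug.log
```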
@Zorgonatis commented on GitHub (Nov 4, 2025):
Sure thing, here's the output below. When running, it gets to "gs 0x2b" and stalls for ~30 seconds or so before continuing.
I recognise it's an unusual GPU config, so feel free to ask me to run any other tests you might have.
2025-11-04 17:48:09 - Starting Ollama server...
time=2025-11-04T17:48:09.258Z level=INFO source=routes.go:1524 msg="server config" env="map[CUDA_VISIBLE_DEVICES:-1 GGML_VK_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:16384 OLLAMA_DEBUG:DEBUG-4 OLLAMA_FLASH_ATTENTION:true OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:1h0m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:C:\AI-Models\llms OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:4 OLLAMA_ORIGINS:[* http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES:]"
time=2025-11-04T17:48:09.259Z level=INFO source=images.go:522 msg="total blobs: 4"
time=2025-11-04T17:48:09.260Z level=INFO source=images.go:529 msg="total unused blobs removed: 0"
time=2025-11-04T17:48:09.260Z level=INFO source=routes.go:1577 msg="Listening on [::]:11434 (version 0.12.8)"
time=2025-11-04T17:48:09.260Z level=DEBUG source=sched.go:120 msg="starting llm scheduler"
time=2025-11-04T17:48:09.261Z level=INFO source=runner.go:76 msg="discovering available GPUs..."
time=2025-11-04T17:48:09.261Z level=TRACE source=runner.go:474 msg="starting runner for device discovery" libDirs="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12]" extraEnvs=map[]
time=2025-11-04T17:48:09.272Z level=INFO source=server.go:400 msg="starting runner" cmd="C:\Users\mswil\AppData\Local\Programs\Ollama\ollama.exe runner --ollama-engine --port 52374"
time=2025-11-04T17:48:09.272Z level=DEBUG source=server.go:401 msg=subprocess CUDA_PATH="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8" CUDA_PATH_V12_9="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9" CUDA_VISIBLE_DEVICES=-1 HIP_PATH="C:\Program Files\AMD\ROCm\6.2" HIP_PATH_64="C:\Program Files\AMD\ROCm\6.4\" OLLAMA_CONTEXT_LENGTH=16384 OLLAMA_DEBUG=2 OLLAMA_FLASH_ATTENTION=true OLLAMA_HOST=0.0.0.0 OLLAMA_KEEP_ALIVE=1h OLLAMA_MODELS=C:\AI-Models\llms OLLAMA_NUM_PARALLEL=4 OLLAMA_ORIGINS= PATH="C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12;;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\libnvvp;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\libnvvp;C:\Program Files\Oculus\Support\oculus-runtime;c:\windows\system32;c:\windows;c:\windows\system32\wbem;c:\windows\system32\windowspowershell\v1.0\;c:\windows\system32\openssh\;c:\program files\microsoft vs code\bin;c:\program files\dotnet\;c:\program files\nvidia corporation\nvidia nvdlisr;C:\Program Files\PuTTY\;C:\Program Files\Microsoft VS Code\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\AMD\ROCm\6.2\bin;C:\Program Files\NVIDIA Corporation\NVIDIA App\NvDLISR;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files\Docker\Docker\resources\bin;C:\Program Files\Git\cmd;C:\Program Files\nodejs\;C:\Program Files\NVIDIA Corporation\Nsight Compute 2025.3.1\;C:\Program 
Files\cursor\resources\app\bin;C:\Users\mswil\miniconda3;C:\Users\mswil\miniconda3\Library\mingw-w64\bin;C:\Users\mswil\miniconda3\Library\usr\bin;C:\Users\mswil\miniconda3\Library\bin;C:\Users\mswil\miniconda3\Scripts;C:\Users\mswil\AppData\Local\Programs\Python\Python311\Scripts\;C:\Users\mswil\AppData\Local\Programs\Python\Python311\;C:\Users\mswil\AppData\Local\Microsoft\WindowsApps;C:\Program Files\AMD\ROCm\6.2\bin;C:\Users\mswil\AppData\Local\Programs\Ollama;C:\Users\mswil\.lmstudio\bin;C:\Users\mswil\AppData\Local\Microsoft\WinGet\Packages\Gyan.FFmpeg_Microsoft.Winget.Source_8wekyb3d8bbwe\ffmpeg-8.0-full_build\bin;C:\Users\mswil\AppData\Roaming\npm;C:\Users\mswil\AppData\Local\Programs\Microsoft VS Code\bin" OLLAMA_LIBRARY_PATH=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12
time=2025-11-04T17:48:09.298Z level=INFO source=runner.go:1349 msg="starting ollama engine"
time=2025-11-04T17:48:09.298Z level=INFO source=runner.go:1384 msg="Server listening on 127.0.0.1:52374"
time=2025-11-04T17:48:09.306Z level=DEBUG source=gguf.go:590 msg=general.architecture type=string
time=2025-11-04T17:48:09.306Z level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
time=2025-11-04T17:48:09.306Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:48:09.307Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:48:09.307Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
time=2025-11-04T17:48:09.307Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
time=2025-11-04T17:48:09.307Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
time=2025-11-04T17:48:09.307Z level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
time=2025-11-04T17:48:09.307Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama
load_backend: loaded CPU backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\ggml-cpu-icelake.dll
time=2025-11-04T17:48:09.318Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12
ggml_cuda_init: failed to initialize CUDA: no CUDA-capable device is detected
load_backend: loaded CUDA backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12\ggml-cuda.dll
time=2025-11-04T17:48:09.401Z level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.AVX512=1 CPU.0.AVX512_VBMI=1 CPU.0.AVX512_VNNI=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(clang)
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
time=2025-11-04T17:48:09.401Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
time=2025-11-04T17:48:09.401Z level=DEBUG source=runner.go:1324 msg="dummy model load took" duration=95.2382ms
time=2025-11-04T17:48:09.401Z level=DEBUG source=runner.go:1329 msg="gathering device infos took" duration=0s
time=2025-11-04T17:48:09.402Z level=TRACE source=runner.go:501 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12]" devices=[]
time=2025-11-04T17:48:09.402Z level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=140.6074ms OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12]" extra_envs=map[]
time=2025-11-04T17:48:09.402Z level=TRACE source=runner.go:474 msg="starting runner for device discovery" libDirs="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13]" extraEnvs=map[]
time=2025-11-04T17:48:09.403Z level=INFO source=server.go:400 msg="starting runner" cmd="C:\Users\mswil\AppData\Local\Programs\Ollama\ollama.exe runner --ollama-engine --port 52378"
time=2025-11-04T17:48:09.403Z level=DEBUG source=server.go:401 msg=subprocess CUDA_PATH="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8" CUDA_PATH_V12_9="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9" CUDA_VISIBLE_DEVICES=-1 HIP_PATH="C:\Program Files\AMD\ROCm\6.2" HIP_PATH_64="C:\Program Files\AMD\ROCm\6.4\" OLLAMA_CONTEXT_LENGTH=16384 OLLAMA_DEBUG=2 OLLAMA_FLASH_ATTENTION=true OLLAMA_HOST=0.0.0.0 OLLAMA_KEEP_ALIVE=1h OLLAMA_MODELS=C:\AI-Models\llms OLLAMA_NUM_PARALLEL=4 OLLAMA_ORIGINS=* PATH="C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13;;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\libnvvp;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\libnvvp;C:\Program Files\Oculus\Support\oculus-runtime;c:\windows\system32;c:\windows;c:\windows\system32\wbem;c:\windows\system32\windowspowershell\v1.0\;c:\windows\system32\openssh\;c:\program files\microsoft vs code\bin;c:\program files\dotnet\;c:\program files\nvidia corporation\nvidia nvdlisr;C:\Program Files\PuTTY\;C:\Program Files\Microsoft VS Code\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\AMD\ROCm\6.2\bin;C:\Program Files\NVIDIA Corporation\NVIDIA App\NvDLISR;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files\Docker\Docker\resources\bin;C:\Program Files\Git\cmd;C:\Program Files\nodejs\;C:\Program Files\NVIDIA Corporation\Nsight Compute 2025.3.1\;C:\Program 
Files\cursor\resources\app\bin;C:\Users\mswil\miniconda3;C:\Users\mswil\miniconda3\Library\mingw-w64\bin;C:\Users\mswil\miniconda3\Library\usr\bin;C:\Users\mswil\miniconda3\Library\bin;C:\Users\mswil\miniconda3\Scripts;C:\Users\mswil\AppData\Local\Programs\Python\Python311\Scripts\;C:\Users\mswil\AppData\Local\Programs\Python\Python311\;C:\Users\mswil\AppData\Local\Microsoft\WindowsApps;C:\Program Files\AMD\ROCm\6.2\bin;C:\Users\mswil\AppData\Local\Programs\Ollama;C:\Users\mswil\.lmstudio\bin;C:\Users\mswil\AppData\Local\Microsoft\WinGet\Packages\Gyan.FFmpeg_Microsoft.Winget.Source_8wekyb3d8bbwe\ffmpeg-8.0-full_build\bin;C:\Users\mswil\AppData\Roaming\npm;C:\Users\mswil\AppData\Local\Programs\Microsoft VS Code\bin" OLLAMA_LIBRARY_PATH=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13
time=2025-11-04T17:48:09.431Z level=INFO source=runner.go:1349 msg="starting ollama engine"
time=2025-11-04T17:48:09.432Z level=INFO source=runner.go:1384 msg="Server listening on 127.0.0.1:52378"
time=2025-11-04T17:48:09.437Z level=DEBUG source=gguf.go:590 msg=general.architecture type=string
time=2025-11-04T17:48:09.437Z level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
time=2025-11-04T17:48:09.437Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:48:09.437Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:48:09.437Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
time=2025-11-04T17:48:09.437Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
time=2025-11-04T17:48:09.437Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
time=2025-11-04T17:48:09.437Z level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
time=2025-11-04T17:48:09.437Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama
load_backend: loaded CPU backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\ggml-cpu-icelake.dll
time=2025-11-04T17:48:09.448Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13
Exception 0xc0000005 0x8 0x7ffa765bcb12 0x7ffa765bcb12
PC=0x7ffa765bcb12
signal arrived during external code execution
runtime.cgocall(0x7ff7b305f430, 0xc000048868)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/cgocall.go:167 +0x3e fp=0xc000048840 sp=0xc0000487d8 pc=0x7ff7b2322c7e
github.com/ollama/ollama/ml/backend/ggml/ggml/src._Cfunc_ggml_backend_load_all_from_path(0x1eba27c8de0)
_cgo_gotypes.go:199 +0x45 fp=0xc000048868 sp=0xc000048840 pc=0x7ff7b26e9985
github.com/ollama/ollama/ml/backend/ggml/ggml/src.init.func1.1({0xc00049a180, 0x40})
C:/a/ollama/ollama/ml/backend/ggml/ggml/src/ggml.go:97 +0xf5 fp=0xc000048900 sp=0xc000048868 pc=0x7ff7b26e93b5
github.com/ollama/ollama/ml/backend/ggml/ggml/src.init.func1()
C:/a/ollama/ollama/ml/backend/ggml/ggml/src/ggml.go:98 +0x505 fp=0xc000048b68 sp=0xc000048900 pc=0x7ff7b26e9205
github.com/ollama/ollama/ml/backend/ggml/ggml/src.init.OnceFunc.func2()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/sync/oncefunc.go:27 +0x62 fp=0xc000048bb0 sp=0xc000048b68 pc=0x7ff7b26e8c22
sync.(*Once).doSlow(0x7ff7b37f6760?, 0x7ff7b41f8f00?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/sync/once.go:78 +0xab fp=0xc000048c08 sp=0xc000048bb0 pc=0x7ff7b233a36b
sync.(*Once).Do(0x0?, 0xc000048cb0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/sync/once.go:69 +0x19 fp=0xc000048c28 sp=0xc000048c08 pc=0x7ff7b233a299
github.com/ollama/ollama/ml/backend/ggml/ggml/src.init.OnceFunc.func3()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/sync/oncefunc.go:32 +0x2d fp=0xc000048c58 sp=0xc000048c28 pc=0x7ff7b26e8b8d
github.com/ollama/ollama/ml/backend/ggml.init.func1()
C:/a/ollama/ollama/ml/backend/ggml/ggml.go:48 +0x23 fp=0xc000048ce8 sp=0xc000048c58 pc=0x7ff7b276c403
github.com/ollama/ollama/ml/backend/ggml.init.OnceFunc.func2()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/sync/oncefunc.go:27 +0x62 fp=0xc000048d30 sp=0xc000048ce8 pc=0x7ff7b276c302
sync.(*Once).doSlow(0x17ff7b37ec2a0?, 0xc00058c028?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/sync/once.go:78 +0xab fp=0xc000048d88 sp=0xc000048d30 pc=0x7ff7b233a36b
sync.(*Once).Do(0x7ff7b233a420?, 0x7ff7b41f949c?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/sync/once.go:69 +0x19 fp=0xc000048da8 sp=0xc000048d88 pc=0x7ff7b233a299
github.com/ollama/ollama/ml/backend/ggml.init.OnceFunc.func3()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/sync/oncefunc.go:32 +0x2d fp=0xc000048dd8 sp=0xc000048da8 pc=0x7ff7b276c26d
github.com/ollama/ollama/ml/backend/ggml.New({0xc0003dd4d0, 0x30}, {0x0, 0x10, {0xc000188180, 0x1, 0x1}, 0x0})
C:/a/ollama/ollama/ml/backend/ggml/ggml.go:147 +0x124 fp=0xc000049670 sp=0xc000048dd8 pc=0x7ff7b2776444
github.com/ollama/ollama/ml.NewBackend({0xc0003dd4d0, 0x30}, {0x0, 0x10, {0xc000188180, 0x1, 0x1}, 0x0})
C:/a/ollama/ollama/ml/backend.go:92 +0x9c fp=0xc0000496c0 sp=0xc000049670 pc=0x7ff7b26eb65c
github.com/ollama/ollama/model.New({0xc0003dd4d0?, 0xc000049980?}, {0x0, 0x10, {0xc000188180, 0x1, 0x1}, 0x0})
C:/a/ollama/ollama/model/model.go:106 +0x7e fp=0xc000049790 sp=0xc0000496c0 pc=0x7ff7b278c01e
github.com/ollama/ollama/runner/ollamarunner.(*Server).info(0xc000180f00, {0x7ff7b37f4410, 0xc000000000}, 0xc00058e070?)
C:/a/ollama/ollama/runner/ollamarunner/runner.go:1319 +0x53e fp=0xc000049ac0 sp=0xc000049790 pc=0x7ff7b2852efe
github.com/ollama/ollama/runner/ollamarunner.(*Server).info-fm({0x7ff7b37f4410?, 0xc000000000?}, 0xc000049b38?)
<autogenerated>:1 +0x36 fp=0xc000049af0 sp=0xc000049ac0 pc=0x7ff7b28543d6
net/http.HandlerFunc.ServeHTTP(0xc000473500?, {0x7ff7b37f4410?, 0xc000000000?}, 0x7ff7b262c9b6?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/http/server.go:2294 +0x29 fp=0xc000049b18 sp=0xc000049af0 pc=0x7ff7b2634749
net/http.(*ServeMux).ServeHTTP(0x7ff7b2322b59?, {0x7ff7b37f4410, 0xc000000000}, 0xc000466000)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/http/server.go:2822 +0x1c4 fp=0xc000049b68 sp=0xc000049b18 pc=0x7ff7b2636644
net/http.serverHandler.ServeHTTP({0xc00003e510?}, {0x7ff7b37f4410?, 0xc000000000?}, 0x1?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/http/server.go:3301 +0x8e fp=0xc000049b98 sp=0xc000049b68 pc=0x7ff7b26540ce
net/http.(*conn).serve(0xc0006c23f0, {0x7ff7b37f6798, 0xc0005950b0})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/http/server.go:2102 +0x625 fp=0xc000049fb8 sp=0xc000049b98 pc=0x7ff7b2632c45
net/http.(*Server).Serve.gowrap3()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/http/server.go:3454 +0x28 fp=0xc000049fe0 sp=0xc000049fb8 pc=0x7ff7b2638508
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000049fe8 sp=0xc000049fe0 pc=0x7ff7b232d8e1
created by net/http.(*Server).Serve in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/http/server.go:3454 +0x485
goroutine 1 gp=0xc0000021c0 m=nil [IO wait]:
runtime.gopark(0x7ff7b232f0e0?, 0x7ff7b4185a60?, 0x20?, 0x34?, 0xc0006b34cc?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc00050f648 sp=0xc00050f628 pc=0x7ff7b23261ce
runtime.netpollblock(0x24c?, 0xb22c0406?, 0xf7?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/netpoll.go:575 +0xf7 fp=0xc00050f680 sp=0xc00050f648 pc=0x7ff7b22ebdf7
internal/poll.runtime_pollWait(0x1ebe7d4c3f0, 0x72)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/netpoll.go:351 +0x85 fp=0xc00050f6a0 sp=0xc00050f680 pc=0x7ff7b2325365
internal/poll.(*pollDesc).wait(0x7ff7b23b9e73?, 0x0?, 0x0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc00050f6c8 sp=0xc00050f6a0 pc=0x7ff7b23bb467
internal/poll.execIO(0xc0006b3420, 0xc00050f770)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/internal/poll/fd_windows.go:177 +0x105 fp=0xc00050f740 sp=0xc00050f6c8 pc=0x7ff7b23bc8c5
internal/poll.(*FD).acceptOne(0xc0006b3408, 0x260, {0xc0006e40f0?, 0xc00050f7d0?, 0x7ff7b23c4585?}, 0xc00050f804?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/internal/poll/fd_windows.go:946 +0x65 fp=0xc00050f7a0 sp=0xc00050f740 pc=0x7ff7b23c0e45
internal/poll.(*FD).Accept(0xc0006b3408, 0xc00050f950)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/internal/poll/fd_windows.go:980 +0x1b6 fp=0xc00050f858 sp=0xc00050f7a0 pc=0x7ff7b23c1176
net.(*netFD).accept(0xc0006b3408)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/fd_windows.go:182 +0x4b fp=0xc00050f970 sp=0xc00050f858 pc=0x7ff7b243264b
net.(*TCPListener).accept(0xc0002be3c0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/tcpsock_posix.go:159 +0x1b fp=0xc00050f9c0 sp=0xc00050f970 pc=0x7ff7b244869b
net.(*TCPListener).Accept(0xc0002be3c0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/tcpsock.go:380 +0x30 fp=0xc00050f9f0 sp=0xc00050f9c0 pc=0x7ff7b2447450
net/http.(*onceCloseListener).Accept(0xc0006c23f0?)
<autogenerated>:1 +0x24 fp=0xc00050fa08 sp=0xc00050f9f0 pc=0x7ff7b2660844
net/http.(*Server).Serve(0xc00048ee00, {0x7ff7b37f4260, 0xc0002be3c0})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/http/server.go:3424 +0x30c fp=0xc00050fb38 sp=0xc00050fa08 pc=0x7ff7b263810c
github.com/ollama/ollama/runner/ollamarunner.Execute({0xc0000500b0, 0x2, 0x5})
C:/a/ollama/ollama/runner/ollamarunner/runner.go:1385 +0x94e fp=0xc00050fd08 sp=0xc00050fb38 pc=0x7ff7b2853dee
github.com/ollama/ollama/runner.Execute({0xc000050090?, 0x0?, 0x0?})
C:/a/ollama/ollama/runner/runner.go:20 +0xc9 fp=0xc00050fd30 sp=0xc00050fd08 pc=0x7ff7b28546e9
github.com/ollama/ollama/cmd.NewCLI.func2(0xc00048ec00?, {0x7ff7b360efb9?, 0x4?, 0x7ff7b360efbd?})
C:/a/ollama/ollama/cmd/cmd.go:1774 +0x45 fp=0xc00050fd58 sp=0xc00050fd30 pc=0x7ff7b2fde425
github.com/spf13/cobra.(*Command).execute(0xc0006c5508, {0xc000594bd0, 0x3, 0x3})
C:/Users/runneradmin/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:940 +0x85c fp=0xc00050fe78 sp=0xc00050fd58 pc=0x7ff7b24ad11c
github.com/spf13/cobra.(*Command).ExecuteC(0xc0005a5208)
C:/Users/runneradmin/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:1068 +0x3a5 fp=0xc00050ff30 sp=0xc00050fe78 pc=0x7ff7b24ad965
github.com/spf13/cobra.(*Command).Execute(...)
C:/Users/runneradmin/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:992
github.com/spf13/cobra.(*Command).ExecuteContext(...)
C:/Users/runneradmin/go/pkg/mod/github.com/spf13/cobra@v1.7.0/command.go:985
main.main()
C:/a/ollama/ollama/main.go:12 +0x4d fp=0xc00050ff50 sp=0xc00050ff30 pc=0x7ff7b2fdeeed
runtime.main()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:283 +0x27d fp=0xc00050ffe0 sp=0xc00050ff50 pc=0x7ff7b22f4ddd
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc00050ffe8 sp=0xc00050ffe0 pc=0x7ff7b232d8e1
goroutine 2 gp=0xc0000028c0 m=nil [force gc (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc000081fa8 sp=0xc000081f88 pc=0x7ff7b23261ce
runtime.goparkunlock(...)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:441
runtime.forcegchelper()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:348 +0xb8 fp=0xc000081fe0 sp=0xc000081fa8 pc=0x7ff7b22f50f8
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000081fe8 sp=0xc000081fe0 pc=0x7ff7b232d8e1
created by runtime.init.7 in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:336 +0x1a
goroutine 3 gp=0xc000002c40 m=nil [GC sweep wait]:
runtime.gopark(0x1?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc000083f80 sp=0xc000083f60 pc=0x7ff7b23261ce
runtime.goparkunlock(...)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:441
runtime.bgsweep(0xc00008c000)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgcsweep.go:316 +0xdf fp=0xc000083fc8 sp=0xc000083f80 pc=0x7ff7b22ddebf
runtime.gcenable.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:204 +0x25 fp=0xc000083fe0 sp=0xc000083fc8 pc=0x7ff7b22d2285
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000083fe8 sp=0xc000083fe0 pc=0x7ff7b232d8e1
created by runtime.gcenable in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:204 +0x66
goroutine 4 gp=0xc000002e00 m=nil [GC scavenge wait]:
runtime.gopark(0x10000?, 0x7ff7b37e0af8?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc000093f78 sp=0xc000093f58 pc=0x7ff7b23261ce
runtime.goparkunlock(...)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:441
runtime.(*scavengerState).park(0x7ff7b41ac3a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgcscavenge.go:425 +0x49 fp=0xc000093fa8 sp=0xc000093f78 pc=0x7ff7b22db909
runtime.bgscavenge(0xc00008c000)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgcscavenge.go:658 +0x59 fp=0xc000093fc8 sp=0xc000093fa8 pc=0x7ff7b22dbe99
runtime.gcenable.gowrap2()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:205 +0x25 fp=0xc000093fe0 sp=0xc000093fc8 pc=0x7ff7b22d2225
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000093fe8 sp=0xc000093fe0 pc=0x7ff7b232d8e1
created by runtime.gcenable in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:205 +0xa5
goroutine 5 gp=0xc000003340 m=nil [finalizer wait]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc000095e30 sp=0xc000095e10 pc=0x7ff7b23261ce
runtime.runfinq()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mfinal.go:196 +0x107 fp=0xc000095fe0 sp=0xc000095e30 pc=0x7ff7b22d1207
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000095fe8 sp=0xc000095fe0 pc=0x7ff7b232d8e1
created by runtime.createfing in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mfinal.go:166 +0x3d
goroutine 6 gp=0xc000003dc0 m=nil [chan receive]:
runtime.gopark(0xc0000e1860?, 0xc000009170?, 0x60?, 0x5f?, 0x7ff7b241b588?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc000085f18 sp=0xc000085ef8 pc=0x7ff7b23261ce
runtime.chanrecv(0xc00009a380, 0x0, 0x1)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/chan.go:664 +0x445 fp=0xc000085f90 sp=0xc000085f18 pc=0x7ff7b22c2d45
runtime.chanrecv1(0x7ff7b22f4f40?, 0xc000085f76?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/chan.go:506 +0x12 fp=0xc000085fb8 sp=0xc000085f90 pc=0x7ff7b22c28d2
runtime.unique_runtime_registerUniqueMapCleanup.func2(...)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1796
runtime.unique_runtime_registerUniqueMapCleanup.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1799 +0x2f fp=0xc000085fe0 sp=0xc000085fb8 pc=0x7ff7b22d54af
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000085fe8 sp=0xc000085fe0 pc=0x7ff7b232d8e1
created by unique.runtime_registerUniqueMapCleanup in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1794 +0x85
goroutine 7 gp=0xc0003f6380 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc00008ff38 sp=0xc00008ff18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc00008ffc8 sp=0xc00008ff38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc00008ffe0 sp=0xc00008ffc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc00008ffe8 sp=0xc00008ffe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 18 gp=0xc0002081c0 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc000213f38 sp=0xc000213f18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc000213fc8 sp=0xc000213f38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc000213fe0 sp=0xc000213fc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000213fe8 sp=0xc000213fe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 19 gp=0xc000208380 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc000215f38 sp=0xc000215f18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc000215fc8 sp=0xc000215f38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc000215fe0 sp=0xc000215fc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000215fe8 sp=0xc000215fe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 34 gp=0xc000484000 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc00020ff38 sp=0xc00020ff18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc00020ffc8 sp=0xc00020ff38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc00020ffe0 sp=0xc00020ffc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc00020ffe8 sp=0xc00020ffe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 8 gp=0xc0003f6540 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc000091f38 sp=0xc000091f18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc000091fc8 sp=0xc000091f38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc000091fe0 sp=0xc000091fc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000091fe8 sp=0xc000091fe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 20 gp=0xc000208540 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc00021df38 sp=0xc00021df18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc00021dfc8 sp=0xc00021df38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc00021dfe0 sp=0xc00021dfc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc00021dfe8 sp=0xc00021dfe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 35 gp=0xc0004841c0 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc000211f38 sp=0xc000211f18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc000211fc8 sp=0xc000211f38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc000211fe0 sp=0xc000211fc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000211fe8 sp=0xc000211fe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 9 gp=0xc0003f6700 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc000219f38 sp=0xc000219f18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc000219fc8 sp=0xc000219f38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc000219fe0 sp=0xc000219fc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000219fe8 sp=0xc000219fe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 21 gp=0xc000208700 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc00021ff38 sp=0xc00021ff18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc00021ffc8 sp=0xc00021ff38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc00021ffe0 sp=0xc00021ffc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc00021ffe8 sp=0xc00021ffe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 36 gp=0xc000484380 m=nil [GC worker (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc00048bf38 sp=0xc00048bf18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc00048bfc8 sp=0xc00048bf38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc00048bfe0 sp=0xc00048bfc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc00048bfe8 sp=0xc00048bfe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 10 gp=0xc0003f68c0 m=nil [GC worker (idle)]:
runtime.gopark(0x7ff7b41faf80?, 0x1?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc00021bf38 sp=0xc00021bf18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc00021bfc8 sp=0xc00021bf38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc00021bfe0 sp=0xc00021bfc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc00021bfe8 sp=0xc00021bfe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 22 gp=0xc0002088c0 m=nil [GC worker (idle)]:
runtime.gopark(0x12473d45970?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc000487f38 sp=0xc000487f18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc000487fc8 sp=0xc000487f38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc000487fe0 sp=0xc000487fc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000487fe8 sp=0xc000487fe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 37 gp=0xc000484540 m=nil [GC worker (idle)]:
runtime.gopark(0x12473d45970?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc00048df38 sp=0xc00048df18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc00048dfc8 sp=0xc00048df38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc00048dfe0 sp=0xc00048dfc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc00048dfe8 sp=0xc00048dfe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 11 gp=0xc0003f6a80 m=nil [GC worker (idle)]:
runtime.gopark(0x12473cca8b0?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc00047bf38 sp=0xc00047bf18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc00047bfc8 sp=0xc00047bf38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc00047bfe0 sp=0xc00047bfc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc00047bfe8 sp=0xc00047bfe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 12 gp=0xc0003f6c40 m=nil [GC worker (idle)]:
runtime.gopark(0x12473d45970?, 0x1?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc00047df38 sp=0xc00047df18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc00047dfc8 sp=0xc00047df38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc00047dfe0 sp=0xc00047dfc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc00047dfe8 sp=0xc00047dfe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 23 gp=0xc000208a80 m=nil [GC worker (idle)]:
runtime.gopark(0x12473d45970?, 0x0?, 0x0?, 0x0?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc000489f38 sp=0xc000489f18 pc=0x7ff7b23261ce
runtime.gcBgMarkWorker(0xc00009b7a0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1423 +0xe9 fp=0xc000489fc8 sp=0xc000489f38 pc=0x7ff7b22d47a9
runtime.gcBgMarkStartWorkers.gowrap1()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x25 fp=0xc000489fe0 sp=0xc000489fc8 pc=0x7ff7b22d4685
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000489fe8 sp=0xc000489fe0 pc=0x7ff7b232d8e1
created by runtime.gcBgMarkStartWorkers in goroutine 1
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/mgc.go:1339 +0x105
goroutine 13 gp=0xc000209500 m=nil [sync.WaitGroup.Wait]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x40?, 0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc000477a90 sp=0xc000477a70 pc=0x7ff7b23261ce
runtime.goparkunlock(...)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:441
runtime.semacquire1(0xc000180fb8, 0x0, 0x1, 0x0, 0x18)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/sema.go:188 +0x22f fp=0xc000477af8 sp=0xc000477a90 pc=0x7ff7b230750f
sync.runtime_SemacquireWaitGroup(0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/sema.go:110 +0x25 fp=0xc000477b30 sp=0xc000477af8 pc=0x7ff7b23277c5
sync.(*WaitGroup).Wait(0x0?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/sync/waitgroup.go:118 +0x48 fp=0xc000477b58 sp=0xc000477b30 pc=0x7ff7b233b7a8
github.com/ollama/ollama/runner/ollamarunner.(*Server).run(0xc000180f00, {0x7ff7b37f67d0, 0xc0006a1e50})
C:/a/ollama/ollama/runner/ollamarunner/runner.go:413 +0x45 fp=0xc000477fb8 sp=0xc000477b58 pc=0x7ff7b284ac45
github.com/ollama/ollama/runner/ollamarunner.Execute.gowrap1()
C:/a/ollama/ollama/runner/ollamarunner/runner.go:1362 +0x28 fp=0xc000477fe0 sp=0xc000477fb8 pc=0x7ff7b2854068
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000477fe8 sp=0xc000477fe0 pc=0x7ff7b232d8e1
created by github.com/ollama/ollama/runner/ollamarunner.Execute in goroutine 1
C:/a/ollama/ollama/runner/ollamarunner/runner.go:1362 +0x4c9
goroutine 50 gp=0xc0004848c0 m=nil [IO wait]:
runtime.gopark(0x0?, 0xc0006b36a0?, 0x48?, 0x37?, 0xc0006b374c?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/proc.go:435 +0xce fp=0xc000479d58 sp=0xc000479d38 pc=0x7ff7b23261ce
runtime.netpollblock(0x254?, 0xb22c0406?, 0xf7?)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/netpoll.go:575 +0xf7 fp=0xc000479d90 sp=0xc000479d58 pc=0x7ff7b22ebdf7
internal/poll.runtime_pollWait(0x1ebe7d4c2d8, 0x72)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/netpoll.go:351 +0x85 fp=0xc000479db0 sp=0xc000479d90 pc=0x7ff7b2325365
internal/poll.(*pollDesc).wait(0x0?, 0x0?, 0x0)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/internal/poll/fd_poll_runtime.go:84 +0x27 fp=0xc000479dd8 sp=0xc000479db0 pc=0x7ff7b23bb467
internal/poll.execIO(0xc0006b36a0, 0x7ff7b3685c58)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/internal/poll/fd_windows.go:177 +0x105 fp=0xc000479e50 sp=0xc000479dd8 pc=0x7ff7b23bc8c5
internal/poll.(*FD).Read(0xc0006b3688, {0xc00003e521, 0x1, 0x1})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/internal/poll/fd_windows.go:438 +0x29b fp=0xc000479ef0 sp=0xc000479e50 pc=0x7ff7b23bd59b
net.(*netFD).Read(0xc0006b3688, {0xc00003e521?, 0x0?, 0x0?})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/fd_posix.go:55 +0x25 fp=0xc000479f38 sp=0xc000479ef0 pc=0x7ff7b2430765
net.(*conn).Read(0xc00007c8f8, {0xc00003e521?, 0x0?, 0x0?})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/net.go:194 +0x45 fp=0xc000479f80 sp=0xc000479f38 pc=0x7ff7b243fc45
net/http.(*connReader).backgroundRead(0xc00003e510)
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/http/server.go:690 +0x37 fp=0xc000479fc8 sp=0xc000479f80 pc=0x7ff7b262cb17
net/http.(*connReader).startBackgroundRead.gowrap2()
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/http/server.go:686 +0x25 fp=0xc000479fe0 sp=0xc000479fc8 pc=0x7ff7b262ca45
runtime.goexit({})
C:/hostedtoolcache/windows/go/1.24.0/x64/src/runtime/asm_amd64.s:1700 +0x1 fp=0xc000479fe8 sp=0xc000479fe0 pc=0x7ff7b232d8e1
created by net/http.(*connReader).startBackgroundRead in goroutine 14
C:/hostedtoolcache/windows/go/1.24.0/x64/src/net/http/server.go:686 +0xb6
rax 0x64
rbx 0x7ffa7668098c
rcx 0x88e2dbfe253d0000
rdx 0x1eba24c02c0
rdi 0x1ebea140860
rsi 0x0
rbp 0x73ceafd979
rsp 0x73ceafd410
r8 0x7ffffffffffffffc
r9 0x73cdfa7000
r10 0x80fcf8fefcfefefe
r11 0x1ebe8186fe0
r12 0x1eba279c018
r13 0x0
r14 0x0
r15 0x73ceafe6d0
rip 0x7ffa765bcb12
rflags 0x10202
cs 0x33
fs 0x53
gs 0x2b
time=2025-11-04T17:48:39.402Z level=INFO source=runner.go:498 msg="failure during GPU discovery" OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13]" extra_envs=map[] error="failed to finish discovery before timeout"
time=2025-11-04T17:48:39.403Z level=TRACE source=runner.go:501 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13]" devices=[]
time=2025-11-04T17:48:39.403Z level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=30.0009178s OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13]" extra_envs=map[]
time=2025-11-04T17:48:39.403Z level=TRACE source=runner.go:474 msg="starting runner for device discovery" libDirs="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]" extraEnvs=map[]
time=2025-11-04T17:48:39.411Z level=INFO source=server.go:400 msg="starting runner" cmd="C:\Users\mswil\AppData\Local\Programs\Ollama\ollama.exe runner --ollama-engine --port 52398"
time=2025-11-04T17:48:39.411Z level=DEBUG source=server.go:401 msg=subprocess CUDA_PATH="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8" CUDA_PATH_V12_9="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9" CUDA_VISIBLE_DEVICES=-1 HIP_PATH="C:\Program Files\AMD\ROCm\6.2" HIP_PATH_64="C:\Program Files\AMD\ROCm\6.4\" OLLAMA_CONTEXT_LENGTH=16384 OLLAMA_DEBUG=2 OLLAMA_FLASH_ATTENTION=true OLLAMA_HOST=0.0.0.0 OLLAMA_KEEP_ALIVE=1h OLLAMA_MODELS=C:\AI-Models\llms OLLAMA_NUM_PARALLEL=4 OLLAMA_ORIGINS= PATH="C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm;;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\libnvvp;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\libnvvp;C:\Program Files\Oculus\Support\oculus-runtime;c:\windows\system32;c:\windows;c:\windows\system32\wbem;c:\windows\system32\windowspowershell\v1.0\;c:\windows\system32\openssh\;c:\program files\microsoft vs code\bin;c:\program files\dotnet\;c:\program files\nvidia corporation\nvidia nvdlisr;C:\Program Files\PuTTY\;C:\Program Files\Microsoft VS Code\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\AMD\ROCm\6.2\bin;C:\Program Files\NVIDIA Corporation\NVIDIA App\NvDLISR;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files\Docker\Docker\resources\bin;C:\Program Files\Git\cmd;C:\Program Files\nodejs\;C:\Program Files\NVIDIA Corporation\Nsight Compute 2025.3.1\;C:\Program 
Files\cursor\resources\app\bin;C:\Users\mswil\miniconda3;C:\Users\mswil\miniconda3\Library\mingw-w64\bin;C:\Users\mswil\miniconda3\Library\usr\bin;C:\Users\mswil\miniconda3\Library\bin;C:\Users\mswil\miniconda3\Scripts;C:\Users\mswil\AppData\Local\Programs\Python\Python311\Scripts\;C:\Users\mswil\AppData\Local\Programs\Python\Python311\;C:\Users\mswil\AppData\Local\Microsoft\WindowsApps;C:\Program Files\AMD\ROCm\6.2\bin;C:\Users\mswil\AppData\Local\Programs\Ollama;C:\Users\mswil\.lmstudio\bin;C:\Users\mswil\AppData\Local\Microsoft\WinGet\Packages\Gyan.FFmpeg_Microsoft.Winget.Source_8wekyb3d8bbwe\ffmpeg-8.0-full_build\bin;C:\Users\mswil\AppData\Roaming\npm;C:\Users\mswil\AppData\Local\Programs\Microsoft VS Code\bin" OLLAMA_LIBRARY_PATH=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm
time=2025-11-04T17:48:39.441Z level=INFO source=runner.go:1349 msg="starting ollama engine"
time=2025-11-04T17:48:39.442Z level=INFO source=runner.go:1384 msg="Server listening on 127.0.0.1:52398"
time=2025-11-04T17:48:39.444Z level=DEBUG source=gguf.go:590 msg=general.architecture type=string
time=2025-11-04T17:48:39.444Z level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
time=2025-11-04T17:48:39.444Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:48:39.445Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:48:39.445Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
time=2025-11-04T17:48:39.445Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
time=2025-11-04T17:48:39.445Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
time=2025-11-04T17:48:39.445Z level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
time=2025-11-04T17:48:39.445Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama
load_backend: loaded CPU backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\ggml-cpu-icelake.dll
time=2025-11-04T17:48:39.457Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm
ggml_cuda_init: failed to initialize ROCm: no ROCm-capable device is detected
load_backend: loaded ROCm backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm\ggml-hip.dll
time=2025-11-04T17:48:39.489Z level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.AVX512=1 CPU.0.AVX512_VBMI=1 CPU.0.AVX512_VNNI=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 compiler=cgo(clang)
time=2025-11-04T17:48:39.489Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
time=2025-11-04T17:48:39.490Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
time=2025-11-04T17:48:39.490Z level=DEBUG source=runner.go:1324 msg="dummy model load took" duration=46.0572ms
time=2025-11-04T17:48:39.490Z level=DEBUG source=runner.go:1329 msg="gathering device infos took" duration=0s
time=2025-11-04T17:48:39.490Z level=TRACE source=runner.go:501 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]" devices=[]
time=2025-11-04T17:48:39.490Z level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=87.5677ms OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]" extra_envs=map[]
time=2025-11-04T17:48:39.490Z level=DEBUG source=runner.go:120 msg="evluating which if any devices to filter out" initial_count=0
time=2025-11-04T17:48:39.490Z level=TRACE source=runner.go:179 msg="supported GPU library combinations before filtering" supported=map[]
time=2025-11-04T17:48:39.490Z level=DEBUG source=runner.go:41 msg="GPU bootstrap discovery took" duration=30.2300937s
time=2025-11-04T17:48:39.490Z level=INFO source=types.go:60 msg="inference compute" id=cpu library=cpu compute="" name=cpu description=cpu libdirs=ollama driver="" pci_id="" type="" total="93.6 GiB" available="76.4 GiB"
time=2025-11-04T17:48:39.490Z level=INFO source=routes.go:1618 msg="entering low vram mode" "total vram"="0 B" threshold="20.0 GiB"
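For anyone trying to reproduce the trace output above: the relevant settings all appear in the `msg=subprocess` log lines. This is a sketch for PowerShell, assuming the default per-user install path; `CUDA_VISIBLE_DEVICES=-1` is the flag that masks the NVIDIA card so only the ROCm/CPU backends are probed, and `OLLAMA_DEBUG=2` produces the DEBUG/TRACE-level logging seen here.

```shell
# Enable trace-level logging, matching OLLAMA_DEBUG=2 in the subprocess env above
$env:OLLAMA_DEBUG = "2"
# Hide all CUDA devices so discovery falls through to ROCm (and then CPU)
$env:CUDA_VISIBLE_DEVICES = "-1"
# Start the server from the default install location (adjust if installed elsewhere)
& "$env:LOCALAPPDATA\Programs\Ollama\ollama.exe" serve
```

Unsetting `CUDA_VISIBLE_DEVICES` (or removing the line) restores the normal CUDA discovery path shown in the next log excerpt.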
@Zorgonatis commented on GitHub (Nov 4, 2025):
This may help too: a startup without the flag to disable CUDA, showing the 7900XTX discovery:
time=2025-11-04T17:53:48.513Z level=INFO source=routes.go:1524 msg="server config" env="map[CUDA_VISIBLE_DEVICES: GGML_VK_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_CONTEXT_LENGTH:16384 OLLAMA_DEBUG:DEBUG-4 OLLAMA_FLASH_ATTENTION:true OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://0.0.0.0:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:1h0m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:C:\AI-Models\llms OLLAMA_MULTIUSER_CACHE:false OLLAMA_NEW_ENGINE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:4 OLLAMA_ORIGINS:[* http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://* vscode-file://] OLLAMA_REMOTES:[ollama.com] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES:]"
time=2025-11-04T17:53:48.514Z level=INFO source=images.go:522 msg="total blobs: 4"
time=2025-11-04T17:53:48.514Z level=INFO source=images.go:529 msg="total unused blobs removed: 0"
time=2025-11-04T17:53:48.515Z level=INFO source=routes.go:1577 msg="Listening on [::]:11434 (version 0.12.8)"
time=2025-11-04T17:53:48.515Z level=DEBUG source=sched.go:120 msg="starting llm scheduler"
time=2025-11-04T17:53:48.516Z level=INFO source=runner.go:76 msg="discovering available GPUs..."
time=2025-11-04T17:53:48.516Z level=TRACE source=runner.go:474 msg="starting runner for device discovery" libDirs="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12]" extraEnvs=map[]
time=2025-11-04T17:53:48.527Z level=INFO source=server.go:400 msg="starting runner" cmd="C:\Users\mswil\AppData\Local\Programs\Ollama\ollama.exe runner --ollama-engine --port 52532"
time=2025-11-04T17:53:48.527Z level=DEBUG source=server.go:401 msg=subprocess CUDA_PATH="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8" CUDA_PATH_V12_9="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9" HIP_PATH="C:\Program Files\AMD\ROCm\6.2" HIP_PATH_64="C:\Program Files\AMD\ROCm\6.4\" OLLAMA_CONTEXT_LENGTH=16384 OLLAMA_DEBUG=2 OLLAMA_FLASH_ATTENTION=true OLLAMA_HOST=0.0.0.0 OLLAMA_KEEP_ALIVE=1h OLLAMA_MODELS=C:\AI-Models\llms OLLAMA_NUM_PARALLEL=4 OLLAMA_ORIGINS= PATH="C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12;;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\libnvvp;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\libnvvp;C:\Program Files\Oculus\Support\oculus-runtime;c:\windows\system32;c:\windows;c:\windows\system32\wbem;c:\windows\system32\windowspowershell\v1.0\;c:\windows\system32\openssh\;c:\program files\microsoft vs code\bin;c:\program files\dotnet\;c:\program files\nvidia corporation\nvidia nvdlisr;C:\Program Files\PuTTY\;C:\Program Files\Microsoft VS Code\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\AMD\ROCm\6.2\bin;C:\Program Files\NVIDIA Corporation\NVIDIA App\NvDLISR;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files\Docker\Docker\resources\bin;C:\Program Files\Git\cmd;C:\Program Files\nodejs\;C:\Program Files\NVIDIA Corporation\Nsight Compute 2025.3.1\;C:\Program 
Files\cursor\resources\app\bin;C:\Users\mswil\miniconda3;C:\Users\mswil\miniconda3\Library\mingw-w64\bin;C:\Users\mswil\miniconda3\Library\usr\bin;C:\Users\mswil\miniconda3\Library\bin;C:\Users\mswil\miniconda3\Scripts;C:\Users\mswil\AppData\Local\Programs\Python\Python311\Scripts\;C:\Users\mswil\AppData\Local\Programs\Python\Python311\;C:\Users\mswil\AppData\Local\Microsoft\WindowsApps;C:\Program Files\AMD\ROCm\6.2\bin;C:\Users\mswil\AppData\Local\Programs\Ollama;C:\Users\mswil\.lmstudio\bin;C:\Users\mswil\AppData\Local\Microsoft\WinGet\Packages\Gyan.FFmpeg_Microsoft.Winget.Source_8wekyb3d8bbwe\ffmpeg-8.0-full_build\bin;C:\Users\mswil\AppData\Roaming\npm;C:\Users\mswil\AppData\Local\Programs\Microsoft VS Code\bin" OLLAMA_LIBRARY_PATH=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12
time=2025-11-04T17:53:48.553Z level=INFO source=runner.go:1349 msg="starting ollama engine"
time=2025-11-04T17:53:48.554Z level=INFO source=runner.go:1384 msg="Server listening on 127.0.0.1:52532"
time=2025-11-04T17:53:48.562Z level=DEBUG source=gguf.go:590 msg=general.architecture type=string
time=2025-11-04T17:53:48.562Z level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
time=2025-11-04T17:53:48.562Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:53:48.563Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:53:48.563Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
time=2025-11-04T17:53:48.563Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
time=2025-11-04T17:53:48.563Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
time=2025-11-04T17:53:48.563Z level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
time=2025-11-04T17:53:48.563Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama
load_backend: loaded CPU backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\ggml-cpu-icelake.dll
time=2025-11-04T17:53:48.574Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ggml_cuda_init: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 5090, compute capability 12.0, VMM: yes, ID: GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156
load_backend: loaded CUDA backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12\ggml-cuda.dll
time=2025-11-04T17:53:48.656Z level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.AVX512=1 CPU.0.AVX512_VBMI=1 CPU.0.AVX512_VNNI=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 CUDA.0.ARCHS=500,520,600,610,700,750,800,860,890,900,1200 CUDA.0.USE_GRAPHS=1 CUDA.0.PEER_MAX_BATCH_SIZE=128 compiler=cgo(clang)
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
time=2025-11-04T17:53:48.656Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
time=2025-11-04T17:53:48.656Z level=DEBUG source=runner.go:1324 msg="dummy model load took" duration=93.7494ms
time=2025-11-04T17:53:48.722Z level=DEBUG source=runner.go:1329 msg="gathering device infos took" duration=66.1237ms
time=2025-11-04T17:53:48.722Z level=TRACE source=runner.go:501 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12]" devices="[{DeviceID:{ID:GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156 Library:CUDA} Name:CUDA0 Description:NVIDIA GeForce RTX 5090 FilteredID: Integrated:false PCIID:0000:01:00.0 TotalMemory:34190458880 FreeMemory:32339132416 ComputeMajor:12 ComputeMinor:0 DriverMajor:13 DriverMinor:0 LibraryPath:[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12]}]"
time=2025-11-04T17:53:48.722Z level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=206.3996ms OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12]" extra_envs=map[]
time=2025-11-04T17:53:48.722Z level=TRACE source=runner.go:474 msg="starting runner for device discovery" libDirs="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13]" extraEnvs=map[]
time=2025-11-04T17:53:48.723Z level=INFO source=server.go:400 msg="starting runner" cmd="C:\Users\mswil\AppData\Local\Programs\Ollama\ollama.exe runner --ollama-engine --port 52536"
time=2025-11-04T17:53:48.723Z level=DEBUG source=server.go:401 msg=subprocess CUDA_PATH="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8" CUDA_PATH_V12_9="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9" HIP_PATH="C:\Program Files\AMD\ROCm\6.2" HIP_PATH_64="C:\Program Files\AMD\ROCm\6.4\" OLLAMA_CONTEXT_LENGTH=16384 OLLAMA_DEBUG=2 OLLAMA_FLASH_ATTENTION=true OLLAMA_HOST=0.0.0.0 OLLAMA_KEEP_ALIVE=1h OLLAMA_MODELS=C:\AI-Models\llms OLLAMA_NUM_PARALLEL=4 OLLAMA_ORIGINS=* PATH="C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13;;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\libnvvp;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\libnvvp;C:\Program Files\Oculus\Support\oculus-runtime;c:\windows\system32;c:\windows;c:\windows\system32\wbem;c:\windows\system32\windowspowershell\v1.0\;c:\windows\system32\openssh\;c:\program files\microsoft vs code\bin;c:\program files\dotnet\;c:\program files\nvidia corporation\nvidia nvdlisr;C:\Program Files\PuTTY\;C:\Program Files\Microsoft VS Code\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\AMD\ROCm\6.2\bin;C:\Program Files\NVIDIA Corporation\NVIDIA App\NvDLISR;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files\Docker\Docker\resources\bin;C:\Program Files\Git\cmd;C:\Program Files\nodejs\;C:\Program Files\NVIDIA Corporation\Nsight Compute 2025.3.1\;C:\Program 
Files\cursor\resources\app\bin;C:\Users\mswil\miniconda3;C:\Users\mswil\miniconda3\Library\mingw-w64\bin;C:\Users\mswil\miniconda3\Library\usr\bin;C:\Users\mswil\miniconda3\Library\bin;C:\Users\mswil\miniconda3\Scripts;C:\Users\mswil\AppData\Local\Programs\Python\Python311\Scripts\;C:\Users\mswil\AppData\Local\Programs\Python\Python311\;C:\Users\mswil\AppData\Local\Microsoft\WindowsApps;C:\Program Files\AMD\ROCm\6.2\bin;C:\Users\mswil\AppData\Local\Programs\Ollama;C:\Users\mswil\.lmstudio\bin;C:\Users\mswil\AppData\Local\Microsoft\WinGet\Packages\Gyan.FFmpeg_Microsoft.Winget.Source_8wekyb3d8bbwe\ffmpeg-8.0-full_build\bin;C:\Users\mswil\AppData\Roaming\npm;C:\Users\mswil\AppData\Local\Programs\Microsoft VS Code\bin" OLLAMA_LIBRARY_PATH=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13
time=2025-11-04T17:53:48.752Z level=INFO source=runner.go:1349 msg="starting ollama engine"
time=2025-11-04T17:53:48.753Z level=INFO source=runner.go:1384 msg="Server listening on 127.0.0.1:52536"
time=2025-11-04T17:53:48.757Z level=DEBUG source=gguf.go:590 msg=general.architecture type=string
time=2025-11-04T17:53:48.757Z level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
time=2025-11-04T17:53:48.757Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:53:48.757Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:53:48.757Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
time=2025-11-04T17:53:48.757Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
time=2025-11-04T17:53:48.757Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
time=2025-11-04T17:53:48.757Z level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
time=2025-11-04T17:53:48.757Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama
load_backend: loaded CPU backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\ggml-cpu-icelake.dll
time=2025-11-04T17:53:48.768Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ggml_cuda_init: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 5090, compute capability 12.0, VMM: yes, ID: GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156
load_backend: loaded CUDA backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13\ggml-cuda.dll
time=2025-11-04T17:53:48.837Z level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.AVX512=1 CPU.0.AVX512_VBMI=1 CPU.0.AVX512_VNNI=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 CUDA.0.ARCHS=750,800,860,870,890,900,1000,1030,1100,1200,1210 CUDA.0.USE_GRAPHS=1 CUDA.0.PEER_MAX_BATCH_SIZE=128 compiler=cgo(clang)
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
time=2025-11-04T17:53:48.838Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
time=2025-11-04T17:53:48.838Z level=DEBUG source=runner.go:1324 msg="dummy model load took" duration=81.5617ms
time=2025-11-04T17:53:48.889Z level=DEBUG source=runner.go:1329 msg="gathering device infos took" duration=50.6568ms
time=2025-11-04T17:53:48.889Z level=TRACE source=runner.go:501 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13]" devices="[{DeviceID:{ID:GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156 Library:CUDA} Name:CUDA0 Description:NVIDIA GeForce RTX 5090 FilteredID: Integrated:false PCIID:0000:01:00.0 TotalMemory:34190458880 FreeMemory:32339132416 ComputeMajor:12 ComputeMinor:0 DriverMajor:13 DriverMinor:0 LibraryPath:[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13]}]"
time=2025-11-04T17:53:48.889Z level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=166.8204ms OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13]" extra_envs=map[]
time=2025-11-04T17:53:48.889Z level=TRACE source=runner.go:474 msg="starting runner for device discovery" libDirs="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]" extraEnvs=map[]
time=2025-11-04T17:53:48.890Z level=INFO source=server.go:400 msg="starting runner" cmd="C:\Users\mswil\AppData\Local\Programs\Ollama\ollama.exe runner --ollama-engine --port 52540"
time=2025-11-04T17:53:48.890Z level=DEBUG source=server.go:401 msg=subprocess CUDA_PATH="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8" CUDA_PATH_V12_9="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9" HIP_PATH="C:\Program Files\AMD\ROCm\6.2" HIP_PATH_64="C:\Program Files\AMD\ROCm\6.4\" OLLAMA_CONTEXT_LENGTH=16384 OLLAMA_DEBUG=2 OLLAMA_FLASH_ATTENTION=true OLLAMA_HOST=0.0.0.0 OLLAMA_KEEP_ALIVE=1h OLLAMA_MODELS=C:\AI-Models\llms OLLAMA_NUM_PARALLEL=4 OLLAMA_ORIGINS=* PATH="C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm;;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\libnvvp;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\libnvvp;C:\Program Files\Oculus\Support\oculus-runtime;c:\windows\system32;c:\windows;c:\windows\system32\wbem;c:\windows\system32\windowspowershell\v1.0\;c:\windows\system32\openssh\;c:\program files\microsoft vs code\bin;c:\program files\dotnet\;c:\program files\nvidia corporation\nvidia nvdlisr;C:\Program Files\PuTTY\;C:\Program Files\Microsoft VS Code\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\AMD\ROCm\6.2\bin;C:\Program Files\NVIDIA Corporation\NVIDIA App\NvDLISR;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files\Docker\Docker\resources\bin;C:\Program Files\Git\cmd;C:\Program Files\nodejs\;C:\Program Files\NVIDIA Corporation\Nsight Compute 2025.3.1\;C:\Program Files\cursor\resources\app\bin;C:\Users\mswil\miniconda3;C:\Users\mswil\miniconda3\Library\mingw-w64\bin;C:\Users\mswil\miniconda3\Library\usr\bin;C:\Users\mswil\miniconda3\Library\bin;C:\Users\mswil\miniconda3\Scripts;C:\Users\mswil\AppData\Local\Programs\Python\Python311\Scripts\;C:\Users\mswil\AppData\Local\Programs\Python\Python311\;C:\Users\mswil\AppData\Local\Microsoft\WindowsApps;C:\Program Files\AMD\ROCm\6.2\bin;C:\Users\mswil\AppData\Local\Programs\Ollama;C:\Users\mswil\.lmstudio\bin;C:\Users\mswil\AppData\Local\Microsoft\WinGet\Packages\Gyan.FFmpeg_Microsoft.Winget.Source_8wekyb3d8bbwe\ffmpeg-8.0-full_build\bin;C:\Users\mswil\AppData\Roaming\npm;C:\Users\mswil\AppData\Local\Programs\Microsoft VS Code\bin" OLLAMA_LIBRARY_PATH=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm
time=2025-11-04T17:53:48.921Z level=INFO source=runner.go:1349 msg="starting ollama engine"
time=2025-11-04T17:53:48.922Z level=INFO source=runner.go:1384 msg="Server listening on 127.0.0.1:52540"
time=2025-11-04T17:53:48.924Z level=DEBUG source=gguf.go:590 msg=general.architecture type=string
time=2025-11-04T17:53:48.924Z level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
time=2025-11-04T17:53:48.924Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:53:48.925Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:53:48.925Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
time=2025-11-04T17:53:48.925Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
time=2025-11-04T17:53:48.925Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
time=2025-11-04T17:53:48.925Z level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
time=2025-11-04T17:53:48.925Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama
load_backend: loaded CPU backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\ggml-cpu-icelake.dll
time=2025-11-04T17:53:48.935Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ggml_cuda_init: found 2 ROCm devices:
Device 0: AMD Radeon(TM) Graphics, gfx1036 (0x1036), VMM: no, Wave Size: 32, ID: 0
Device 1: AMD Radeon RX 7900 XTX, gfx1100 (0x1100), VMM: no, Wave Size: 32, ID: 1
load_backend: loaded ROCm backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm\ggml-hip.dll
time=2025-11-04T17:53:48.962Z level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.AVX512=1 CPU.0.AVX512_VBMI=1 CPU.0.AVX512_VNNI=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 ROCm.0.NO_VMM=1 ROCm.0.NO_PEER_COPY=1 ROCm.0.PEER_MAX_BATCH_SIZE=128 ROCm.1.NO_VMM=1 ROCm.1.NO_PEER_COPY=1 ROCm.1.PEER_MAX_BATCH_SIZE=128 compiler=cgo(clang)
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
time=2025-11-04T17:53:48.962Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
time=2025-11-04T17:53:48.962Z level=DEBUG source=runner.go:1324 msg="dummy model load took" duration=38.3935ms
time=2025-11-04T17:53:49.526Z level=DEBUG source=runner.go:1329 msg="gathering device infos took" duration=563.5102ms
time=2025-11-04T17:53:49.526Z level=TRACE source=runner.go:501 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]" devices="[{DeviceID:{ID:0 Library:ROCm} Name:ROCm0 Description:AMD Radeon(TM) Graphics FilteredID: Integrated:true PCIID:0000:13:00.0 TotalMemory:39639842816 FreeMemory:39481610240 ComputeMajor:16 ComputeMinor:54 DriverMajor:60342 DriverMinor:56 LibraryPath:[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]} {DeviceID:{ID:1 Library:ROCm} Name:ROCm1 Description:AMD Radeon RX 7900 XTX FilteredID: Integrated:false PCIID:0000:05:00.0 TotalMemory:25753026560 FreeMemory:25596264448 ComputeMajor:17 ComputeMinor:0 DriverMajor:60342 DriverMinor:56 LibraryPath:[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]}]"
time=2025-11-04T17:53:49.526Z level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=636.9031ms OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]" extra_envs=map[]
time=2025-11-04T17:53:49.526Z level=DEBUG source=runner.go:120 msg="evluating which if any devices to filter out" initial_count=4
time=2025-11-04T17:53:49.526Z level=DEBUG source=runner.go:132 msg="verifying GPU is supported" library=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12 description="NVIDIA GeForce RTX 5090" compute=12.0 id=GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156 pci_id=0000:01:00.0
time=2025-11-04T17:53:49.526Z level=DEBUG source=runner.go:132 msg="verifying GPU is supported" library=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13 description="NVIDIA GeForce RTX 5090" compute=12.0 id=GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156 pci_id=0000:01:00.0
time=2025-11-04T17:53:49.527Z level=TRACE source=runner.go:474 msg="starting runner for device discovery" libDirs="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12]" extraEnvs="map[CUDA_VISIBLE_DEVICES:GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156 GGML_CUDA_INIT:1]"
time=2025-11-04T17:53:49.527Z level=DEBUG source=runner.go:132 msg="verifying GPU is supported" library=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm description="AMD Radeon(TM) Graphics" compute=gfx1036 id=0 pci_id=0000:13:00.0
time=2025-11-04T17:53:49.527Z level=TRACE source=runner.go:474 msg="starting runner for device discovery" libDirs="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13]" extraEnvs="map[CUDA_VISIBLE_DEVICES:GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156 GGML_CUDA_INIT:1]"
time=2025-11-04T17:53:49.527Z level=DEBUG source=runner.go:132 msg="verifying GPU is supported" library=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm description="AMD Radeon RX 7900 XTX" compute=gfx1100 id=1 pci_id=0000:05:00.0
time=2025-11-04T17:53:49.527Z level=TRACE source=runner.go:474 msg="starting runner for device discovery" libDirs="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]" extraEnvs="map[GGML_CUDA_INIT:1 HIP_VISIBLE_DEVICES:0]"
time=2025-11-04T17:53:49.527Z level=TRACE source=runner.go:474 msg="starting runner for device discovery" libDirs="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]" extraEnvs="map[GGML_CUDA_INIT:1 HIP_VISIBLE_DEVICES:1]"
time=2025-11-04T17:53:49.528Z level=INFO source=server.go:400 msg="starting runner" cmd="C:\Users\mswil\AppData\Local\Programs\Ollama\ollama.exe runner --ollama-engine --port 52547"
time=2025-11-04T17:53:49.528Z level=DEBUG source=server.go:401 msg=subprocess CUDA_PATH="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8" CUDA_PATH_V12_9="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9" HIP_PATH="C:\Program Files\AMD\ROCm\6.2" HIP_PATH_64="C:\Program Files\AMD\ROCm\6.4\" OLLAMA_CONTEXT_LENGTH=16384 OLLAMA_DEBUG=2 OLLAMA_FLASH_ATTENTION=true OLLAMA_HOST=0.0.0.0 OLLAMA_KEEP_ALIVE=1h OLLAMA_MODELS=C:\AI-Models\llms OLLAMA_NUM_PARALLEL=4 OLLAMA_ORIGINS=* PATH="C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm;;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\libnvvp;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\libnvvp;C:\Program Files\Oculus\Support\oculus-runtime;c:\windows\system32;c:\windows;c:\windows\system32\wbem;c:\windows\system32\windowspowershell\v1.0\;c:\windows\system32\openssh\;c:\program files\microsoft vs code\bin;c:\program files\dotnet\;c:\program files\nvidia corporation\nvidia nvdlisr;C:\Program Files\PuTTY\;C:\Program Files\Microsoft VS Code\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\AMD\ROCm\6.2\bin;C:\Program Files\NVIDIA Corporation\NVIDIA App\NvDLISR;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files\Docker\Docker\resources\bin;C:\Program Files\Git\cmd;C:\Program Files\nodejs\;C:\Program Files\NVIDIA Corporation\Nsight Compute 2025.3.1\;C:\Program Files\cursor\resources\app\bin;C:\Users\mswil\miniconda3;C:\Users\mswil\miniconda3\Library\mingw-w64\bin;C:\Users\mswil\miniconda3\Library\usr\bin;C:\Users\mswil\miniconda3\Library\bin;C:\Users\mswil\miniconda3\Scripts;C:\Users\mswil\AppData\Local\Programs\Python\Python311\Scripts\;C:\Users\mswil\AppData\Local\Programs\Python\Python311\;C:\Users\mswil\AppData\Local\Microsoft\WindowsApps;C:\Program Files\AMD\ROCm\6.2\bin;C:\Users\mswil\AppData\Local\Programs\Ollama;C:\Users\mswil\.lmstudio\bin;C:\Users\mswil\AppData\Local\Microsoft\WinGet\Packages\Gyan.FFmpeg_Microsoft.Winget.Source_8wekyb3d8bbwe\ffmpeg-8.0-full_build\bin;C:\Users\mswil\AppData\Roaming\npm;C:\Users\mswil\AppData\Local\Programs\Microsoft VS Code\bin" OLLAMA_LIBRARY_PATH=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm GGML_CUDA_INIT=1 HIP_VISIBLE_DEVICES=0
time=2025-11-04T17:53:49.528Z level=INFO source=server.go:400 msg="starting runner" cmd="C:\Users\mswil\AppData\Local\Programs\Ollama\ollama.exe runner --ollama-engine --port 52545"
time=2025-11-04T17:53:49.528Z level=INFO source=server.go:400 msg="starting runner" cmd="C:\Users\mswil\AppData\Local\Programs\Ollama\ollama.exe runner --ollama-engine --port 52544"
time=2025-11-04T17:53:49.528Z level=DEBUG source=server.go:401 msg=subprocess CUDA_PATH="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8" CUDA_PATH_V12_9="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9" HIP_PATH="C:\Program Files\AMD\ROCm\6.2" HIP_PATH_64="C:\Program Files\AMD\ROCm\6.4\" OLLAMA_CONTEXT_LENGTH=16384 OLLAMA_DEBUG=2 OLLAMA_FLASH_ATTENTION=true OLLAMA_HOST=0.0.0.0 OLLAMA_KEEP_ALIVE=1h OLLAMA_MODELS=C:\AI-Models\llms OLLAMA_NUM_PARALLEL=4 OLLAMA_ORIGINS=* PATH="C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm;;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\libnvvp;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\libnvvp;C:\Program Files\Oculus\Support\oculus-runtime;c:\windows\system32;c:\windows;c:\windows\system32\wbem;c:\windows\system32\windowspowershell\v1.0\;c:\windows\system32\openssh\;c:\program files\microsoft vs code\bin;c:\program files\dotnet\;c:\program files\nvidia corporation\nvidia nvdlisr;C:\Program Files\PuTTY\;C:\Program Files\Microsoft VS Code\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\AMD\ROCm\6.2\bin;C:\Program Files\NVIDIA Corporation\NVIDIA App\NvDLISR;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files\Docker\Docker\resources\bin;C:\Program Files\Git\cmd;C:\Program Files\nodejs\;C:\Program Files\NVIDIA Corporation\Nsight Compute 2025.3.1\;C:\Program Files\cursor\resources\app\bin;C:\Users\mswil\miniconda3;C:\Users\mswil\miniconda3\Library\mingw-w64\bin;C:\Users\mswil\miniconda3\Library\usr\bin;C:\Users\mswil\miniconda3\Library\bin;C:\Users\mswil\miniconda3\Scripts;C:\Users\mswil\AppData\Local\Programs\Python\Python311\Scripts\;C:\Users\mswil\AppData\Local\Programs\Python\Python311\;C:\Users\mswil\AppData\Local\Microsoft\WindowsApps;C:\Program Files\AMD\ROCm\6.2\bin;C:\Users\mswil\AppData\Local\Programs\Ollama;C:\Users\mswil\.lmstudio\bin;C:\Users\mswil\AppData\Local\Microsoft\WinGet\Packages\Gyan.FFmpeg_Microsoft.Winget.Source_8wekyb3d8bbwe\ffmpeg-8.0-full_build\bin;C:\Users\mswil\AppData\Roaming\npm;C:\Users\mswil\AppData\Local\Programs\Microsoft VS Code\bin" OLLAMA_LIBRARY_PATH=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm GGML_CUDA_INIT=1 HIP_VISIBLE_DEVICES=1
time=2025-11-04T17:53:49.528Z level=DEBUG source=server.go:401 msg=subprocess CUDA_PATH="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8" CUDA_PATH_V12_9="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9" HIP_PATH="C:\Program Files\AMD\ROCm\6.2" HIP_PATH_64="C:\Program Files\AMD\ROCm\6.4\" OLLAMA_CONTEXT_LENGTH=16384 OLLAMA_DEBUG=2 OLLAMA_FLASH_ATTENTION=true OLLAMA_HOST=0.0.0.0 OLLAMA_KEEP_ALIVE=1h OLLAMA_MODELS=C:\AI-Models\llms OLLAMA_NUM_PARALLEL=4 OLLAMA_ORIGINS=* PATH="C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12;;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\libnvvp;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\libnvvp;C:\Program Files\Oculus\Support\oculus-runtime;c:\windows\system32;c:\windows;c:\windows\system32\wbem;c:\windows\system32\windowspowershell\v1.0\;c:\windows\system32\openssh\;c:\program files\microsoft vs code\bin;c:\program files\dotnet\;c:\program files\nvidia corporation\nvidia nvdlisr;C:\Program Files\PuTTY\;C:\Program Files\Microsoft VS Code\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\AMD\ROCm\6.2\bin;C:\Program Files\NVIDIA Corporation\NVIDIA App\NvDLISR;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files\Docker\Docker\resources\bin;C:\Program Files\Git\cmd;C:\Program Files\nodejs\;C:\Program Files\NVIDIA Corporation\Nsight Compute 2025.3.1\;C:\Program Files\cursor\resources\app\bin;C:\Users\mswil\miniconda3;C:\Users\mswil\miniconda3\Library\mingw-w64\bin;C:\Users\mswil\miniconda3\Library\usr\bin;C:\Users\mswil\miniconda3\Library\bin;C:\Users\mswil\miniconda3\Scripts;C:\Users\mswil\AppData\Local\Programs\Python\Python311\Scripts\;C:\Users\mswil\AppData\Local\Programs\Python\Python311\;C:\Users\mswil\AppData\Local\Microsoft\WindowsApps;C:\Program Files\AMD\ROCm\6.2\bin;C:\Users\mswil\AppData\Local\Programs\Ollama;C:\Users\mswil\.lmstudio\bin;C:\Users\mswil\AppData\Local\Microsoft\WinGet\Packages\Gyan.FFmpeg_Microsoft.Winget.Source_8wekyb3d8bbwe\ffmpeg-8.0-full_build\bin;C:\Users\mswil\AppData\Roaming\npm;C:\Users\mswil\AppData\Local\Programs\Microsoft VS Code\bin" OLLAMA_LIBRARY_PATH=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12 GGML_CUDA_INIT=1 CUDA_VISIBLE_DEVICES=GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156
time=2025-11-04T17:53:49.528Z level=INFO source=server.go:400 msg="starting runner" cmd="C:\Users\mswil\AppData\Local\Programs\Ollama\ollama.exe runner --ollama-engine --port 52546"
time=2025-11-04T17:53:49.529Z level=DEBUG source=server.go:401 msg=subprocess CUDA_PATH="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8" CUDA_PATH_V12_9="C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9" HIP_PATH="C:\Program Files\AMD\ROCm\6.2" HIP_PATH_64="C:\Program Files\AMD\ROCm\6.4\" OLLAMA_CONTEXT_LENGTH=16384 OLLAMA_DEBUG=2 OLLAMA_FLASH_ATTENTION=true OLLAMA_HOST=0.0.0.0 OLLAMA_KEEP_ALIVE=1h OLLAMA_MODELS=C:\AI-Models\llms OLLAMA_NUM_PARALLEL=4 OLLAMA_ORIGINS=* PATH="C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13;;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.9\libnvvp;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\bin;C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v12.8\libnvvp;C:\Program Files\Oculus\Support\oculus-runtime;c:\windows\system32;c:\windows;c:\windows\system32\wbem;c:\windows\system32\windowspowershell\v1.0\;c:\windows\system32\openssh\;c:\program files\microsoft vs code\bin;c:\program files\dotnet\;c:\program files\nvidia corporation\nvidia nvdlisr;C:\Program Files\PuTTY\;C:\Program Files\Microsoft VS Code\bin;C:\WINDOWS\system32;C:\WINDOWS;C:\WINDOWS\System32\Wbem;C:\WINDOWS\System32\WindowsPowerShell\v1.0\;C:\WINDOWS\System32\OpenSSH\;C:\Program Files\AMD\ROCm\6.2\bin;C:\Program Files\NVIDIA Corporation\NVIDIA App\NvDLISR;C:\Program Files (x86)\NVIDIA Corporation\PhysX\Common;C:\Program Files\Docker\Docker\resources\bin;C:\Program Files\Git\cmd;C:\Program Files\nodejs\;C:\Program Files\NVIDIA Corporation\Nsight Compute 2025.3.1\;C:\Program Files\cursor\resources\app\bin;C:\Users\mswil\miniconda3;C:\Users\mswil\miniconda3\Library\mingw-w64\bin;C:\Users\mswil\miniconda3\Library\usr\bin;C:\Users\mswil\miniconda3\Library\bin;C:\Users\mswil\miniconda3\Scripts;C:\Users\mswil\AppData\Local\Programs\Python\Python311\Scripts\;C:\Users\mswil\AppData\Local\Programs\Python\Python311\;C:\Users\mswil\AppData\Local\Microsoft\WindowsApps;C:\Program Files\AMD\ROCm\6.2\bin;C:\Users\mswil\AppData\Local\Programs\Ollama;C:\Users\mswil\.lmstudio\bin;C:\Users\mswil\AppData\Local\Microsoft\WinGet\Packages\Gyan.FFmpeg_Microsoft.Winget.Source_8wekyb3d8bbwe\ffmpeg-8.0-full_build\bin;C:\Users\mswil\AppData\Roaming\npm;C:\Users\mswil\AppData\Local\Programs\Microsoft VS Code\bin" OLLAMA_LIBRARY_PATH=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama;C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13 GGML_CUDA_INIT=1 CUDA_VISIBLE_DEVICES=GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156
time=2025-11-04T17:53:49.562Z level=INFO source=runner.go:1349 msg="starting ollama engine"
time=2025-11-04T17:53:49.562Z level=INFO source=runner.go:1349 msg="starting ollama engine"
time=2025-11-04T17:53:49.563Z level=INFO source=runner.go:1384 msg="Server listening on 127.0.0.1:52544"
time=2025-11-04T17:53:49.563Z level=INFO source=runner.go:1384 msg="Server listening on 127.0.0.1:52546"
time=2025-11-04T17:53:49.563Z level=INFO source=runner.go:1349 msg="starting ollama engine"
time=2025-11-04T17:53:49.564Z level=INFO source=runner.go:1384 msg="Server listening on 127.0.0.1:52547"
time=2025-11-04T17:53:49.563Z level=INFO source=runner.go:1349 msg="starting ollama engine"
time=2025-11-04T17:53:49.564Z level=INFO source=runner.go:1384 msg="Server listening on 127.0.0.1:52545"
time=2025-11-04T17:53:49.572Z level=DEBUG source=gguf.go:590 msg=general.architecture type=string
time=2025-11-04T17:53:49.572Z level=DEBUG source=gguf.go:590 msg=general.architecture type=string
time=2025-11-04T17:53:49.572Z level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
time=2025-11-04T17:53:49.572Z level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
time=2025-11-04T17:53:49.572Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:53:49.572Z level=DEBUG source=gguf.go:590 msg=general.architecture type=string
time=2025-11-04T17:53:49.572Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:53:49.572Z level=DEBUG source=gguf.go:590 msg=general.architecture type=string
time=2025-11-04T17:53:49.572Z level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
time=2025-11-04T17:53:49.572Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:53:49.572Z level=DEBUG source=gguf.go:590 msg=tokenizer.ggml.model type=string
time=2025-11-04T17:53:49.572Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
time=2025-11-04T17:53:49.573Z level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
time=2025-11-04T17:53:49.573Z level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
time=2025-11-04T17:53:49.573Z level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.alignment default=32
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.file_type default=0
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.name default=""
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=general.description default=""
time=2025-11-04T17:53:49.573Z level=INFO source=ggml.go:136 msg="" architecture=llama file_type=unknown name="" description="" num_tensors=0 num_key_values=3
time=2025-11-04T17:53:49.573Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama
load_backend: loaded CPU backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\ggml-cpu-icelake.dll
time=2025-11-04T17:53:49.586Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm
load_backend: loaded CPU backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\ggml-cpu-icelake.dll
load_backend: loaded CPU backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\ggml-cpu-icelake.dll
time=2025-11-04T17:53:49.587Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13
time=2025-11-04T17:53:49.587Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12
load_backend: loaded CPU backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\ggml-cpu-icelake.dll
time=2025-11-04T17:53:49.587Z level=DEBUG source=ggml.go:94 msg="ggml backend load all from path" path=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ggml_cuda_init: found 1 ROCm devices:
ggml_cuda_init: initializing rocBLAS on device 0
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ggml_cuda_init: found 1 ROCm devices:
ggml_cuda_init: initializing rocBLAS on device 0
rocBLAS error: Cannot read C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm/rocblas/library/TensileLibrary.dat: No such file or directory for GPU arch : gfx1036
List of available TensileLibrary Files :
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ggml_cuda_init: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 5090, compute capability 12.0, VMM: yes, ID: GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156
load_backend: loaded CUDA backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13\ggml-cuda.dll
time=2025-11-04T17:53:49.663Z level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.AVX512=1 CPU.0.AVX512_VBMI=1 CPU.0.AVX512_VNNI=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 CUDA.0.ARCHS=750,800,860,870,890,900,1000,1030,1100,1200,1210 CUDA.0.USE_GRAPHS=1 CUDA.0.PEER_MAX_BATCH_SIZE=128 compiler=cgo(clang)
time=2025-11-04T17:53:49.663Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:53:49.663Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
time=2025-11-04T17:53:49.663Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
time=2025-11-04T17:53:49.663Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
time=2025-11-04T17:53:49.663Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
time=2025-11-04T17:53:49.663Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
time=2025-11-04T17:53:49.663Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
time=2025-11-04T17:53:49.663Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
time=2025-11-04T17:53:49.663Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
time=2025-11-04T17:53:49.663Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
time=2025-11-04T17:53:49.663Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
time=2025-11-04T17:53:49.663Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
time=2025-11-04T17:53:49.663Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
time=2025-11-04T17:53:49.664Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:53:49.664Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
time=2025-11-04T17:53:49.664Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
time=2025-11-04T17:53:49.664Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
time=2025-11-04T17:53:49.664Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
time=2025-11-04T17:53:49.664Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
time=2025-11-04T17:53:49.664Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
time=2025-11-04T17:53:49.664Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
time=2025-11-04T17:53:49.664Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
time=2025-11-04T17:53:49.664Z level=DEBUG source=runner.go:1324 msg="dummy model load took" duration=92.461ms
ggml_cuda_init: GGML_CUDA_FORCE_MMQ: no
ggml_cuda_init: GGML_CUDA_FORCE_CUBLAS: no
ggml_cuda_init: found 1 CUDA devices:
Device 0: NVIDIA GeForce RTX 5090, compute capability 12.0, VMM: yes, ID: GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156
load_backend: loaded CUDA backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12\ggml-cuda.dll
time=2025-11-04T17:53:49.675Z level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.AVX512=1 CPU.0.AVX512_VBMI=1 CPU.0.AVX512_VNNI=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 CUDA.0.ARCHS=500,520,600,610,700,750,800,860,890,900,1200 CUDA.0.USE_GRAPHS=1 CUDA.0.PEER_MAX_BATCH_SIZE=128 compiler=cgo(clang)
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
time=2025-11-04T17:53:49.676Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
time=2025-11-04T17:53:49.676Z level=DEBUG source=runner.go:1324 msg="dummy model load took" duration=104.9632ms
time=2025-11-04T17:53:49.712Z level=TRACE source=runner.go:496 msg="runner exited" OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]" extra_envs="map[GGML_CUDA_INIT:1 HIP_VISIBLE_DEVICES:0]" code=3221226505
time=2025-11-04T17:53:49.713Z level=TRACE source=runner.go:501 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]" devices=[]
time=2025-11-04T17:53:49.713Z level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=186.0225ms OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]" extra_envs="map[GGML_CUDA_INIT:1 HIP_VISIBLE_DEVICES:0]"
time=2025-11-04T17:53:49.713Z level=DEBUG source=runner.go:158 msg="filtering device which didn't fully initialize" id=0 libdir=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm pci_id=0000:13:00.0 library=ROCm
time=2025-11-04T17:53:49.739Z level=DEBUG source=runner.go:1329 msg="gathering device infos took" duration=74.9147ms
time=2025-11-04T17:53:49.739Z level=TRACE source=runner.go:501 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13]" devices="[{DeviceID:{ID:GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156 Library:CUDA} Name:CUDA0 Description:NVIDIA GeForce RTX 5090 FilteredID: Integrated:false PCIID:0000:01:00.0 TotalMemory:34190458880 FreeMemory:32339132416 ComputeMajor:12 ComputeMinor:0 DriverMajor:13 DriverMinor:0 LibraryPath:[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13]}]"
time=2025-11-04T17:53:49.740Z level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=213.2045ms OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13]" extra_envs="map[CUDA_VISIBLE_DEVICES:GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156 GGML_CUDA_INIT:1]"
time=2025-11-04T17:53:49.749Z level=DEBUG source=runner.go:1329 msg="gathering device infos took" duration=72.913ms
time=2025-11-04T17:53:49.749Z level=TRACE source=runner.go:501 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12]" devices="[{DeviceID:{ID:GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156 Library:CUDA} Name:CUDA0 Description:NVIDIA GeForce RTX 5090 FilteredID: Integrated:false PCIID:0000:01:00.0 TotalMemory:34190458880 FreeMemory:32339132416 ComputeMajor:12 ComputeMinor:0 DriverMajor:13 DriverMinor:0 LibraryPath:[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12]}]"
time=2025-11-04T17:53:49.750Z level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=223.2055ms OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12]" extra_envs="map[CUDA_VISIBLE_DEVICES:GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156 GGML_CUDA_INIT:1]"
ggml_cuda_init: rocBLAS initialized on device 0
Device 0: AMD Radeon RX 7900 XTX, gfx1100 (0x1100), VMM: no, Wave Size: 32, ID: 0
load_backend: loaded ROCm backend from C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm\ggml-hip.dll
time=2025-11-04T17:53:50.868Z level=INFO source=ggml.go:104 msg=system CPU.0.SSE3=1 CPU.0.SSSE3=1 CPU.0.AVX=1 CPU.0.AVX2=1 CPU.0.F16C=1 CPU.0.FMA=1 CPU.0.BMI2=1 CPU.0.AVX512=1 CPU.0.AVX512_VBMI=1 CPU.0.AVX512_VNNI=1 CPU.0.LLAMAFILE=1 CPU.1.LLAMAFILE=1 ROCm.0.NO_VMM=1 ROCm.0.NO_PEER_COPY=1 ROCm.0.PEER_MAX_BATCH_SIZE=128 compiler=cgo(clang)
time=2025-11-04T17:53:50.868Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.pooling_type default=0
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.expert_count default=0
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.tokens default="&{size:0 values:[]}"
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.scores default="&{size:0 values:[]}"
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.token_type default="&{size:0 values:[]}"
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.merges default="&{size:0 values:[]}"
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_bos_token default=true
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.bos_token_id default=0
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.add_eos_token default=false
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_id default=0
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.eos_token_ids default="&{size:0 values:[]}"
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=tokenizer.ggml.pre default=""
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.block_count default=0
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.embedding_length default=0
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count default=0
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.head_count_kv default=0
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.key_length default=0
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.dimension_count default=0
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.attention.layer_norm_rms_epsilon default=0
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.freq_base default=100000
time=2025-11-04T17:53:50.869Z level=DEBUG source=ggml.go:276 msg="key with type not found" key=llama.rope.scaling.factor default=1
time=2025-11-04T17:53:50.869Z level=DEBUG source=runner.go:1324 msg="dummy model load took" duration=1.2975321s
time=2025-11-04T17:53:50.869Z level=DEBUG source=runner.go:1329 msg="gathering device infos took" duration=0s
time=2025-11-04T17:53:50.869Z level=TRACE source=runner.go:501 msg="runner enumerated devices" OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]" devices="[{DeviceID:{ID:0 Library:ROCm} Name:ROCm0 Description:AMD Radeon RX 7900 XTX FilteredID: Integrated:false PCIID:0000:05:00.0 TotalMemory:25753026560 FreeMemory:25369083904 ComputeMajor:17 ComputeMinor:0 DriverMajor:60342 DriverMinor:56 LibraryPath:[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]}]"
time=2025-11-04T17:53:50.869Z level=DEBUG source=runner.go:471 msg="bootstrap discovery took" duration=1.3428604s OLLAMA_LIBRARY_PATH="[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm]" extra_envs="map[GGML_CUDA_INIT:1 HIP_VISIBLE_DEVICES:1]"
time=2025-11-04T17:53:50.869Z level=TRACE source=runner.go:179 msg="supported GPU library combinations before filtering" supported="map[CUDA:map[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12:map[GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156:0] C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13:map[GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156:1]] ROCm:map[C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm:map[1:3]]]"
time=2025-11-04T17:53:50.869Z level=DEBUG source=runner.go:435 msg="filtering device with overlapping libraries" id=GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156 library=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12 delete_index=0 kept_library=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v13
time=2025-11-04T17:53:50.870Z level=TRACE source=runner.go:190 msg="removing unsupported or overlapping GPU combination" libDir=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\cuda_v12 description="NVIDIA GeForce RTX 5090" compute=12.0 pci_id=0000:01:00.0
time=2025-11-04T17:53:50.870Z level=TRACE source=runner.go:190 msg="removing unsupported or overlapping GPU combination" libDir=C:\Users\mswil\AppData\Local\Programs\Ollama\lib\ollama\rocm description="AMD Radeon(TM) Graphics" compute=gfx1036 pci_id=0000:13:00.0
time=2025-11-04T17:53:50.870Z level=DEBUG source=runner.go:41 msg="GPU bootstrap discovery took" duration=2.3549837s
time=2025-11-04T17:53:50.870Z level=INFO source=types.go:42 msg="inference compute" id=GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156 filtered_id="" library=CUDA compute=12.0 name=CUDA0 description="NVIDIA GeForce RTX 5090" libdirs=ollama,cuda_v13 driver=13.0 pci_id=0000:01:00.0 type=discrete total="31.8 GiB" available="30.1 GiB"
time=2025-11-04T17:53:50.870Z level=INFO source=types.go:42 msg="inference compute" id=0 filtered_id=1 library=ROCm compute=gfx1100 name=ROCm1 description="AMD Radeon RX 7900 XTX" libdirs=ollama,rocm driver=60342.56 pci_id=0000:05:00.0 type=discrete total="24.0 GiB" available="23.8 GiB"
@dhiltgen commented on GitHub (Nov 4, 2025):
Unfortunately, I believe what you're hitting is that ROCm implements "compatibility" logic and interprets `CUDA_VISIBLE_DEVICES` (see https://rocm.docs.amd.com/en/latest/conceptual/gpu-isolation.html#cuda-visible-devices), so by setting it to `-1` you're also disabling your AMD GPU.
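A possible workaround, assuming the goal is to hide the NVIDIA card without affecting the AMD card: scope each runtime's visibility variable separately instead of using `CUDA_VISIBLE_DEVICES=-1`. The device index and UUID below are placeholders taken from the log above; adjust them for your own system. PowerShell syntax is assumed here.

```shell
# Sketch only: instead of CUDA_VISIBLE_DEVICES=-1 (which ROCm's compatibility
# layer also interprets, hiding the AMD GPU too), scope each runtime separately.

# Expose only the AMD card (device index 0 here is a placeholder) to HIP/ROCm:
$env:HIP_VISIBLE_DEVICES = "0"

# And/or select the NVIDIA card explicitly by UUID on the CUDA side,
# rather than disabling all CUDA devices:
$env:CUDA_VISIBLE_DEVICES = "GPU-2ba428d6-e4cb-5172-42b7-a3f44e959156"
```

`HIP_VISIBLE_DEVICES` is read only by the HIP/ROCm runtime, so it avoids the cross-runtime interaction that `CUDA_VISIBLE_DEVICES` triggers.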