[GH-ISSUE #3120] Ollama cannot open models with unicode in the filepath #48432

Closed
opened 2026-04-28 08:17:56 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @BruceMacD on GitHub (Mar 13, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3120

Originally assigned to: @dhiltgen on GitHub.

Tracking this issue here, split from #2753

```
time=2024-02-26T00:11:49.314+01:00 level=INFO source=dyn_ext_server.go:90 msg="Loading Dynamic llm server: C:\\Users\\ELJKO~1\\AppData\\Local\\Temp\\ollama816527122\\cpu_avx2\\ext_server.dll"
time=2024-02-26T00:11:49.314+01:00 level=INFO source=dyn_ext_server.go:150 msg="Initializing llama server"
llama_model_load: error loading model: failed to open C:\Users\Željko\.ollama\models\blobs\sha256-8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef9246: No such file or directory
llama_load_model_from_file: failed to load model
llama_init_from_gpt_params: error: failed to load model 'C:\Users\Željko\.ollama\models\blobs\sha256-8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef9246'
{"timestamp":1708902709,"level":"ERROR","function":"load_model","line":388,"message":"unable to load model","model":"C:\\Users\\Željko\\.ollama\\models\\blobs\\sha256-8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef9246"}
time=2024-02-26T00:11:49.314+01:00 level=WARN source=llm.go:162 msg="Failed to load dynamic library C:\\Users\\ELJKO~1\\AppData\\Local\\Temp\\ollama816527122\\cpu_avx2\\ext_server.dll  error loading model C:\\Users\\Željko\\.ollama\\models\\blobs\\sha256-8934d96d3f08982e95922b2b7a2c626a1fe873d7c3b06e8e56d7bc0a1fef92"
[GIN] 2024/02/26 - 00:11:49 | 500 |    332.6058ms |       127.0.0.1 | POST     "/api/chat"
[GIN] 2024/02/26 - 00:15:10 | 200 |         509µs |       127.0.0.1 | GET      "/api/version"
```

related llama.cpp fix:
https://github.com/ggerganov/llama.cpp/pull/5927

GiteaMirror added the bug label 2026-04-28 08:17:56 -05:00