[GH-ISSUE #4755] (windows) ollama model download will not keep on downloading when reopen ollama #2996

Open
opened 2026-04-12 13:23:30 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @waldolin on GitHub (May 31, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4755

The ollama model download will not resume when Ollama is reopened after being closed accidentally.

```
C:\Users\lin\AppData\Local\Ollama>ollama run gemma:7b
pulling manifest
pulling ef311de6af9d...  70% ▕███████████████████████████████████████                 ▏ 3.5 GB/5.0 GB  3.5 MB/s    7m9s
Error: Post "http://127.0.0.1:11434/api/show": dial tcp 127.0.0.1:11434: connectex: No connection could be made because the target machine actively refused it.
```
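(Editor's note: as far as I can tell, Ollama keeps partially downloaded layers on disk, so after restarting the app, simply re-running the pull should reuse the existing partial blobs rather than starting from zero, assuming they were not deleted. A sketch:

```
# After restarting Ollama, re-run the pull; layers already on disk
# should be reused instead of being downloaded again from scratch.
ollama pull gemma:7b
```
)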

By the way, can you help me figure out the following problem?

I have created a "Modelfile":

```
FROM D:\Users\lin\.cache\lm-studio\models\MaziyarPanahi\Meta-Llama-3-70B-Instruct-GGUF\Meta-Llama-3-70B-Instruct.Q3_K_M.gguf
FROM D:\Users\lin\.cache\lm-studio\models\mradermacher\CodeLlama3-8B-Python-GGUF\CodeLlama3-8B-Python.f16.gguf
FROM D:\Users\lin\.cache\lm-studio\models\nctu6\Llama3-TAIDE-LX-8B-Chat-Alpha1-GGUF\Llama3-TAIDE-LX-8B-Chat-Alpha1-Q3_K_S.gguf
```
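(Editor's note: per the Ollama Modelfile documentation, a Modelfile starts from a single base model, so a file with three FROM lines will not import three models. To import each GGUF separately, one Modelfile per model is the usual pattern; a minimal sketch using one of the paths above:

```
# Modelfile — one FROM per file
FROM D:\Users\lin\.cache\lm-studio\models\MaziyarPanahi\Meta-Llama-3-70B-Instruct-GGUF\Meta-Llama-3-70B-Instruct.Q3_K_M.gguf
```
)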

`ollama run example`

transferring model data

```
using existing layer sha256:ffc76ff74022adb94c91442a6eea9a19d3f3568afdc79f03b82b848ff32d81a8
using existing layer sha256:2fef7d258c60b8ef793960004a61f9f0b87723e7ecbc610221efc0cdbe0bc46a
using existing layer sha256:e03488e99c59505264d1f0ff0fc33559e0ea5cd2c05744afbcf9bb485ad82e86
creating new layer sha256:7ca37b96018a295573217abe25dbc2f74318ae156f00cd457c322d0c37f94cc5
writing manifest
success
```

after running `ollama create example -f Modelfile`.
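(Editor's note: to make the order of operations explicit, the usual import workflow is create first, then run — a sketch, assuming a Modelfile in the current directory:

```
# Build the model from the Modelfile, then start a chat with it.
ollama create example -f Modelfile
ollama run example
```
)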

The import guide is hard to understand:
"https://github.com/ollama/ollama/blob/main/docs/import.md"

I have downloaded some files, such as Meta-Llama-3-70B-Instruct.Q3_K_M.gguf.
How do I set the path, or configure Ollama to use them from a common shared folder, instead of creating a new file?

Should I create it here?
"C:\Users\lin\AppData\Local\Ollama"
How do I change the model location from "C:\Users\lin\.ollama\models" to
"C:\Users\lin\.cache\lm-studio\models"?

What OS are you running the ollama server on? Windows 11 23H2
What version of Ollama are you using? ollama version is 0.1.39

GiteaMirror added the feature request label 2026-04-12 13:23:30 -05:00
Reference: github-starred/ollama#2996