[GH-ISSUE #12189] LLM Download issue #8105

Closed
opened 2026-04-12 20:25:40 -05:00 by GiteaMirror · 10 comments
Owner

Originally created by @vishal5007 on GitHub (Sep 5, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12189

As a student I was trying to download the deepseek 6.7b version and it said it's not compatible, so I tried the 1.3b version. But when I try to download the 1.3b version, it automatically keeps downloading the 6.7b version; even if I try any normal conversation, it resumes the download of that file. I tried uninstalling Ollama but I'm still facing the same issue; I even tried deleting the system files, to no avail.

GiteaMirror added the "feature request" and "needs more info" labels 2026-04-12 20:25:40 -05:00

@rick-github commented on GitHub (Sep 5, 2025):

Do you mean deepseek-coder? There are no 1.3b or 6.7b versions of deepseek. What command are you running?


@vishal5007 commented on GitHub (Sep 5, 2025):

> What command are you running?

As a student I don't know much, but I just wanted an AI agent that can help me through my coding journey. I wanted to build my own AI that can be free. I'll just paste the command I gave:

ollama run deepseek-coder:6.7b

After this command, I just keep getting this as output for all my subsequent conversations: Ollama just goes on downloading the deepseek file.

If it's possible, please guide me in this journey of learning data science, a humble request.


@rick-github commented on GitHub (Sep 5, 2025):

> After this command I just keep getting this as output

What output?


@vishal5007 commented on GitHub (Sep 5, 2025):

> > After this command I just keep getting this as output
>
> What output?

It starts downloading deepseek 6.7b.
Even after I uninstalled and reinstalled, I get the same output: for every chat it just resumes the paused download.

Can we get in contact, brother? I really need help with learning.


@rick-github commented on GitHub (Sep 5, 2025):

> It starts downloading deepseek 6.7b

Yes, because that's what you asked it to do. Ollama is a local inference engine. When you run a model, ollama needs to download the model to your computer in order to run it.

````console
$ ollama run deepseek-coder:6.7b
pulling manifest
pulling 59bb50d8116b: 100% ▕████████████████████████▏ 3.8 GB
pulling a3a0e9449cb6: 100% ▕████████████████████████▏  13 KB
pulling 8893e08fa9f9: 100% ▕████████████████████████▏   59 B
pulling 8972a96b8ff1: 100% ▕████████████████████████▏  297 B
pulling 772f510b9558: 100% ▕████████████████████████▏  483 B
verifying sha256 digest
writing manifest
success
>>> hello
Hello! How can I assist you with your programming or computer science inquiries today?

>>> write a python script that says "hello world"
Sure, here is a simple Python script that prints "Hello World":

```python
print("Hello World")
```

When this script is run, it will output the string `Hello World` to the console.

>>> /bye
$
````

@vishal5007 commented on GitHub (Sep 5, 2025):

I know, but my laptop cannot handle the 6.7b parameters, so I tried going with fewer parameters, like the 1.3b, but even after giving the command for 1.3b it keeps downloading the 6.7b file.
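[Editor's note: for reference, explicitly pulling only the smaller tag and then checking what is actually on disk would look like the sketch below. It assumes the `ollama` CLI is installed and on `PATH`; it exits quietly if not.]

```shell
# Sketch: pull only the 1.3b tag, then list which tags are actually present.
# Assumes the ollama CLI is installed; exits quietly if it is not on PATH.
if ! command -v ollama >/dev/null 2>&1; then
  echo "ollama CLI not found"
  exit 0
fi
ollama pull deepseek-coder:1.3b   # the ~776 MB model, not the 3.8 GB 6.7b one
ollama list deepseek-coder        # verify which deepseek-coder tags exist locally
```

If `ollama list` still shows only `deepseek-coder:6.7b` after this, something other than the tag in the command is selecting the model.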


@rick-github commented on GitHub (Sep 5, 2025):

How do you know it's downloading the 6.7b model instead of the 1.3b?


@vishal5007 commented on GitHub (Sep 5, 2025):

The file size is 12 GB.


@rick-github commented on GitHub (Sep 5, 2025):

Neither of the models is 12GB.

```console
$ ollama list deepseek-coder
NAME                   ID              SIZE      MODIFIED
deepseek-coder:1.3b    3ddd2d3fc8d2    776 MB    About a minute ago
deepseek-coder:6.7b    ce298d984115    3.8 GB    9 minutes ago
```
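[Editor's note: one possible explanation for the 12 GB figure is that the models directory accumulates every blob ever pulled, so its total size can exceed any single model. A hedged way to check, assuming a default install location (the `OLLAMA_MODELS` environment variable can point elsewhere):]

```shell
# Report the total on-disk size of all ollama model data.
# Assumes the default models path when OLLAMA_MODELS is unset.
MODELS_DIR="${OLLAMA_MODELS:-$HOME/.ollama/models}"
if [ -d "$MODELS_DIR" ]; then
  du -sh "$MODELS_DIR"   # total across all pulled models, not one model's size
else
  echo "no models directory at $MODELS_DIR"
fi
```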

@vishal5007 commented on GitHub (Sep 5, 2025):

I'm trying to send an attachment but it's not uploading.

Reference: github-starred/ollama#8105