[GH-ISSUE #10556] deepseek v3.1 support #6947

Closed
opened 2026-04-12 18:50:26 -05:00 by GiteaMirror · 5 comments

Originally created by @olumolu on GitHub (May 4, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10556

V3.1 launched a month ago but there is still no support, though the model's architecture is similar.
Kindly support v3.1.
https://huggingface.co/deepseek-ai/DeepSeek-V3-0324

GiteaMirror added the model label 2026-04-12 18:50:26 -05:00

@wrapss commented on GitHub (May 4, 2025):

wdym? just run ollama run hf.co/unsloth/DeepSeek-V3-0324-GGUF:Q4_K_M and it works


@rick-github commented on GitHub (May 4, 2025):

> just run ollama run hf.co/unsloth/DeepSeek-V3-0324-GGUF:Q4_K_M and it works

This does not work. The model is sharded, which is not yet supported in ollama.


@misterjice commented on GitHub (May 5, 2025):

> wdym? just run ollama run hf.co/unsloth/DeepSeek-V3-0324-GGUF:Q4_K_M and it works

Can confirm @rick-github's comment. I ran this in WSL2:

ollama run hf.co/unsloth/DeepSeek-V3-0324-GGUF:Q4_K_M
pulling manifest
Error: pull model manifest: 400: {"error":"The specified repository contains sharded GGUF. Ollama does not support this yet. Follow this issue for more info: https://github.com/ollama/ollama/issues/5245"}

This issue seems to relate to sharded GGUF files generally, not to the DeepSeek-V3 model specifically. It seems the ollama devs are aware of this; hence their inclusion of the GitHub issue URL in their code.
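For anyone wanting to check a repository up front before `ollama run` rejects it: llama.cpp's `gguf-split` tool names shards with a `-00001-of-00009.gguf`-style suffix, so a quick filename test can flag sharded repos. This is a minimal sketch (the example filenames are hypothetical), assuming the repo follows that naming convention:

```python
import re

# llama.cpp's gguf-split writes shards named like "model-00001-of-00009.gguf".
SHARD_RE = re.compile(r"-(\d{5})-of-(\d{5})\.gguf$")

def is_sharded(filename: str) -> bool:
    """Return True if the filename follows the sharded-GGUF naming scheme."""
    return SHARD_RE.search(filename) is not None

print(is_sharded("DeepSeek-V3-0324-Q4_K_M-00001-of-00009.gguf"))  # True
print(is_sharded("DeepSeek-V3-0324-Q4_K_M.gguf"))                 # False
```

One could run this over the file listing of a Hugging Face repo before pulling; if any name matches, the `400` error above is expected until ollama adds sharded-GGUF support.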


@olumolu commented on GitHub (May 17, 2025):

Any idea when to expect support for v3.1?


@rick-github commented on GitHub (Sep 1, 2025):

https://github.com/ollama/ollama/releases/tag/v0.11.7


Reference: github-starred/ollama#6947