The local deepseek r1 32b deployed by Ollama cannot respond properly. #3699

Closed
opened 2025-11-11 15:37:27 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @XWcode11 on GitHub (Feb 8, 2025).

Bug Report

I deployed DeepSeek R1 32B locally using Ollama, with the GGUF from https://huggingface.co/bartowski/DeepSeek-R1-Distill-Qwen-32B-GGUF. When I selected this model in Open WebUI, the responses were completely unrelated to the questions. Refreshing the webpage and restarting the conversation did not resolve the issue.
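One common cause of unrelated or garbled answers when importing a raw GGUF into Ollama is a missing or wrong chat template: without a `TEMPLATE` in the Modelfile, the model receives the prompt without its expected special tokens. A minimal Modelfile sketch is below; the GGUF filename, the tag name, and the exact template string are assumptions and should be checked against the model card on Hugging Face.

```
# Hypothetical Modelfile; filename and template are assumptions, not verified
FROM ./DeepSeek-R1-Distill-Qwen-32B-Q4_K_M.gguf
TEMPLATE """<｜begin▁of▁sentence｜>{{ .System }}<｜User｜>{{ .Prompt }}<｜Assistant｜>"""
PARAMETER stop "<｜begin▁of▁sentence｜>"
PARAMETER stop "<｜end▁of▁sentence｜>"
```

Then build and run the model with `ollama create deepseek-r1-32b -f Modelfile` followed by `ollama run deepseek-r1-32b`. Alternatively, pulling the official `deepseek-r1:32b` tag from the Ollama library ships with the template already configured.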


Installation Method

Docker

Environment

  • Open WebUI Version: v0.5.10

  • Ollama (if applicable): 0.5.7

  • Operating System: CentOS 8

Confirmation:

Expected Behavior:

Answer questions normally.

Actual Behavior:

When I selected this model in Open WebUI, the responses were completely unrelated to the questions.

![Image](https://github.com/user-attachments/assets/07b35663-f93b-4f48-b3e1-3defec8f9b56)
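To isolate whether the problem is in the model itself or in Open WebUI, it can help to query Ollama's HTTP API directly, bypassing the web UI. The sketch below assumes Ollama's default endpoint (`http://localhost:11434/api/generate`) and a model tag of `deepseek-r1:32b`; adjust the tag to whatever `ollama list` shows. It also strips the `<think>…</think>` reasoning block that DeepSeek-R1 distills emit before the final answer, since that block can make responses look off-topic in some clients.

```python
import json
import re
import urllib.request

# Default Ollama endpoint; adjust host/port if your deployment differs.
OLLAMA_URL = "http://localhost:11434/api/generate"


def strip_think(text: str) -> str:
    """Remove the <think>...</think> reasoning block that DeepSeek-R1
    distill models emit before their final answer."""
    return re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL).strip()


def ask(model: str, prompt: str) -> str:
    """Send a single non-streaming generate request to Ollama."""
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]


try:
    # Model tag "deepseek-r1:32b" is an assumption; use your actual tag.
    answer = ask("deepseek-r1:32b", "What is the capital of France?")
    print(strip_think(answer))
except OSError as exc:
    # Ollama not reachable (e.g. not running, or a different host/port).
    print(f"Could not reach Ollama: {exc}")
```

If the direct API call answers the question correctly, the issue is likely in the Open WebUI configuration (e.g. system prompt or connection settings); if it is equally unrelated, the model import itself is suspect.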


Reference: github-starred/open-webui#3699