[GH-ISSUE #4249] The model does not output correctly in Ollama, but it works fine in LM Studio. #64687

Open
opened 2026-05-03 18:30:33 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @vawterdada on GitHub (May 8, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4249

What is the issue?

I fine-tuned the model on a small data set and then loaded it into Ollama, where it outputs nonsense. If I load the same model into LM Studio, it works normally. What could be the reason for this, and is there anything that needs to be adjusted?
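A common cause of this symptom (gibberish in Ollama but normal output in LM Studio) is a missing or mismatched prompt template: LM Studio can pick up the chat template embedded in the GGUF metadata, while Ollama applies whatever `TEMPLATE` is set in the Modelfile used to import the model. As a hedged sketch only — the exact template must match the format the model was fine-tuned with, and the file paths and model names below are placeholders:

```
# Modelfile (sketch) — paths and template are assumptions, adjust to your model
FROM ./my-finetuned-model.gguf

# The TEMPLATE must match the prompt format used during fine-tuning.
# This example uses a generic system/prompt layout; replace it with the
# actual chat template your fine-tune expects (e.g. ChatML, Llama-2, etc.).
TEMPLATE """{{ .System }}
USER: {{ .Prompt }}
ASSISTANT: """

# A stop token matching the training format often also needs to be set.
PARAMETER stop "USER:"
```

The model is then imported with `ollama create my-finetuned -f Modelfile` and run with `ollama run my-finetuned`. If the template or stop tokens diverge from what the fine-tune saw during training, garbled output is a typical result.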

![image](https://github.com/ollama/ollama/assets/130421680/a1d34eae-a765-4de2-a959-87356eff89f9)
![image](https://github.com/ollama/ollama/assets/130421680/719fca9d-5949-4eaf-b5e7-df333a772f99)

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.1.34

GiteaMirror added the bug label 2026-05-03 18:30:33 -05:00

Reference: github-starred/ollama#64687