[GH-ISSUE #10000] How do I make a model simply complete my text? #6555

Closed
opened 2026-04-12 18:10:51 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @Explosion-Scratch on GitHub (Mar 26, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10000

I've been playing around with Modelfiles for a while and I can't get the model to simply complete my input. How can I do this?

GiteaMirror added the question label 2026-04-12 18:10:51 -05:00

@Explosion-Scratch commented on GitHub (Mar 26, 2025):

WHOOA


@Explosion-Scratch commented on GitHub (Mar 26, 2025):

Issue 10000


@rick-github commented on GitHub (Mar 26, 2025):

Don't use a finetuned model; use a base or text model. For example, [llama3.2:3b-text-q4_K_M](https://ollama.com/library/llama3.2:3b-text-q4_K_M) instead of [llama3.2:3b-instruct-q4_K_M](https://ollama.com/library/llama3.2:3b-instruct-q4_K_M). Note that most models in the Ollama library are finetuned or instruct models, so you might need to go to Hugging Face to get the base model.


@Explosion-Scratch commented on GitHub (Mar 26, 2025):

> Don't use a finetuned model; use a base or text model. For example, [llama3.2:3b-text-q4_K_M](https://ollama.com/library/llama3.2:3b-text-q4_K_M) instead of [llama3.2:3b-instruct-q4_K_M](https://ollama.com/library/llama3.2:3b-instruct-q4_K_M). Note that most models in the Ollama library are finetuned or instruct models, so you might need to go to Hugging Face to get the base model.

Thanks. Then what should my Modelfile look like?


@rick-github commented on GitHub (Mar 26, 2025):

Ollama models come with a Modelfile, so there's no need to create a new one unless you want to tweak parameters like `temperature`. If that's the case, check the [doc](https://github.com/ollama/ollama/blob/main/docs/modelfile.md).

```console
$ ollama show --modelfile llama3.2:3b-text-q4_K_M
# Modelfile generated by "ollama show"
# To build a new Modelfile based on this, replace FROM with:
# FROM llama3.2:3b-text-q4_K_M

FROM /root/.ollama/models/blobs/sha256-e794be6824b360baedfb10c2f0f277f3ae0c3739539f3e37b0e3836688ec1831
TEMPLATE {{ .Prompt }}
LICENSE "LLAMA 3.2 COMMUNITY LICENSE AGREEMENT
Llama 3.2 Version Release Date: September 25, 2024
...
```
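The `TEMPLATE {{ .Prompt }}` line above is what makes this a pure completion model: the prompt is passed through verbatim, with no chat wrapping. If you do want a custom Modelfile that bakes in parameters, a minimal sketch could look like this (the model tag and values are illustrative, not from the thread):

```
FROM llama3.2:3b-text-q4_K_M
TEMPLATE {{ .Prompt }}
PARAMETER temperature 0
PARAMETER stop "\n"
```

You would then build it with `ollama create my-completer -f Modelfile` and run it like any other model.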

But you can also set parameters in the API call.

```console
$ curl -s localhost:11434/api/generate -d '{
  "model":"llama3.2:3b-text-q4_K_M",
  "prompt":"The sky appears blue because of a phenomenon called Rayleigh ",
  "options":{"temperature":0,"stop":["\n"]},"stream":false}' | jq .response
" scattering. This is the scattering of light by particles in the atmosphere that are smaller than the wavelength of visible light."
```

@Explosion-Scratch commented on GitHub (Mar 27, 2025):

Ok! Thanks so much!

Reference: github-starred/ollama#6555