[GH-ISSUE #1380] Is it possible to add model and prompt params like max_tokens or temperature? #26489

Closed
opened 2026-04-22 02:47:21 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @TumblerWarren on GitHub (Dec 4, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1380

It would be great if I could set params like temperature and max_tokens.
Also, is it possible to turn off streaming?
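For reference, Ollama's REST API accepts these as per-request options. Below is a minimal sketch of a `/api/generate` request body; the model name and values are illustrative, and the payload is only constructed here, not sent:

```python
import json

# Sketch of an Ollama /api/generate request body.
# "options" carries model parameters: temperature controls sampling
# randomness, and num_predict caps the number of generated tokens
# (the max_tokens analogue). "stream": False requests a single JSON
# response instead of a stream of chunks.
payload = {
    "model": "llama2",                 # illustrative model name
    "prompt": "Why is the sky blue?",
    "stream": False,                   # disable streaming
    "options": {
        "temperature": 0.7,            # sampling temperature
        "num_predict": 256,            # max tokens to generate
    },
}

body = json.dumps(payload)
```

You would POST this body to `http://localhost:11434/api/generate` (the default local endpoint), e.g. with `curl` or `requests`.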

Author
Owner

@technovangelist commented on GitHub (Dec 4, 2023):

Yes, you can add custom models and set parameters like temperature and others. Take a look at https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md for more about setting parameters and adding models, and then https://ollama.ai/signup to sign up to push models. I will go ahead and close this issue now. If you think there is anything we left out, reopen it and we can address it. Thanks for being part of this great community.
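For concreteness, a minimal Modelfile along the lines of the linked docs might look like this (the base model name and parameter values are illustrative):

```
# Modelfile: bake default parameters into a custom model
FROM llama2
PARAMETER temperature 0.7
PARAMETER num_predict 256
```

You would then build it with `ollama create my-model -f Modelfile`, and every run of `my-model` uses those defaults unless overridden per request.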


Reference: github-starred/ollama#26489