[GH-ISSUE #362] How to get (log) conditional probability of next word given a context in Ollama? #62199

Closed
opened 2026-05-03 07:51:45 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @HeningWang on GitHub (Aug 16, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/362

Hi,

I'm new to Ollama. I'd like to get the (log) conditional probability of the next word given a context, like with other LLMs. I cannot find this usage in the tutorial or the API. I'd be thankful if anybody can help me with that. Sorry if this question is too basic or not appropriate for an issue.
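For context on what is being asked for: a model's raw output at each step is a vector of logits over the vocabulary, and the conditional log-probability P(next token | context) is obtained by applying a log-softmax to those logits. Ollama's API did not expose this at the time of the issue; the sketch below just illustrates the computation itself on made-up logits and a hypothetical three-word vocabulary.

```python
import math

def log_softmax(logits):
    # Convert raw logits into log-probabilities using the
    # numerically stable log-sum-exp trick.
    m = max(logits)
    lse = m + math.log(sum(math.exp(x - m) for x in logits))
    return [x - lse for x in logits]

# Hypothetical logits a model might emit for the next token
# after some context; the vocabulary here is invented for
# illustration, not taken from any real tokenizer.
vocab = ["cat", "dog", "the"]
logits = [2.0, 1.0, 0.1]

logprobs = dict(zip(vocab, log_softmax(logits)))
# P(next = w | context) is then exp(logprobs[w]), and the
# values exp(logprobs[w]) sum to 1 over the vocabulary.
```

Backends that do expose this (e.g. llama.cpp, or the OpenAI API's `logprobs` option) return exactly these per-token log-probabilities alongside the sampled token.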

Best

GiteaMirror added the feature request label 2026-05-03 07:51:45 -05:00

@charles-dyfis-net commented on GitHub (Aug 17, 2023):

I think this may be more of a feature request than a question -- looking at the API, I don't see any option to either do this, or to adjust weights for individual tokens.

As these features would both be necessary to use Ollama as an [LMTP](https://github.com/eth-sri/lmql/blob/main/src/lmql/models/lmtp/README.md) backend (a transport protocol intended to be suitable for tools that constrain an LLM's output to only consider outputs that match provided rules/patterns), there's certainly value to them.


@James4Ever0 commented on GitHub (Dec 20, 2023):

I think what I want is guided generation: access to the probability of every token at each generation step, so that I can choose the next token myself instead of always taking the one with the maximum probability.
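The guided-generation idea above amounts to masking the next-token distribution before sampling: discard tokens that a rule disallows, then pick from what remains. A minimal sketch, reusing invented logits and vocabulary (nothing here reflects Ollama's actual API):

```python
def constrained_pick(logits, vocab, allowed):
    # One guided-generation step: ignore tokens outside `allowed`,
    # then take the most probable remaining token. Returns None if
    # no allowed token exists.
    best = None
    for tok, logit in zip(vocab, logits):
        if tok in allowed and (best is None or logit > best[1]):
            best = (tok, logit)
    return best[0] if best else None

# Hypothetical per-step model output, for illustration only.
vocab = ["cat", "dog", "the"]
logits = [2.0, 1.0, 0.1]
# Unconstrained argmax would pick "cat"; a constraint excluding it
# forces the step to emit "dog" instead.
```

Tools like LMQL apply this kind of masking at every step, which is why they need the backend to expose per-token probabilities rather than only the final sampled text.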


@jmorganca commented on GitHub (Feb 20, 2024):

Closing for https://github.com/ollama/ollama/issues/2415 – thanks!
