[GH-ISSUE #13638] Logprobs not returned from Ollama Cloud API (ollama.com) #71030

Closed
opened 2026-05-04 23:48:58 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @romannekrasovaillm on GitHub (Jan 7, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/13638

### What is the issue?

### Description

When using Ollama Cloud (`https://ollama.com`) with the `logprobs: true` parameter, the API accepts the request but returns `null` for the `logprobs` field.

Tested with both:

  • Native API: POST /api/chat
  • OpenAI-compatible API: POST /v1/chat/completions

### Reproduction

```bash
curl 'https://ollama.com/api/chat' \
  -H "Authorization: Bearer $OLLAMA_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-v3.2",
    "messages": [{"role": "user", "content": "What is 2+2?"}],
    "logprobs": true,
    "top_logprobs": 5,
    "stream": false
  }'
```
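The missing field can be checked mechanically. The sketch below uses a hypothetical helper (`has_logprobs`, not part of any Ollama client library) to compare the response actually returned against the documented shape; the logprob values in `expected` are illustrative only.

```python
def has_logprobs(response: dict) -> bool:
    """True only if the response carries a non-empty logprobs payload."""
    return bool(response.get("logprobs"))

# Response actually returned by Ollama Cloud (from this report):
actual = {
    "model": "deepseek-v3.2",
    "message": {"role": "assistant", "content": "4"},
    "done": True,
}

# Shape expected per https://docs.ollama.com/api/chat (illustrative values):
expected = {
    "model": "deepseek-v3.2",
    "message": {"role": "assistant", "content": "4"},
    "done": True,
    "logprobs": [
        {"token": "4", "logprob": -0.01,
         "top_logprobs": [{"token": "4", "logprob": -0.01}]}
    ],
}

print(has_logprobs(actual))    # False — the reported bug
print(has_logprobs(expected))  # True
```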

### Expected behavior

The response should include a `logprobs` array with token probabilities, as documented at https://docs.ollama.com/api/chat.

### Actual behavior

The response contains no `logprobs` field (or `"logprobs": null`):

```json
{
  "model": "deepseek-v3.2",
  "message": {"role": "assistant", "content": "4"},
  "done": true
}
```

### Models tested

  • deepseek-v3.2
  • gpt-oss:120b-cloud
  • nemotron-3-nano:30b-cloud

All return the same result: no logprobs.

### Question

Is `logprobs` intentionally disabled for Ollama Cloud, or is this a bug?
If it is disabled, please document the limitation.

### Environment

Ollama Cloud (ollama.com)
Date: January 2026

### Relevant log output

```shell
```
### OS

Linux

### GPU

_No response_

### CPU

_No response_

### Ollama version

0.13.5

GiteaMirror added the cloud, bug labels 2026-05-04 23:48:58 -05:00
Author
Owner

@ParthSareen commented on GitHub (Jan 11, 2026):

Hi @romannekrasovaillm - we currently only support logprobs from local models

Author
Owner

@kpe commented on GitHub (Mar 10, 2026):

looks like logprobs are only available in the generate api not the completions api
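If the observation above holds, the same prompt could be retried against `/api/generate`. A minimal sketch of that request payload, assuming the generate endpoint accepts the same `logprobs`/`top_logprobs` options as `/api/chat` (this is an assumption, not confirmed for Ollama Cloud):

```python
import json

# Assumed payload for POST https://ollama.com/api/generate — the
# logprobs/top_logprobs options mirror the /api/chat reproduction above,
# but their support on this endpoint is unverified.
payload = {
    "model": "deepseek-v3.2",
    "prompt": "What is 2+2?",
    "logprobs": True,
    "top_logprobs": 5,
    "stream": False,
}

body = json.dumps(payload)
print(body)
```

The serialized body can be sent with the same curl headers as the reproduction above, swapping only the endpoint path.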


Reference: github-starred/ollama#71030