[GH-ISSUE #11867] with gpt-oss:20b, with format, response is empty #33638

Closed
opened 2026-04-22 16:30:53 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @shaozi on GitHub (Aug 12, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11867

What is the issue?

When supplying `format` to the generate or chat API with the gpt-oss:20b model, the response is always empty.

curl -X POST http://localhost:11434/api/generate -H "Content-Type: application/json" -d '{
  "model": "gpt-oss:20b",
  "prompt": "Ollama is 22 years old and is busy saving the world. Respond using JSON",
  "stream": false,
  "format": {
    "type": "object",
    "properties": {
      "age": {
        "type": "integer"
      },
      "available": {
        "type": "boolean"
      }
    },
    "required": [
      "age",
      "available"
    ]
  }
}'

and the output: {"model":"gpt-oss:20b","created_at":"2025-08-12T15:49:03.363865Z","response":"","done":true,"done_reason":"stop","context":[...],"total_duration":12160488208,"load_duration":7345080125,"prompt_eval_count":85,"prompt_eval_duration":4330607833,"eval_count":12,"eval_duration":393131917}
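For completeness, the same request body the curl command sends can be sketched in Python (standard library only; this is a hypothetical reconstruction, and the actual POST is omitted since it requires a running Ollama server at `http://localhost:11434`):

```python
import json

# Build the same payload as the curl command above.
# Sending this to POST /api/generate on a local Ollama instance
# reproduces the empty "response" field reported in this issue.
payload = {
    "model": "gpt-oss:20b",
    "prompt": "Ollama is 22 years old and is busy saving the world. Respond using JSON",
    "stream": False,
    "format": {
        "type": "object",
        "properties": {
            "age": {"type": "integer"},
            "available": {"type": "boolean"},
        },
        "required": ["age", "available"],
    },
}

body = json.dumps(payload)
print(body)
```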

Relevant log output


OS

Ubuntu, macOS. I believe it is cross-platform.

GPU

Tested on different GPUs

CPU

No response

Ollama version

0.11.4

GiteaMirror added the bug label 2026-04-22 16:30:53 -05:00
Author
Owner

@rick-github commented on GitHub (Aug 12, 2025):

#11691

Author
Owner

@pdevine commented on GitHub (Aug 12, 2025):

Going to close this as a dupe.


Reference: github-starred/ollama#33638