[GH-ISSUE #10001] Improve compatibility with OpenAI structured outputs json_schema response format #6556

Closed
opened 2026-04-12 18:10:59 -05:00 by GiteaMirror · 2 comments

Originally created by @SuperPat45 on GitHub (Mar 26, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10001

What is the issue?

Can you improve the /chat/completions compatibility with OpenAI's json_schema response_format?

I tested with the gemma3 model, and this OpenAI syntax is ignored:

    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "result",
            "schema": {
                "additionalProperties": false,
                "type": "object",
                "required": [
                    "parm1"
                ],
                "properties": {
                    "parm1": {
....

Whereas Ollama only needs a simpler format parameter:

            "format": {
                "type": "object",
                "required": [
                    "parm1"
                ],
                "properties": {
                    "parm1": {
....

It should be easy to rewrite the request arriving at /chat/completions before relaying it to /api/chat.
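A minimal sketch of what such a rewrite could look like (the helper name is hypothetical; the key layout follows the two fragments above, and the json_object branch assumes Ollama's plain JSON mode):

```python
def translate_response_format(body: dict) -> dict:
    """Map an OpenAI-style 'response_format' onto Ollama's 'format' key.

    Illustrative only: real proxy code would also need to handle
    streaming, validation errors, and the 'strict' flag.
    """
    rf = body.pop("response_format", None)
    if isinstance(rf, dict):
        if rf.get("type") == "json_schema":
            # Ollama's /api/chat takes the bare JSON schema directly,
            # without the name/schema wrapper.
            body["format"] = rf["json_schema"]["schema"]
        elif rf.get("type") == "json_object":
            # Legacy JSON mode maps to format: "json".
            body["format"] = "json"
    return body
```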

Relevant log output


OS

Linux

GPU

Nvidia

CPU

AMD

Ollama version

0.6.2

GiteaMirror added the bug label 2026-04-12 18:10:59 -05:00

@rick-github commented on GitHub (Mar 27, 2025):

Can you provide more information about how the schemas are created and passed to ollama? I created a little script to test schema passing, and the wire traffic looks like the fragments you posted. Both schemas return results.

#!/usr/bin/env python3

from pydantic import BaseModel
from openai import OpenAI
from ollama import Client

ollama_url = "http://localhost:11434"
model = "gemma3:4b"

class result(BaseModel):
    parm1: str

messages=[
    {"role": "system", "content": "Extract the information."},
    {"role": "user", "content": "parmeter 1 is 'hello'."},
]

def openai():
  # OpenAI-compatible endpoint: the SDK wraps the Pydantic model in a
  # json_schema response_format.
  client = OpenAI(api_key="key", base_url=f"{ollama_url}/v1")
  completion = client.beta.chat.completions.parse(
      model=model,
      messages=messages,
      response_format=result,
  )

  r = completion.choices[0].message.parsed
  print(r)

def ollama():
  # Native endpoint: pass the bare JSON schema via the format parameter.
  ollama_client = Client(host=ollama_url)
  response = ollama_client.chat(
      model=model,
      messages=messages,
      format=result.model_json_schema())
  r = result.model_validate_json(response.message.content)
  print(r)

openai()
ollama()
$ strace -e sendto -s 1024 ./10001.py  2>&1 1>/dev/tty | grep -v POST | sed -ne 's/^[^"]*"\(.*\)"[^"]*$/\1/p' | tr \\n \\0 | xargs -0i@ printf @'\n' | jq '{"schema_format": (.response_format // .format)}' 
parm1='hello'
parm1='hello'
{
  "schema_format": {
    "type": "json_schema",
    "json_schema": {
      "schema": {
        "properties": {
          "parm1": {
            "title": "Parm1",
            "type": "string"
          }
        },
        "required": [
          "parm1"
        ],
        "title": "result",
        "type": "object",
        "additionalProperties": false
      },
      "name": "result",
      "strict": true
    }
  }
}
{
  "schema_format": {
    "properties": {
      "parm1": {
        "title": "Parm1",
        "type": "string"
      }
    },
    "required": [
      "parm1"
    ],
    "title": "result",
    "type": "object"
  }
}


@SuperPat45 commented on GitHub (Mar 27, 2025):

Sorry, the problem seems to be with Open WebUI.


Reference: github-starred/ollama#6556