[GH-ISSUE #11777] Use Google ADK with Ollama or gpt-oss to generate responses from user input #69865

Closed
opened 2026-05-04 19:38:09 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @Chocolee-1024 on GitHub (Aug 7, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11777

What is the issue?

It seems that Ollama did not return any content in the response body, which caused json.loads() to fail when attempting to parse an empty string.

I tested the same setup with other models and did not encounter this issue — only gpt-oss:20b triggers the error.

![Image](https://github.com/user-attachments/assets/f09b7ef0-7ddb-44dc-bd7c-733a463d01a5)

Relevant log output

APIConnectionError
Message:
litellm.APIConnectionError: Expecting value: line 1 column 1 (char 0)
Traceback (most recent call last):
  File "/usr/lib/python3.13/site-packages/litellm/main.py", line 524, in acompletion
    response = await init_response
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/site-packages/litellm/llms/custom_httpx/llm_http_handler.py", line 238, in async_completion
    return provider_config.transform_response(
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^
        model=model,
        ^^^^^^^^^^^^
    ...<9 lines>...
        json_mode=json_mode,
        ^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/lib/python3.13/site-packages/litellm/llms/ollama/completion/transformation.py", line 261, in transform_response
    response_content = json.loads(response_json["response"])
  File "/usr/lib/python3.13/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ~~~~~~~~~~~~~~~~~~~~~~~^^^
  File "/usr/lib/python3.13/json/decoder.py", line 345, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ~~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.13/json/decoder.py", line 363, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
. Received Model Group=gpt-oss:20b
Available Model Group Fallbacks=None LiteLLM Retried: 1 times, LiteLLM Max Retries: 2
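The crash happens because `json.loads()` is handed an empty string. A defensive guard on the caller's side (a minimal sketch for illustration, not litellm's actual code) avoids the `JSONDecodeError` when the model returns an empty body:

```python
import json

def safe_parse(response_text: str):
    """Parse a model response body, tolerating an empty string.

    json.loads("") raises "Expecting value: line 1 column 1 (char 0)",
    which is exactly the error in the traceback above.
    """
    if not response_text.strip():
        # Empty body: return None instead of crashing, so the caller
        # can retry or fall back to unstructured output.
        return None
    return json.loads(response_text)
```

This only masks the symptom; the underlying issue is that gpt-oss returns no content when structured output is requested.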

OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-05-04 19:38:09 -05:00
Author
Owner

@rick-github commented on GitHub (Aug 7, 2025):

gpt-oss doesn't currently support structured outputs (json_mode). #11691
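Since the failure is triggered by requesting structured output, a workaround sketch (assuming a direct call to Ollama's /api/generate endpoint, with a hypothetical prompt) is to omit the `format` field entirely and accept free-text responses from gpt-oss:

```python
import json

# Build an /api/generate request without "format": "json". Including that
# field is what leads gpt-oss to return an empty response body.
payload = {
    "model": "gpt-oss:20b",
    "prompt": "List three primary colors.",
    "stream": False,
    # "format": "json",  # omitted: gpt-oss doesn't support structured outputs yet
}
body = json.dumps(payload)

# To send it (requires a local Ollama server):
#   import requests
#   r = requests.post("http://localhost:11434/api/generate", data=body)
```

In ADK/litellm terms this means not enabling JSON mode for this model until gpt-oss gains structured-output support.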

Author
Owner

@Chocolee-1024 commented on GitHub (Aug 7, 2025):

If it can't output a structured format (JSON), how does it implement function calling (tool calling)?

So, can it be used as an MCP host?

Author
Owner

@rick-github commented on GitHub (Aug 7, 2025):

> If it can't output a structured format (JSON), how does it implement function calling (tool calling)?

In ollama, tool use doesn't rely on structured outputs. Tool support was added in [0.3.0](https://github.com/ollama/ollama/releases/tag/v0.3.0), structured outputs in [0.5.0](https://github.com/ollama/ollama/releases/tag/v0.5.0).
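To illustrate the distinction: tool calling works by passing tool schemas in the request and reading `tool_calls` from the reply, without constraining the response body itself to JSON. A minimal sketch of an /api/chat request, assuming a local Ollama server and a hypothetical `get_weather` tool:

```python
import json

# Tool definitions ride alongside the messages; the model replies with a
# message whose tool_calls field names the function and its arguments.
# No "format": "json" is needed, so this path works with gpt-oss.
request = {
    "model": "gpt-oss:20b",
    "messages": [
        {"role": "user", "content": "What's the weather in Paris?"},
    ],
    "stream": False,
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
}
body = json.dumps(request)
# POST body to http://localhost:11434/api/chat, then inspect
# response["message"]["tool_calls"] for the requested function call.
```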

Author
Owner

@Chocolee-1024 commented on GitHub (Aug 7, 2025):

I understand, thank you very much. I'd also like to ask whether there are plans to implement structured output (json_mode) support. Thank you.

Author
Owner

@rick-github commented on GitHub (Aug 7, 2025):

Yes, structured outputs will be implemented for gpt-oss, although when is currently unknown.


Reference: github-starred/ollama#69865