[GH-ISSUE #12420] openai: don't return stream chunks with an empty list of choices #8249

Open
opened 2026-04-12 20:45:57 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @jmorganca on GitHub (Sep 26, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12420

What is the issue?

When using Ollama with VSCode, it will sometimes return a chat chunk with an empty list of `choices`. This causes tools like VSCode to fail:

```
Sorry, your request failed. Please try again. Request id: ab0e1553-b4d7-4068-a365-f5a815120fde

Reason: Response contained no choices.
```
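Until the server stops emitting such chunks, a client can defensively drop them. Below is a minimal sketch (not VSCode's actual code; the function name and SSE parsing are illustrative assumptions) that skips any streamed chat-completion chunk whose `choices` list is empty:

```python
import json

def filter_stream_chunks(raw_lines):
    """Yield parsed chat-completion chunks from SSE lines, skipping any
    whose 'choices' list is empty (hypothetical client-side workaround)."""
    for line in raw_lines:
        if not line.startswith("data: "):
            continue  # ignore non-data SSE lines (comments, blank keep-alives)
        payload = line[len("data: "):].strip()
        if payload == "[DONE]":
            break  # end-of-stream sentinel used by the OpenAI-compatible API
        chunk = json.loads(payload)
        if not chunk.get("choices"):
            continue  # drop the problematic empty-choices chunks
        yield chunk

# Example SSE lines as a client might receive them from the stream
lines = [
    'data: {"id":"1","choices":[{"delta":{"content":"Hi"}}]}',
    'data: {"id":"1","choices":[]}',  # chunk that triggers the failure
    'data: [DONE]',
]
chunks = list(filter_stream_chunks(lines))
```

The proper fix, per the issue title, is for the server simply not to emit chunks with an empty `choices` list in the first place.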

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-04-12 20:45:57 -05:00

Reference: github-starred/ollama#8249