mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-07 11:28:35 -05:00
[GH-ISSUE #12441] issue: Bad Request (400 no Body) from OpenAI Chat Completion API Endpoint in n8n-Tool-Integration #32132
Originally created by @floriankick on GitHub (Apr 4, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/12441
Check Existing Issues
Installation Method
Docker
Open WebUI Version
0.6.0
Ollama Version (if applicable)
0.6.2
Operating System
Linux
Browser (if applicable)
No response
Confirmation
Expected Behavior
The API endpoint /ollama/v1/chat/completions should be compatible with the corresponding endpoint of the Ollama instance.
Actual Behavior
The Open WebUI API endpoint /ollama/v1/chat/completions rejects a message whose content field is null, whereas the corresponding Ollama API endpoint /v1/chat/completions handles the same request without error.
This causes the n8n OpenAI Chat Model node to fail on the second request after calling a tool, because the assistant's tool-call message (sent as part of the conversation history within the context window) has a null content field.
Steps to Reproduce
Send a request to http://webui.instance/ollama/v1/chat/completions with the following body:
Use any available model; the choice doesn't matter, because the request is rejected before it reaches the model.
For reference, send the same request to http://ollama.instance/v1/chat/completions, which works as expected.
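The original request body was not preserved here. As a minimal sketch of a payload matching the description (an assistant tool-call message with `content: null` in the history), something like the following could be POSTed to the endpoint — the model name, tool name, and arguments are illustrative placeholders, not taken from the original report:

```python
import json

# Hypothetical minimal payload: the assistant's tool-call message in the
# conversation history has content set to None, which serializes to JSON
# null -- the field the report says Ollama accepts but the Open WebUI
# /ollama/v1/chat/completions passthrough rejects with a 400.
payload = {
    "model": "llama3",  # any available model; the request fails before reaching it
    "messages": [
        {"role": "user", "content": "What is the weather in Berlin?"},
        {
            "role": "assistant",
            "content": None,  # serialized as null -- triggers the 400
            "tool_calls": [
                {
                    "id": "call_0",
                    "type": "function",
                    "function": {
                        "name": "get_weather",
                        "arguments": '{"city": "Berlin"}',
                    },
                }
            ],
        },
        {"role": "tool", "tool_call_id": "call_0", "content": "15 C, cloudy"},
    ],
}

body = json.dumps(payload)
print(body)
```

Sending this body to /ollama/v1/chat/completions should reproduce the 400, while the same body sent directly to the Ollama instance's /v1/chat/completions is accepted.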
Logs & Screenshots
Docker logs:
Additional Information
No response
@gaby commented on GitHub (Apr 4, 2025):
@floriankick I think that's the wrong endpoint. You should use
https://yourhostname/api
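A sketch of the suggested change: target Open WebUI's own OpenAI-compatible API under /api instead of the /ollama passthrough. The exact path /api/chat/completions and the bearer-token header are assumptions based on common Open WebUI setups, not confirmed in this thread; the request is only constructed below, not sent:

```python
import json
import urllib.request

# Assumed base URL from the comment above; hostname is a placeholder.
base = "https://yourhostname/api"

payload = {
    "model": "llama3",  # placeholder model name
    "messages": [{"role": "user", "content": "Hello"}],
}

# Build (but do not send) the POST request against the assumed
# /api/chat/completions route with an assumed bearer-token header.
req = urllib.request.Request(
    url=f"{base}/chat/completions",
    data=json.dumps(payload).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer YOUR_API_KEY",  # placeholder token
    },
    method="POST",
)
print(req.full_url)
```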