Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 11:28:35 -05:00)
[GH-ISSUE #16413] feat: OpenAPI tool server should be able to return plain text #17896
Originally created by @stoerr on GitHub (Aug 9, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/16413
Problem Description
I'd like to add an OpenAPI tool server that can return plain text for some actions. That makes perfect sense, for example, when retrieving text content: why would the tool server need to pack text content into JSON? But currently open-webui tries to parse the response as JSON and fails. OpenAI GPTs themselves have no problem with this; a GPT cooperates happily with my https://github.com/stoerr/CoDeveloperGPTengine , which simply returns plain text when you access a file with it.
(BTW: why does it parse the tool response at all? Isn't that just returned to the model?)
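The failure can be reproduced in isolation. A minimal sketch (not the actual open-webui code) of what an unconditional JSON parse does to a text/plain body:

```python
import json

# A hypothetical plain-text tool response, e.g. retrieved file content.
plain_body = "Hello, this is the retrieved file content."

try:
    # Unconditional JSON parse, as the issue describes open-webui doing.
    result = json.loads(plain_body)
    parse_failed = False
except json.JSONDecodeError:
    # A JSON decode error, presumably what the attached stacktrace.txt shows.
    result = None
    parse_failed = True
```

Any non-JSON body triggers the `JSONDecodeError` branch, so the tool call fails even though the server answered successfully.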
Desired Solution
OpenAPI tool servers should support return types other than JSON, especially plain text, and possibly more than that.
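One way the relaxation could look, as a sketch under my own assumptions rather than open-webui's actual code: branch on the response's declared Content-Type and only JSON-decode when the server says it is JSON.

```python
import json

def parse_tool_response(content_type: str, body: str):
    """Decode a tool-server response according to its declared Content-Type.

    JSON stays JSON; everything else (text/plain, text/markdown, ...) is
    passed through verbatim so the model sees the raw text.
    """
    media_type = content_type.split(";")[0].strip().lower()
    if media_type == "application/json" or media_type.endswith("+json"):
        return json.loads(body)
    return body
```

For example, `parse_tool_response("application/json", '{"a": 1}')` yields a dict, while `parse_tool_response("text/plain", "hello")` returns the string unchanged.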
Alternatives Considered
I could rewrite my tool server to return JSON instead of plain text, but that does not fit the problem, and I'd also expect the LLM to perform worse if I encode retrieved texts, documents and so forth as JSON. Especially if the payload itself is JSON: then the JSON gets encoded as a string within JSON? Ouch. :-)
Additional Context
The stacktrace.txt contains the stack trace produced by open-webui when it accesses a tool server that returns plain text. I reproduced this with the following node.js example, which offers both a JSON tool (which works) and a plain-text-returning tool (which currently does not work but should).
plaintextserver.js.txt
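The attached plaintextserver.js is not reproduced in the thread. As a rough Python stand-in (the route names come from the discussion below; everything else is assumed), a server exposing one JSON route and one text/plain route might look like:

```python
import json
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# Route table: one tool answers with JSON, the other with plain text.
def json_tool():
    return "application/json", json.dumps({"message": "hello from the JSON tool"})

def plaintext_tool():
    return "text/plain", "hello from the plain-text tool"

ROUTES = {"/jsontestrequest": json_tool, "/plaintexttestrequest": plaintext_tool}

class ToolHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        handler = ROUTES.get(self.path)
        if handler is None:
            self.send_error(404)
            return
        content_type, body = handler()
        data = body.encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", content_type)
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

def serve(port: int = 3000) -> None:
    # Blocks forever; run in a terminal, then register the port as a tool server.
    ThreadingHTTPServer(("127.0.0.1", port), ToolHandler).serve_forever()
```

Pointing open-webui at such a server reproduces the split behavior described below: the JSON route works, the text/plain route fails.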
@tjbck commented on GitHub (Aug 9, 2025):
It definitely should today, could you provide a screen recording of the issue?
@rgaricano commented on GitHub (Aug 9, 2025):
Those are MCP protocol specifications: https://modelcontextprotocol.io/docs/learn/architecture#data-layer-protocol
OpenAI tool/function calls also have that kind of protocol specification:
https://openai.github.io/openai-agents-js/guides/mcp/
https://platform.openai.com/docs/guides/function-calling#handling-function-calls
@stoerr commented on GitHub (Aug 10, 2025):
This is not about MCP - this is about using OpenAPI Tool Servers.
I ran the sample server plaintextserver.js and added it to the global tool servers (when I entered it into the user's tool servers, I somehow could never see the tool in the model):
I created a model:
Using jsontestrequest, which returns JSON, works fine, but not plaintexttestrequest, which returns text/plain; that is the problem I'm describing.
@rgaricano commented on GitHub (Aug 10, 2025):
I'm sure @tjbck can explain it better, but I think it's because a JSON response is always expected for tool servers:
30d0f8b1f6/backend/open_webui/utils/tools.py (L629)
30d0f8b1f6/backend/open_webui/utils/tools.py (L639)
and the request headers:
30d0f8b1f6/backend/open_webui/utils/tools.py (L611)
@stoerr commented on GitHub (Aug 11, 2025):
Right, Open WebUI expects the response to be JSON - that's the problem I'm complaining about. 😄
I don't see a reason for that, and, as I explained, in quite a few cases this is troublesome, even harmful. And returning plain text works fine with OpenAI GPTs; I happily used that when developing the CoDeveloper GPT Engine. If you look at the OpenAI function-calling spec that @rgaricano cited above: that says
So could you pretty please relax that restriction?
BTW: Why is the response even parsed and not just returned as is to the model? Is that processed somehow?
@tjbck commented on GitHub (Aug 11, 2025):
@stoerr should be addressed with
f890fe6901 in dev, testing wanted here!