Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-06 19:08:59 -05:00)
[GH-ISSUE #18129] main v0.6.33: Verify Connection does not work for OpenAI-compatible connections #18504
Originally created by @rgaricano on GitHub (Oct 8, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/18129
Check Existing Issues
Installation Method
Git Clone
Open WebUI Version
v0.6.33
Ollama Version (if applicable)
No response
Operating System
Ubuntu 24.04
Browser (if applicable)
No response
Confirmation
Expected Behavior
Clicking the Verify Connection button should show an error or success toast.
Actual Behavior
The Verify Connection button for an OpenAI-compatible connection has no effect; it does not show any message or toast.
Steps to Reproduce
Add an OpenAI-compatible connection with an API key.
Click the Verify Connection button.
Logs & Screenshots
Browser Console:
Additional Information
The connection is added correctly and works, but Verify Connection does not.
WORKAROUND: If the headers field is set to {}, it works correctly.
@silentoplayz commented on GitHub (Oct 8, 2025):
I can confirm this issue on the dev branch. Below is the browser console error I receive in Firefox when clicking the Verify Connection button for an OpenAI API-compatible connection:
@dromeuf commented on GitHub (Oct 14, 2025):
In fact, I have the same symptom when checking the connection to an external API. But my old RAG chatbot MODELS no longer work either. If I use a local generator LLM, I have no problem, but if I use a generator LLM via an external connection, then it no longer works as before. Do you have the same problem? The query goes around in circles and produces no results. On the other hand, an important point is that I can still use these external OpenAI LLMs in the basic chat interface. The problem seems to only be in my RAG chatbot MODELS.
@dromeuf commented on GitHub (Oct 14, 2025):
Something else: when I use external Azure or DeepSeek, I now get the following error message:
This model's maximum context length is 128000 tokens. However, your messages resulted in 783487 tokens. Please reduce the length of the messages.
@rgaricano commented on GitHub (Oct 14, 2025):
Not related to this issue.
Your problem is that you are exceeding the maximum context length of the model/config; this usually happens when sending long or numerous documents, or images in base64.
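The overflow described above can be guarded against by estimating the prompt size and trimming old messages before sending. The sketch below uses the common (and crude) ~4-characters-per-token heuristic; names, the heuristic, and the trimming policy are illustrative assumptions, and a real implementation would use the model's tokenizer:

```python
# Rough, assumed sketch of keeping a message list under a model's
# context window. The 4-chars-per-token heuristic is a crude estimate;
# accurate counts require the model's actual tokenizer.
CHARS_PER_TOKEN = 4  # crude heuristic, not exact


def estimate_tokens(messages: list[dict]) -> int:
    """Very rough token estimate for a list of chat messages."""
    return sum(len(m.get("content", "")) for m in messages) // CHARS_PER_TOKEN


def trim_to_fit(messages: list[dict], limit: int) -> list[dict]:
    """Drop the oldest turns (keeping messages[0], e.g. a system prompt)
    until the estimated token count fits within `limit`."""
    trimmed = list(messages)
    while len(trimmed) > 1 and estimate_tokens(trimmed) > limit:
        del trimmed[1]  # drop the oldest non-system message
    return trimmed
```

In a RAG setup, the same overflow can also come from retrieved documents being stuffed into the prompt, which would explain why the error appears only in the custom chatbot MODELS and not in the plain chat.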
@dromeuf commented on GitHub (Oct 15, 2025):
The problem is that before updating to v0.6.33, the same query did not produce this effect and error message. Again, this works when I use the external LLM in the Open WebUI home chat, but not in my custom RAG chatbot MODELS.
@dromeuf commented on GitHub (Oct 18, 2025):
@rgaricano Version 0.6.34 has partially solved my reported problem. Checking the external API connection to an external LLM now works well in 0.6.34. However, since this new version I am forced to reselect the external LLM in the settings of my RAG chatbot agent for the user's request to work. So I have to reconfigure the LLM in the chatbot agent for it to work again as before. Thank you for your great work.