[GH-ISSUE #18129] issue: main_V.0.6.33 Doesn't work verify connection openAI compatible #18504

Closed
opened 2026-04-20 00:44:06 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @rgaricano on GitHub (Oct 8, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/18129

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Git Clone

Open WebUI Version

v0.6.33

Ollama Version (if applicable)

No response

Operating System

Ubuntu 24.04

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

Clicking Verify Connection shows an error or success toast.

Actual Behavior

The Verify Connection button for an OpenAI-compatible connection has no effect; it does not show any message or toast.

Steps to Reproduce

Add an OpenAI-compatible connection and API key.
Click the Verify Connection button.
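For context, the verify step boils down to an authenticated request against the connection's models endpoint. The sketch below is a hypothetical illustration (the helper name, base URL, and key are placeholders, not Open WebUI code), assuming the standard OpenAI-compatible `/models` route:

```javascript
// Hypothetical helper illustrating what "Verify Connection" checks:
// an authenticated GET against the OpenAI-compatible /models endpoint.
const buildVerifyRequest = (baseUrl, apiKey) => ({
  // strip a trailing slash so we don't produce ".../v1//models"
  url: `${baseUrl.replace(/\/$/, '')}/models`,
  options: { headers: { Authorization: `Bearer ${apiKey}` } }
});

const req = buildVerifyRequest('https://api.example.com/v1/', 'sk-test');
console.log(req.url); // https://api.example.com/v1/models
// fetch(req.url, req.options) would then perform the actual check
```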

Logs & Screenshots

Browser Console:

VM18191:1 Uncaught (in promise) SyntaxError: Unexpected end of JSON input
    at JSON.parse (<anonymous>)
    at le (AddConnectionModal.svelte:97:20)
    at G (AddConnectionModal.svelte:114:4)
    at HTMLButtonElement.ae (AddConnectionModal.svelte:315:11)
le @ AddConnectionModal.svelte:97
G @ AddConnectionModal.svelte:114
ae @ AddConnectionModal.svelte:315

Image

Additional Information

The connection is correctly added and works, but Verify Connection does not.

WORKAROUND: If the header field is set to {}, it works correctly.
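The stack trace and the {} workaround together suggest the modal calls JSON.parse on the raw headers field, which throws "Unexpected end of JSON input" when the field is empty. A defensive parse along these lines would avoid the silent failure (this is an illustrative sketch with a hypothetical helper name, not the actual AddConnectionModal.svelte code):

```javascript
// Sketch: guard JSON.parse against an empty custom-headers field.
// JSON.parse('') throws "Unexpected end of JSON input", which matches
// the console error and explains why pre-filling '{}' works around it.
const parseHeaders = (raw) => {
  const trimmed = (raw ?? '').trim();
  if (trimmed === '') return {}; // empty field -> no custom headers
  try {
    return JSON.parse(trimmed);
  } catch (e) {
    console.error('Invalid headers JSON:', e);
    return null; // caller can show an error toast instead of throwing
  }
};

console.log(parseHeaders('')); // {}  -- an empty field no longer throws
console.log(parseHeaders('{"Authorization":"Bearer x"}'));
```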

GiteaMirror added the bug label 2026-04-20 00:44:06 -05:00

@silentoplayz commented on GitHub (Oct 8, 2025):

I can confirm this issue on the dev branch. Below is the browser console error I receive in Firefox when clicking the Verify Connection button for a newly added OpenAI API-compatible connection:

Uncaught (in promise) SyntaxError: JSON.parse: unexpected end of data at line 1 column 1 of the JSON data
    le AddConnectionModal.svelte:97
    G AddConnectionModal.svelte:114
    ae AddConnectionModal.svelte:315
    rt dom.js:361
    m AddConnectionModal.svelte:331
    m Tooltip.svelte:69
    m Tooltip.svelte:67
    ne Component.js:44
    m AddConnectionModal.svelte:303
    m Modal.svelte:106
    p Modal.svelte:82
    Zt scheduler.js:119
    vt scheduler.js:79
    promise callback*wt scheduler.js:20
    se Component.js:81
    ctx Component.js:139
    $ OpenAIConnection.svelte:106
    rt dom.js:361
    m OpenAIConnection.svelte:111
    m Tooltip.svelte:69
    m Tooltip.svelte:67
    ne Component.js:44
    m OpenAIConnection.svelte:113
    ne Component.js:44
    m Connections.svelte:266
    m Connections.svelte:287
    m Connections.svelte:239
    p Connections.svelte:214
    Zt scheduler.js:119
    vt scheduler.js:79
    promise callback*wt scheduler.js:20
    se Component.js:81
    ctx Component.js:139
    Tp Connections.svelte:156
    at utils.js:41
    ne Component.js:47
    vt scheduler.js:99
    promise callback*wt scheduler.js:20
    se Component.js:81
    ctx Component.js:139
    $$set root.svelte:69
    $set Component.js:507
    W client.js:1595
    Oe client.js:409
    Qn client.js:1941
    d Settings.svelte:122
AddConnectionModal.svelte:97:20

@dromeuf commented on GitHub (Oct 14, 2025):

In fact, I have the same symptom when checking the connection to an external API. But my old RAG chatbot MODELS no longer work either. If I use a local generator LLM, I have no problem, but if I use a generator LLM via an external connection, then it no longer works as before. Do you have the same problem? The query goes around in circles and produces no results. On the other hand, an important point is that I can still use these external OpenAI LLMs in the basic chat interface. The problem seems to only be in my RAG chatbot MODELS.


@dromeuf commented on GitHub (Oct 14, 2025):

Something else: when I use external Azure or DeepSeek, I now get the following error message: This model's maximum context length is 128000 tokens. However, your messages resulted in 783487 tokens. Please reduce the length of the messages.


@rgaricano commented on GitHub (Oct 14, 2025):

> something else when i use external azure or deepseek, so now i get the following error message : This model's maximum context length is 128000 tokens. However, your messages resulted in 783487 tokens. Please reduce the length of the messages.

Not related to this issue.
Your problem is that you are exceeding the max context length of the model/config. This usually happens when sending long or numerous documents, or images in base64.
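To see why base64 attachments blow past a context window so quickly, here is a rough back-of-the-envelope estimate (the ~4 characters per token figure is a crude heuristic, not a tokenizer measurement):

```javascript
// Rough illustration of why base64-encoded attachments can exceed a
// 128000-token context window. Assumption: ~4 characters per token.
const base64Length = (bytes) => Math.ceil(bytes / 3) * 4; // base64 expands ~4/3
const approxTokens = (chars) => Math.ceil(chars / 4);     // crude heuristic

const imageBytes = 2 * 1024 * 1024; // a single 2 MB image
const tokens = approxTokens(base64Length(imageBytes));
console.log(tokens); // 699051 -- far beyond the 128000-token limit
```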


@dromeuf commented on GitHub (Oct 15, 2025):

The problem is that before updating to 0.6.33, the same query did not produce this effect or error message. Again, it works when I use the external LLM in the Open WebUI home chat, but not in my custom RAG chatbot MODELS.


@dromeuf commented on GitHub (Oct 18, 2025):

@rgaricano Version 0.6.34 has partially solved my reported problem. Checking the external API connection to an external LLM now works well in 0.6.34. However, since this new version I am forced to reselect the external LLM in the settings of my RAG chatbot agent for the user's request to work. So I have to reconfigure the LLM in the chatbot agent for it to work again as before. Thank you for your great work.

Reference: github-starred/open-webui#18504