[GH-ISSUE #16223] issue: Network error but model still streams text #17829

Closed
opened 2026-04-19 23:43:16 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @OracleToes on GitHub (Aug 2, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/16223

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Pip Install

Open WebUI Version

v0.6.18 (latest)

Ollama Version (if applicable)

0.10.1

Operating System

Arch

Browser (if applicable)

Firefox and Floorp (Firefox-based). The issue does not happen in Chrome, or in Brave on mobile (also Chromium-based). All latest versions.

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

You write a message in the chat and expect to get a reply without errors.

Actual Behavior

When using Firefox or a Firefox-based browser with Open WebUI, submitting a message in the chat will, most of the time, return `TypeError: NetworkError when attempting to fetch resource.`, but moments later the text box starts streaming tokens from the server. When this error occurs, you can no longer stop the text generation early: the WebUI thinks the request aborted, so the stop button is gone.
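The symptom suggests the fetch error handler and the token stream are decoupled. The following is a minimal, hypothetical Python sketch (not Open WebUI's actual Svelte/TypeScript code; all class and method names are invented for illustration) of how such a state desync would make the stop button disappear while tokens keep arriving:

```python
# Hypothetical sketch of the state desync described above: the request path
# reports a network error and clears the "generating" flag, while a separate
# streaming path keeps delivering tokens.

class ChatState:
    def __init__(self):
        self.generating = False  # stop button is shown only while True
        self.tokens = []
        self.error = None

    def start_request(self):
        self.generating = True

    def on_fetch_error(self, err):
        # The UI assumes the generation aborted, so it hides the stop button...
        self.error = err
        self.generating = False

    def on_stream_token(self, token):
        # ...but the streaming channel is still alive and keeps appending.
        self.tokens.append(token)

state = ChatState()
state.start_request()
state.on_fetch_error("TypeError: NetworkError when attempting to fetch resource.")
for tok in ["Hello", ",", " world"]:
    state.on_stream_token(tok)

# Symptom reproduced: tokens arrive although the UI believes nothing is
# generating, so there is no stop button to press.
assert state.generating is False
assert state.tokens == ["Hello", ",", " world"]
```

This matches the report: the response keeps streaming to completion, but the control that would let the user abort it is already gone.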

Steps to Reproduce

  1. Install Open WebUI via pip in its own venv
  2. Install Ollama through the official method (the curl install script)
  3. Pull a model from Ollama (reasoning models? all tested models have been qwen3 or merges of qwen3 so far) and run `ollama serve`
  4. Run `open-webui serve` from the activated WebUI venv
  5. Go to the chat section, select a model, and submit a message
  6. Wait for a response; at the bottom of the response a big red `TypeError: NetworkError when attempting to fetch resource.` appears, before the model starts reasoning or responding
  7. Moments later the WebUI begins streaming text from the server
  8. Try to stop the generation early (you can't, because there is no stop button)

Logs & Screenshots

There was only one log in the browser console, and I'm not using Docker, so there are no container logs to post.
This log is already present before I even run the query:

[tiptap warn]: Duplicate extension names found: ['codeBlock', 'bulletList', 'listItem', 'listKeymap', 'orderedList']. This can lead to issues.

![Image](https://github.com/user-attachments/assets/e512b691-3827-4192-9926-407a80bac98c)

Additional Information

In the screenshot provided, the `thinking...` section is working: if you expand it, text is still being streamed into it, and the model will finish that thought and complete the request. But as you can see in the bottom right, there is a headphones icon where the stop icon normally appears during generation.

If I've missed anything submitting this issue, please forgive me, and let me know what else I should submit.

GiteaMirror added the bug label 2026-04-19 23:43:16 -05:00

@tjbck commented on GitHub (Aug 2, 2025):

Hmm, we're unable to reproduce, keep us updated!

Reference: github-starred/open-webui#17829