Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-11 00:04:08 -05:00)
Open-WebUI: ‘stream=true’ Mode Not Working with Ollama on RunPod #3049
Originally created by @mballesterosc on GitHub (Dec 19, 2024).
Installation Method
pip
Environment
Open WebUI Version: 0.4.8
Ollama Version: v0.5.4 (on RunPod)
Operating System: Windows Server 2016
Browser: Chrome
Bug Summary:
I have Open-WebUI installed on a local server, with Ollama running on a RunPod server. The connection and inference work perfectly, but I can't get streaming (stream=true) to function. I've tried connecting to Ollama both via HTTP and raw TCP (IP:PORT), and I've set the OLLAMA_HOST environment variable to 0.0.0.0 on RunPod. The request correctly includes the stream=true parameter.
If I call the Ollama API from Python, it works correctly.
If I use “Arena Model” with only my RunPod Ollama model (LLAMA 3.3) available in Open-WebUI, streaming works correctly.
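For reference, this is roughly the direct Python call that streams correctly against the same Ollama endpoint (a minimal sketch using the `requests` package; the host URL and model name are placeholders for my setup):

```python
import json

import requests  # third-party; assumed installed


def parse_ndjson_stream(lines):
    """Yield the `response` text from each NDJSON chunk Ollama emits."""
    for raw in lines:
        if not raw:  # skip keep-alive blank lines
            continue
        chunk = json.loads(raw)
        yield chunk.get("response", "")
        if chunk.get("done"):
            break


def stream_generate(host, model, prompt):
    """POST /api/generate with stream=true and print tokens as they arrive."""
    resp = requests.post(
        f"{host}/api/generate",
        json={"model": model, "prompt": prompt, "stream": True},
        stream=True,
        timeout=60,
    )
    resp.raise_for_status()
    for token in parse_ndjson_stream(resp.iter_lines()):
        print(token, end="", flush=True)


# Example (placeholder host/model for my RunPod instance):
# stream_generate("http://<runpod-ip>:11434", "llama3.3", "Hello")
```

Run this way, tokens print incrementally as Ollama emits each NDJSON chunk, which is why I believe the endpoint itself streams fine.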
I've checked all the related threads and asked RunPod support, but haven't gotten anywhere.
Could you help me or guide me in solving this issue? I’m not sure if it’s a bug in Open-WebUI, a RunPod issue, or something related to my local server.
Thank you in advance.