Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-11 08:22:09 -05:00)
[GH-ISSUE #5315] Can not run local Ollama model, Offline #13937
Originally created by @DirtyKnightForVi on GitHub (Sep 10, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/5315
Bug Report
Installation Method
Docker
Environment
Open WebUI Version: v0.3.21
Ollama (if applicable): v0.3.9
Operating System: Linux
Browser (if applicable): [e.g., Chrome 100.0, Firefox 98.0]
Expected Behavior:
Once you select a model, you should get a response.
Actual Behavior:
It looks like it is stuck waiting for the model to load, which takes far too long.
What's confusing is that requests from other products get responses normally, but this version of Open WebUI just isn't working. Neither Ollama nor Docker is throwing any errors.
`ollama run` starts the model fine locally.
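Since `ollama run` works on the host but Open WebUI gets no response, a common first diagnostic (not from the original report) is to confirm that the Ollama HTTP API is reachable from where Open WebUI runs. Note that inside a Docker container, `localhost` refers to the container itself, not the host, so the Ollama URL configured in Open WebUI typically needs to point at `host.docker.internal:11434` or the host's LAN IP. A minimal sketch of such a reachability probe, assuming Ollama's default port 11434 and its standard `/api/tags` endpoint:

```python
import urllib.request
import urllib.error

def ollama_reachable(base_url: str, timeout: float = 3.0) -> bool:
    """Return True if an Ollama server answers at base_url.

    Probes the /api/tags endpoint (lists installed models; requires
    no request body), e.g. base_url = "http://localhost:11434".
    """
    try:
        with urllib.request.urlopen(
            base_url.rstrip("/") + "/api/tags", timeout=timeout
        ) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, or timeout: server not reachable.
        return False

if __name__ == "__main__":
    # From the host: should print True when Ollama is running locally.
    print(ollama_reachable("http://localhost:11434"))
```

Running this from inside the Open WebUI container against `localhost` versus `host.docker.internal` would distinguish a networking misconfiguration from a genuine model-loading hang.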
@DirtyKnightForVi commented on GitHub (Sep 10, 2024):
Found something interesting, though it may still be the same old issue:
On the local offline machine: