[GH-ISSUE #5315] Can not run local Ollama model, Offline #13937

Closed
opened 2026-04-19 20:28:11 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @DirtyKnightForVi on GitHub (Sep 10, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/5315

Bug Report

Installation Method

Docker

Environment

  • Open WebUI Version: v0.3.21

  • Ollama (if applicable): v0.3.9

  • Operating System: Linux

  • Browser (if applicable): [e.g., Chrome 100.0, Firefox 98.0]

Expected Behavior:

Once you select a model, you should get a response.

Actual Behavior:

  1. Open WebUI appears to be stuck waiting for the model to load; the request takes far too long and never completes.

  2. Confusingly, requests from other products are answered normally, while this version of Open WebUI gets no response. Neither Ollama nor Docker logs any errors.

  3. `ollama run` starts the model locally without issue.
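Since `ollama run` works locally while the Dockerized Open WebUI hangs, a first step is to confirm the container can reach the Ollama API at all. A minimal diagnostic sketch, assuming the container is named `open-webui`, Ollama listens on the host's default port 11434, and `curl` is available inside the container:

```shell
# Assumptions: container named "open-webui"; Ollama on the host's
# default port 11434; curl present in the container image.
OLLAMA_URL="http://host.docker.internal:11434"
echo "Checking $OLLAMA_URL/api/version"
# From the host: confirm Ollama itself is up.
curl -s --max-time 5 "http://localhost:11434/api/version" || true
# From inside the container: confirm it can reach Ollama.
docker exec open-webui curl -s --max-time 5 "$OLLAMA_URL/api/version" || true
```

If the in-container check fails while the host check succeeds, the problem is the container's `OLLAMA_BASE_URL` / network configuration rather than model loading.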

Author
Owner

@DirtyKnightForVi commented on GitHub (Sep 10, 2024):

Found something interesting, though it may still be the same old issue.
On a local offline machine:

  1. Started ollama, open webui (Docker), and maxkb.
  2. Tried to run an Ollama model through Open WebUI: it did not work.
  3. Tried to run an Ollama model through maxkb: Ollama ran model A normally.
  4. Tried to run the already-loaded model A through Open WebUI: it worked.
  5. After that, Open WebUI's other models also loaded normally.
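The observation above (Open WebUI works once another client has loaded the model) suggests the first request times out while Ollama cold-loads the model. A hedged workaround sketch: preload the model directly via Ollama's API before using Open WebUI. The model name and the default endpoint are placeholders; substitute the model that hangs for you.

```shell
# Preload sketch (assumptions: Ollama at the default localhost:11434;
# "llama3" is a placeholder for the model that hangs in Open WebUI).
MODEL="llama3"
# A /api/generate call with no prompt just loads the model into memory;
# keep_alive controls how long it stays resident afterwards.
PAYLOAD=$(printf '{"model": "%s", "keep_alive": "30m"}' "$MODEL")
echo "$PAYLOAD"
curl -s --max-time 120 http://localhost:11434/api/generate -d "$PAYLOAD" || true
```

Once the model is resident, Open WebUI's request should hit a warm model and respond immediately, matching the behavior seen after maxkb loaded model A.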
Reference: github-starred/open-webui#13937