the device is disconnected from the internet, the service starts up normally, but the interface cannot select a model and cannot engage in a dialogue #4419

Closed
opened 2025-11-11 15:53:41 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @hellolinch on GitHub (Mar 14, 2025).

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Pip Install

Open WebUI Version

v0.5.20

Ollama Version (if applicable)

v0.5.7

Operating System

Windows

Browser (if applicable)

Edge

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have listed steps to reproduce the bug in detail.

Expected Behavior

After installing through conda/pip install open-webui, the service automatically recognizes the Ollama models on startup and runs normally. After the device is disconnected from the internet, the service should still start normally and be able to load the Ollama models for conversation.

Actual Behavior

After installing through conda/pip install open-webui, the service automatically recognizes the Ollama models on startup and runs normally. However, when the device is disconnected from the internet, the service starts up, but the interface cannot select a model and cannot engage in a dialogue. The log shows:
2025-03-14 08:08:04.880 | INFO | open_webui.routers.openai:get_all_models:379 - get_all_models() - {}
2025-03-14 08:08:10.979 | INFO | open_webui.routers.ollama:get_all_models:300 - get_all_models() - {}
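Since Ollama runs locally, its /api/tags endpoint should still respond even with the internet disconnected. A minimal sketch (assuming Ollama's default address of 127.0.0.1:11434; adjust if yours differs) to check whether the empty get_all_models() result comes from Ollama itself being unreachable or from something else in Open WebUI:

```python
import json
import urllib.error
import urllib.request

# Default Ollama address; an assumption here - change if you configured another port.
OLLAMA_URL = "http://127.0.0.1:11434"

def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Query Ollama's /api/tags endpoint and return the installed model names.

    Returns an empty list if Ollama is unreachable, mirroring the empty
    get_all_models() result seen in the Open WebUI logs.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
    except (urllib.error.URLError, OSError):
        return []
    return [m["name"] for m in data.get("models", [])]

if __name__ == "__main__":
    models = list_local_models()
    print(models or "Ollama returned no models (or is unreachable)")
```

If this prints the expected model names while offline, Ollama itself is fine and the problem lies in how Open WebUI refreshes its model list on startup; if it returns nothing, the Ollama connection (or its configured base URL) is the issue.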

Steps to Reproduce

  1. Install and start the service, register and log in, select a model, and engage in conversation - everything works normally.
  2. Disconnect the device from the internet and restart the service.
  3. The interface keeps prompting to select a model, but no models are available.

Logs & Screenshots

2025-03-14 08:08:04.880 | INFO | open_webui.routers.openai:get_all_models:379 - get_all_models() - {}
2025-03-14 08:08:10.979 | INFO | open_webui.routers.ollama:get_all_models:300 - get_all_models() - {}

Additional Information

No response

GiteaMirror added the bug label 2025-11-11 15:53:41 -06:00

Reference: github-starred/open-webui#4419