[GH-ISSUE #19188] issue: Model drop-down fails to show models from remote hosts (ollama, llama.cpp) #34330

Closed
opened 2026-04-25 08:16:16 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @d-shehu on GitHub (Nov 14, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/19188

Check Existing Issues

  • I have searched for any existing and/or related issues.
  • I have searched for any existing and/or related discussions.
  • I have also searched in the CLOSED issues AND CLOSED discussions and found no related items (your issue might already be addressed on the development branch!).
  • I am using the latest version of Open WebUI.

Installation Method

Docker

Open WebUI Version

v0.6.36

Ollama Version (if applicable)

0.12.10

Operating System

Ubuntu 24.04

Browser (if applicable)

Chrome v142.0.744.162

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

After reconnecting to a remote Ollama host on another machine on the same network, Open WebUI should load the models automatically, and they should appear in the drop-down when creating a new chat.

Actual Behavior

The drop-down does not populate unless I repeatedly verify the connection to the Ollama host and reload the app. The behavior is inconsistent: sometimes models load automatically, and other times I need to retry by manually verifying the connection, reloading, starting a new thread, and selecting a model from the drop-down.

In all cases the connection is good (verified with a green checkmark) and the models appear in the list in the admin panel. Only the drop-down in the chat fails to load properly.

Steps to Reproduce

  1. Open WebUI is running in Docker 24/7 on machine A, on the same subnet as the Ollama server.
  2. Ollama is running in Docker on a server that is taken down periodically for maintenance, etc.
  3. After the server is restarted, reload Open WebUI on a client such as a Mac laptop on the same network.
  4. Observe that models fail to load in the chat drop-down.
  5. Go to the admin panel and confirm the connection to the Ollama server.
  6. Confirm the models are loaded in the admin panel.
  7. Reload the UI.
  8. Repeat steps 5-7 two or three times; the models start appearing in the drop-down.
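As a quick check from the machine running Open WebUI that the Ollama server is actually back up after maintenance, one can poll Ollama's `/api/tags` endpoint, which returns the installed models. Below is a minimal diagnostic sketch; the host, port, and IP are placeholders for your own setup, not values from this report:

```python
import json
import urllib.request


def model_names(payload: dict) -> list:
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in payload.get("models", [])]


def list_ollama_models(base_url="http://192.168.1.50:11434", timeout=5):
    """Fetch the model list from a (possibly remote) Ollama server.

    Raises URLError/OSError while the server is still down -- the same
    window in which the chat drop-down stays empty.
    """
    with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
        return model_names(json.load(resp))


# Example: print(list_ollama_models("http://<ollama-host>:11434"))
```

If this call fails from inside the Open WebUI container but succeeds from the Docker host, the problem is container networking rather than the UI.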

Logs & Screenshots

N/A

Additional Information

No response

GiteaMirror added the bug label 2026-04-25 08:16:16 -05:00

@tjbck commented on GitHub (Nov 16, 2025):

Unable to reproduce, the inference providers must be reachable in order for the models to be listed.
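For what it's worth, the "works after 2-3 reloads" pattern reported above is what a one-shot model fetch would produce if it races the provider coming back up. The following is a hedged sketch of a retry-with-backoff wrapper illustrating that idea; the function and its parameters are hypothetical, not actual Open WebUI code:

```python
import time


def fetch_models_with_retry(fetch, retries=3, base_delay=1.0, sleep=time.sleep):
    """Call `fetch` (a zero-argument callable returning a model list),
    retrying with exponential backoff when it raises a connection error.
    """
    for attempt in range(retries):
        try:
            return fetch()
        except OSError:
            if attempt == retries - 1:
                raise  # provider still unreachable after all retries
            sleep(base_delay * (2 ** attempt))  # wait 1s, then 2s, then 4s, ...
```

With such a wrapper, a provider that becomes reachable a few seconds after restart would be picked up without manual re-verification.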


Reference: github-starred/open-webui#34330