Ollama models stop showing after upgrade to v0.1.123 #839

Closed
opened 2025-11-11 14:32:12 -06:00 by GiteaMirror · 2 comments

Originally created by @llagerlof on GitHub (May 7, 2024).

Bug Report

Description

Bug Summary:
After upgrading to v0.1.123, the models aren't showing in the "Select a model" combo box anymore.

Steps to Reproduce:

  1. Start the container on Windows 11 using `cmd`, as a normal user (not admin).

  2. `docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=http://127.0.0.1:11434 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main`

  3. Access `http://localhost:3000`, log in, and click on the "Select a model" combo box field.

Expected Behavior:
When opening the "Select a model" field, the models available in Ollama should appear.

Actual Behavior:
No models are shown in the "Select a model" combo box.

Environment

  • Open WebUI Version: 0.1.123
  • Ollama (if applicable): 0.1.33
  • Operating System: Windows 11 / cmd
  • Browser (if applicable): Any

Reproduction Details

  • `ollama list` shows the list of models.
  • `http://127.0.0.1:11434` is accessible from a browser and returns `Ollama is running`.
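A point worth checking for this setup: inside a container, `127.0.0.1` refers to the container itself, not the Windows host, so `OLLAMA_BASE_URL=http://127.0.0.1:11434` may not reach the host's Ollama even though a host browser can. A minimal diagnostic sketch, assuming the container name `open-webui` from the run command above and that `curl` is available in the image:

```shell
# From inside the container, 127.0.0.1 is the container, not the host,
# so this is expected to fail in this setup:
docker exec open-webui curl -sf http://127.0.0.1:11434

# On Docker Desktop (Windows/macOS), the host is reachable as
# host.docker.internal; if Ollama is listening on the host, this
# should print "Ollama is running":
docker exec open-webui curl -sf http://host.docker.internal:11434
```

If the second check succeeds, re-creating the container with `-e OLLAMA_BASE_URL=http://host.docker.internal:11434` (adding `--add-host=host.docker.internal:host-gateway` on Linux hosts) is one thing to try; whether that explains the regression between versions is not established here.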

@rdlu commented on GitHub (May 7, 2024):

I'm having the same issue. There's one error in the docker logs:

`ERROR:apps.openai.main:Connection error: Cannot connect to host api.openai.com:443 ssl:default [Temporary failure in name resolution]`

I have the OpenAI integration enabled, and my DNS is working in the Docker host's terminal; `nslookup` and `dig` return the correct IP.

I'm using Docker on a Debian 12 host, not Windows.
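The "Temporary failure in name resolution" error suggests DNS is failing inside the container even though it works on the host; the two use separate resolver configurations. A quick hedged check, assuming the container name `open-webui` and that the image ships Python (the backend is Python-based):

```shell
# Resolve api.openai.com from inside the container, where the error occurs.
# Host-side nslookup/dig success does not guarantee this works.
docker exec open-webui python3 -c \
  "import socket; print(socket.gethostbyname('api.openai.com'))"
```

If resolution fails inside the container, pinning a resolver with `--dns 8.8.8.8` on `docker run`, or fixing the Docker daemon's DNS settings, are common workarounds; this is a general Docker diagnosis, not a confirmed cause of this issue.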


@llagerlof commented on GitHub (May 7, 2024):

Just to add info: I don't have the OpenAI integration enabled.


Reference: github-starred/open-webui#839