Ability to download new models is gone #797

Closed
opened 2025-11-11 14:31:27 -06:00 by GiteaMirror · 4 comments
Owner

Originally created by @zono50 on GitHub (May 3, 2024).

On the latest version of open-webui, when I go to Settings > Models, this is the only option I have:

![image](https://github.com/open-webui/open-webui/assets/8764705/f0d97d5e-fc87-4dfb-b54c-23b87af657d0)

I remember that on a previous installation I had the ability to download models from the web UI. I'm not sure why that option is not showing here.

Installed this using this Docker command:

```bash
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
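Since this setup reaches Ollama through `host.docker.internal`, a quick sanity check (assuming `curl` is available; it may not be present inside the open-webui image) is to hit Ollama's version endpoint from both sides:

```shell
# From the host: Ollama should answer on localhost
curl -s http://127.0.0.1:11434/api/version

# From inside the container: this only succeeds if Ollama is reachable
# over the Docker bridge (i.e. not bound to 127.0.0.1 only)
docker exec open-webui curl -s http://host.docker.internal:11434/api/version
```

If the first command answers but the second does not, the problem is Ollama's bind address rather than Open WebUI itself.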

OS is Arch Linux

Any help would be greatly appreciated

Author
Owner

@justinh-rahb commented on GitHub (May 3, 2024):

What does this show?

```bash
ss -lntp | grep 11434
```
Author
Owner

@zono50 commented on GitHub (May 3, 2024):

```
LISTEN 0 4096 127.0.0.1:11434 0.0.0.0:*
```

Author
Owner

@justinh-rahb commented on GitHub (May 3, 2024):

> LISTEN 0 4096 127.0.0.1:11434 0.0.0.0:*

Yeah, the issue is that Ollama is listening only on `localhost`, not on all interfaces.

https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-linux

Set the `OLLAMA_HOST=0.0.0.0` environment variable.
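On Arch (or any systemd distro), assuming Ollama runs as the `ollama.service` systemd unit, one way to apply this is a drop-in override, as described in the FAQ linked above:

```shell
# Open (or create) a drop-in override for the Ollama service
sudo systemctl edit ollama.service
# In the editor, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"

# Reload unit files and restart Ollama so the change takes effect
sudo systemctl daemon-reload
sudo systemctl restart ollama

# Verify it now listens on all interfaces (0.0.0.0:11434, not 127.0.0.1)
ss -lntp | grep 11434
```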

Author
Owner

@zono50 commented on GitHub (May 3, 2024):

That was it! Thanks for the assistance.


Reference: github-starred/open-webui#797