[GH-ISSUE #5272] Pull a model from Ollama.com // multiple ollama servers #52587

Closed
opened 2026-05-05 13:41:33 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @UberMetroid on GitHub (Sep 8, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/5272

Is your feature request related to a problem? Please describe.
I'm always frustrated when I download a new LLM from ollama.com because I have multiple ollama servers and have to download the model for each server.

Describe the solution you'd like
I would like an option to sync LLMs across Ollama servers so I don't have to manually check each Ollama server to make sure the LLM is there.

Describe alternatives you've considered
Currently you have to download the LLM on one server, then switch to the next and download it again.
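As a stopgap outside Open WebUI, the repeated per-server downloads described above can be scripted against Ollama's `/api/pull` endpoint, sending the same pull request to every server in turn. This is only a minimal sketch: the host list and model name below are hypothetical placeholders, not values from this issue.

```python
# Hedged sketch: mirror one model pull across several Ollama servers by
# POSTing to each server's /api/pull endpoint (stdlib only).
# The hosts and model name are illustrative assumptions.
import json
import urllib.request


def build_pull_request(host: str, model: str) -> urllib.request.Request:
    """Build a POST {host}/api/pull request for one Ollama server."""
    payload = json.dumps({"name": model}).encode("utf-8")
    return urllib.request.Request(
        url=f"{host}/api/pull",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def pull_everywhere(hosts: list[str], model: str) -> None:
    """Send the same pull to every server; Ollama streams JSON status lines."""
    for host in hosts:
        req = build_pull_request(host, model)
        with urllib.request.urlopen(req) as resp:
            for line in resp:  # one JSON progress object per line
                status = json.loads(line).get("status", "")
                print(f"{host}: {status}")


if __name__ == "__main__":
    # Hypothetical server list and model; adjust to your own deployment.
    pull_everywhere(
        ["http://server1:11434", "http://server2:11434"],
        "llama3.1:8b",
    )
```

Each server still downloads the model itself, so this saves clicking through the UI per server, not bandwidth.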


Reference: github-starred/open-webui#52587