[GH-ISSUE #2841] Add/Remove Model Repos + Self Host Your Own Model Repo + Pull Models From Other Repos #27488

Closed
opened 2026-04-22 04:52:17 -05:00 by GiteaMirror · 4 comments

Originally created by @trymeouteh on GitHub (Feb 29, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2841

  1. The ability to manage the model repos in Ollama, similar to how in F-Droid (the Android app store) you can add and remove repos, which allows you to get apps from other sources.

  2. Self host your own repo. Allow anyone to self host their own repo.

  • Whether this means simply setting up a git repo (GitHub, GitLab, Gitea, Forgejo) or self-hosting a website and server for hosting models.
  • Models can be stored as direct downloads or torrents. Would like to see a torrent option to help reduce bandwidth on the repo provider and encourage users of the model to seed it (sharing is caring).

  3. The ability to pull models from other sources. If the same model is available on the Ollama repo and on another repo, have a way to download both or distinguish which model you want to download.

@pdevine commented on GitHub (Mar 1, 2024):

  1. You can do this by signing up for an account on ollama.com. Upload your Ollama public key and then you can push your own models.
  2. This is also doable today using the Docker registry; however, there are some breaking changes coming for this over the next few months. Also, the way we handle auth is different from the way Docker handles it (see the first comment about adding your public key).
  3. This already works. Also, because we deduplicate the data, you only store one copy on disk.

I'm going to go ahead and close the issue, but feel free to keep commenting.
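For reference, the first point can be sketched with the Ollama CLI. A minimal sketch, assuming an ollama.com account named `username` and a local model named `mymodel` (both placeholders):

```shell
# Print the Ollama public key to paste into your ollama.com account settings
cat ~/.ollama/id_ed25519.pub

# Copy the local model into your account's namespace, then push it
ollama cp mymodel username/mymodel
ollama push username/mymodel
```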


@trymeouteh commented on GitHub (Mar 1, 2024):

> 1. The ability to manage the model repos in Ollama. Similar to how in F-Droid (the Android app store) you can add and remove repos, which allows you to get apps from other sources.
>
>> 1. You can do this by signing up for an account on ollama.com. Upload your Ollama public key and then you can push your own models.

ollama.com is one repo, the official one, just as f-droid.org is the official F-Droid repo, or how Docker Hub is the official Docker repo while other container registries such as Quay exist. I know that in the Podman CLI you can set the default registry; in my case, I set it to docker.io, which is Docker Hub.
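The Podman behaviour mentioned above lives in `registries.conf` (system-wide at `/etc/containers/registries.conf`, or a per-user copy under `~/.config/containers/`); a minimal fragment:

```toml
# Resolve short names like "nginx" against Docker Hub first
unqualified-search-registries = ["docker.io"]
```

A similar per-user setting is roughly what this feature request asks Ollama to support.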

> 2. Self host your own repo. Allow anyone to self host their own repo.
>
>> 2. This is also doable today using the Docker registry; however, there are some breaking changes coming for this over the next few months. Also, the way we handle auth is different from the way Docker handles it (see the first comment about adding your public key).

Would like to see Ollama publish a Docker image and a bare-metal release for hosting your own LLM repository, whether that self-hosted repository is only for one person to host their own LLMs or also allows others to host their LLMs on it.
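Since pdevine mentioned the Docker registry route, a minimal sketch of what that could look like today, assuming a plain OCI registry accepts Ollama's manifests (auth differences noted above may get in the way; `mymodel` is a placeholder):

```shell
# Run a standard OCI registry container locally on port 5000
docker run -d -p 5000:5000 --name registry registry:2

# Re-tag a local model under the registry's host, then push over plain HTTP
ollama cp mymodel localhost:5000/mymodel
ollama push --insecure localhost:5000/mymodel
```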

Would also like to suggest implementing BitTorrent as a way to host LLMs, to reduce bandwidth for repositories.

> 3. The ability to pull models from other sources. If the same model is available on Ollama repo and on another repo, to have a way to download both or distinguish which model you want to download.
>
>> 3. This already works. Also, because we deduplicate the data, you only store one copy on disk.

Let's say you download an image from Docker Hub named nginx and you also download an image from quay.io also named nginx. Both images' version tags are the same; however, the images are different, since the quay.io image has an additional feature. If this were the case with two LLMs, one from ollama.com and the other from another repository, would you have two models on your system, or would one overwrite the other?
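In Docker-style naming the registry host becomes part of the stored reference, so the two pulls coexist rather than overwrite each other. A minimal Python sketch of that normalization (the default values mirror Docker's short-name behaviour; this is an illustration, not Ollama's actual resolver):

```python
def qualify(ref: str, default_registry: str = "docker.io",
            default_ns: str = "library") -> str:
    """Expand a short image/model reference into a fully qualified one.

    The first path component is treated as a registry host only if it
    contains a dot or a colon, or is "localhost" (Docker's heuristic).
    """
    parts = ref.split("/")
    head = parts[0]
    if len(parts) > 1 and ("." in head or ":" in head or head == "localhost"):
        return ref  # already registry-qualified
    if len(parts) == 1:
        return f"{default_registry}/{default_ns}/{ref}"
    return f"{default_registry}/{ref}"

# Two "nginx" pulls from different registries normalize to distinct keys,
# so neither overwrites the other in the local store:
print(qualify("nginx"))                 # docker.io/library/nginx
print(qualify("quay.io/somens/nginx"))  # quay.io/somens/nginx
```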


@snooze92 commented on GitHub (Apr 25, 2024):

I would really like a way to host an internal private repository for models, "on-prem", so that members of one private organisation can share models easily without leaking them out.


@xuxinping commented on GitHub (Jan 31, 2025):

> I would really like a way to host an internal private repository for models, "on-prem", so that members of one private organisation can share models easily without leaking them out.

The Folib artifact repository supports managing over 20 types of artifact repositories, including those for Maven, npm, Go, and Hugging Face. In the future, it will also support private hosting of Ollama models.


Reference: github-starred/ollama#27488