[GH-ISSUE #1451] [FEAT] One directory to model them all #777

Closed
opened 2026-04-12 10:27:30 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @kfsone on GitHub (Dec 10, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1451

Please consider adding a way to allow Ollama to share models with other resources/tools, either via a "models dir" config setting/option somewhere, or a modelmap.yaml file:

```yaml
- mistral-7b-instruct:
  - presents-as: Mistral-7B-Instruct-v0.1
  - folder: /opt/ai/models/TheBloke/Mistral-7B-Instruct-v01-GGUF  # optional
  - files:
    - tag: Q5_K_M
      file: mistral-7b-instruct-v0.1.Q5_K_M.gguf

- claude2:
  - file: /opt/ai/models/TheBloke/claude2-alpaca-13B-GGUF/claude2-alpaca-13b.Q5_K_M.gguf
```
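For comparison, Ollama can already import an external GGUF through a Modelfile whose `FROM` points at a local path, but as far as I can tell that copies the weights into Ollama's blob store rather than sharing them in place. A sketch, reusing the path above:

```shell
# Sketch: import an externally managed GGUF into Ollama via a Modelfile.
# Note: this copies the weights into ~/.ollama/models/blobs; it does not share them.
cat > Modelfile <<'EOF'
FROM /opt/ai/models/TheBloke/Mistral-7B-Instruct-v01-GGUF/mistral-7b-instruct-v0.1.Q5_K_M.gguf
EOF
ollama create mistral-7b-instruct:Q5_K_M -f Modelfile
```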

@technovangelist commented on GitHub (Dec 10, 2023):

There is an `OLLAMA_MODELS` environment variable that you can set to configure the models directory. You can learn more about it here: https://github.com/jmorganca/ollama/blob/main/docs/faq.md#how-can-i-change-where-ollama-stores-models

This mostly talks about doing it on Linux, which is what I think you are using. If you're on a Mac, you would need to stop the menubar app and run `OLLAMA_MODELS=my/model/dir ollama serve` in a separate terminal. This isn't ideal, and we are looking at an alternative approach on Mac.
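On Linux installs managed by systemd, the FAQ's approach boils down to a drop-in override along these lines (the models path here is illustrative):

```shell
# Sketch: make the systemd-managed Ollama server use a shared models directory
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/models-dir.conf <<'EOF'
[Service]
Environment="OLLAMA_MODELS=/opt/ai/models/ollama"
EOF
sudo systemctl daemon-reload
sudo systemctl restart ollama
```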

But I think you just want to make it easier to find the appropriate model to use with another app. So I created this script a few months back, which may solve your problem. It creates symbolic links in a folder, with readable names representing each model.

```bash
#!/bin/bash

# Base directories
base_dir=~/.ollama/models
manifest_dir=$base_dir/manifests/registry.ollama.ai
blob_dir=$base_dir/blobs
publicmodels_dir=~/publicmodels/mattw/lmstudio

# Create publicmodels directory if it doesn't exist
mkdir -p "$publicmodels_dir"

# Remove all existing symbolic links from publicmodels directory
find "$publicmodels_dir" -type l -exec rm {} +

# Walk the manifest files, which live three levels deep: user/model/tag
find "$manifest_dir" -mindepth 3 -maxdepth 3 -type f | while IFS= read -r file; do
    user=$(basename "$(dirname "$(dirname "$file")")" | sed 's/^registry\.ollama\.ai/ollama/')
    model=$(basename "$(dirname "$file")")
    tag=$(basename "$file")

    # Pull the model layer's blob digest out of the manifest JSON
    digest=$(jq -r '.layers[] | select(.mediaType == "application/vnd.ollama.image.model") | .digest' "$file")

    # Create a symbolic link with a readable name pointing at the blob
    ln -s "$blob_dir/$digest" "$publicmodels_dir/$user-$model-$tag.bin"

    # Print the user, model, and tag
    echo "$user - $model:$tag"
done
```

Let me know if that works for you.


@kfsone commented on GitHub (Dec 12, 2023):

I'm actually working with all 4 (lin/mac/win/wsl), so I might take a look at that. What is `jq`? I'm ashamed to admit that after 35 years working in *sh I've become a (cross-platform) PowerShell addict.


@easp commented on GitHub (Dec 13, 2023):

`jq` is a command-line JSON processor: https://github.com/jqlang/jq
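The script above uses it to pull each manifest's model-layer digest; the same filter can be tried standalone (the JSON here is a minimal, made-up manifest):

```shell
# Feed jq a minimal, made-up manifest and extract the model layer's digest
echo '{"layers":[{"mediaType":"application/vnd.ollama.image.model","digest":"sha256-abc123"}]}' |
  jq -r '.layers[] | select(.mediaType == "application/vnd.ollama.image.model") | .digest'
# prints: sha256-abc123
```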


@pdevine commented on GitHub (Dec 19, 2023):

Going to close this.

Reference: github-starred/ollama#777