[GH-ISSUE #6589] Can this be used with "LM Studio" to share models? If so, how can it be modified? #4147

Closed
opened 2026-04-12 15:03:31 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @Willy-Shenn on GitHub (Sep 2, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6589

I am currently using two UI systems, but they cannot share models (possibly due to differences in how the models are identified and created). Even after modifying the environment variables, both UIs cannot use models from the same path. Is there anyone who can guide me on how to modify the two UIs so they can use models from the same path? I would be very grateful.


@rick-github commented on GitHub (Sep 2, 2024):

LM Studio works with GGUF files, so this would seem theoretically possible. You would just need to identify the GGUF files and then provide a reference to LM Studio to allow it to find and load the model.

If you are on Linux (and maybe macOS), the following script will find all GGUF files of models downloaded from the main ollama library (not community-supplied models, though that can be done by adjusting paths) and print the path to each GGUF file and its name.

```sh
OLLAMA_MODELS=/path/to/ollama/models
find "$OLLAMA_MODELS/manifests/registry.ollama.ai/library" -type f | while read -r i ; do
  jq -r \
    --arg models "${OLLAMA_MODELS%/}" \
    --arg name "${i#*/library/}" \
    '.layers[]|select(.mediaType=="application/vnd.ollama.image.model")|"\(.digest)"|gsub(":";"-")|"\($name|gsub("/";":")) \($models)/blobs/\(.)"' \
    "$i"
done
```

On a unix-y system you can then link the model by name to the corresponding GGUF file (assuming the script above is saved as `find-gguf` and made executable):

```sh
LMSTUDIO_MODELS=/some/path/here
./find-gguf | while read -r name path ; do
  ln -s "$path" "$LMSTUDIO_MODELS/$name"
done
```
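As a sanity check (a sketch not in the original answer, using the same assumed `LMSTUDIO_MODELS` path): GGUF files begin with the 4-byte ASCII magic `GGUF`, so you can verify that each symlink resolves to a real model blob before pointing LM Studio at it:

```sh
LMSTUDIO_MODELS=/some/path/here
for f in "$LMSTUDIO_MODELS"/*; do
  # Read the first 4 bytes; every valid GGUF file starts with "GGUF".
  magic=$(head -c 4 "$f")
  if [ "$magic" = "GGUF" ]; then
    echo "ok:  $f"
  else
    echo "bad: $f"
  fi
done
```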

I don't use LM Studio, so I don't know if it has a central repo; YMMV.


@pdevine commented on GitHub (Sep 2, 2024):

@Willy-Shenn as Rick mentioned, it should in theory be possible, but it's definitely not "supported". Is there a feature in LM Studio which is missing in Ollama?

I'm going to close the issue, but feel free to comment.


Reference: github-starred/ollama#4147