[GH-ISSUE #15263] Not detecting models from given Models directory in settings #35522

Open
opened 2026-04-22 20:05:30 -05:00 by GiteaMirror · 4 comments

Originally created by @Revive-Curiosity on GitHub (Apr 3, 2026).
Original GitHub issue: https://github.com/ollama/ollama/issues/15263

What is the issue?

Ollama is not detecting my local models.

Here is the models directory set in Ollama (v0.20):

![Models directory setting](https://github.com/user-attachments/assets/dfb841eb-b8bf-422f-aba0-f07b222145f6)

See my models in LM Studio; there are many:

![LM Studio model list](https://github.com/user-attachments/assets/cc072039-02c6-4d0f-8243-3e2ee4403c3a)

But Ollama does not detect any local models:

![Empty local model list in Ollama](https://github.com/user-attachments/assets/bd34daaa-ffd3-4412-8b77-f6608df13be5)

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the bug label 2026-04-22 20:05:30 -05:00

@rick-github commented on GitHub (Apr 3, 2026):

#6589


@Revive-Curiosity commented on GitHub (Apr 3, 2026):

I read that; I knew about something like this a year ago. Why, even now, does Ollama not have an option to detect LM Studio models from a given directory? Other open-source inference platforms on Apple silicon, like vmlx and omlx, do this. It would be better to have interoperable local AI platforms. Please ship this feature in Ollama's next release. Many LM Studio users on Apple silicon run MLX models, and since Ollama now supports MLX, it should also detect LM Studio's downloaded MLX models, as other MLX inference platforms do. Ollama is popular, but not as popular as LM Studio among Apple silicon users; reducing this friction would help win over LM Studio users and their already-downloaded models.


@rick-github commented on GitHub (Apr 3, 2026):

https://github.com/ollama/ollama/issues/8466#issuecomment-2599537619
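
Until directory auto-detection exists, a minimal workaround sketch: Ollama's Modelfile supports `FROM` with a local GGUF path, so individual LM Studio downloads can be imported one at a time. The directory paths below are assumptions (LM Studio's models root varies by version and OS), and the model name is a placeholder; MLX-format downloads are not covered by this route:

```shell
# Assumed LM Studio models root; adjust for your install
# (older versions used ~/.cache/lm-studio/models).
LMS_DIR="${HOME}/.lmstudio/models"

# List every GGUF file LM Studio has downloaded.
find "$LMS_DIR" -name '*.gguf' 2>/dev/null

# Import a specific GGUF into Ollama: FROM with a local file path
# is a documented Modelfile form. Paths/names here are placeholders.
# printf 'FROM %s\n' "$LMS_DIR/publisher/repo/model.gguf" > Modelfile
# ollama create my-imported-model -f Modelfile
# ollama run my-imported-model
```

This avoids duplicating multi-gigabyte downloads only in the sense that `FROM` can reference the file in place; it does not make Ollama track the LM Studio directory.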


@PureBlissAK commented on GitHub (Apr 18, 2026):

🤖 Automated Triage & Analysis Report

Issue: #15263
Analyzed: 2026-04-18T18:22:47.461025

Analysis

  • Type: unknown
  • Severity: medium
  • Components: unknown

Implementation Plan

  • Effort: medium
  • Steps:

This issue has been triaged and marked for implementation.


Reference: github-starred/ollama#35522