[GH-ISSUE #13425] Ollama GUI doesn't show all the models that cli has #8865

Closed
opened 2026-04-12 21:40:09 -05:00 by GiteaMirror · 3 comments

Originally created by @npelov on GitHub (Dec 11, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13425

What is the issue?

The Ollama GUI only shows qwen, gemma, deepseek, and gpt-oss, while the CLI shows all downloaded models.

![Image](https://github.com/user-attachments/assets/7939f2d7-4c2c-43b0-b0da-db2af4dbd86c)

`.ollama` in the home dir is a junction to `E:\ollama-win`.

I double-checked that the models are in `E:\ollama-win\models\manifests\registry.ollama.ai\library`, and `ollama list` shows all the models:
[ollama-list.txt](https://github.com/user-attachments/files/24100642/ollama-list.txt)

and `ollama run` has no problem running any of the models.

I tried setting the model path in the GUI directly to `E:\ollama-win\models` - that didn't fix the problem.

More info in the Discord thread:
https://discord.com/channels/1128867683291627614/1211804431340019753/threads/1448614539665866755

Relevant log output


OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.13.2

GiteaMirror added the bug label 2026-04-12 21:40:09 -05:00

@jmkraus commented on GitHub (Dec 11, 2025):

Could this be a duplicate of https://github.com/ollama/ollama/issues/13105 ?


@npelov commented on GitHub (Dec 11, 2025):

Yep! Removing the var solved it. I had checked "expose to the network". Still, I think that when OLLAMA_HOST is 0.0.0.0 the GUI should look for the server on 127.0.0.1.


@mchiang0610 commented on GitHub (Dec 11, 2025):

Thank you for reporting. Closing this one to move the conversation over to https://github.com/ollama/ollama/issues/13105.


Reference: github-starred/ollama#8865