[GH-ISSUE #13105] Ollama app's chat interface does not list all the Ollama models I have downloaded! #8676

Open
opened 2026-04-12 21:26:40 -05:00 by GiteaMirror · 7 comments
Owner

Originally created by @yin3331 on GitHub (Nov 16, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13105

Originally assigned to: @hoyyeva on GitHub.

What is the issue?

The model selection in the Ollama app's chat interface does not list all the Ollama models I have downloaded! Even entering the model name in the "Find Model" field cannot search for the models that are not displayed.

![Image](https://github.com/user-attachments/assets/fb5af25e-b3fc-40fa-9981-17631889d42c)

Relevant log output


OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.12.11

GiteaMirror added the app, bug labels 2026-04-12 21:26:40 -05:00

@hoyyeva commented on GitHub (Nov 18, 2025):

Hi @yin3331

I suspect there might be a port mismatch. Could you run a couple of commands in PowerShell and share the output?

```powershell
# 1. Check how many models your backend returns directly
curl http://127.0.0.1:11434/api/tags | ConvertFrom-Json | Select-Object -ExpandProperty models | Measure-Object | Select-Object -ExpandProperty Count

# 2. Check your app logs for proxy configuration
Select-String -Path "$env:LOCALAPPDATA\Ollama\app.log" -Pattern "configuring ollama proxy|proxy error" | Select-Object -Last 20

# 3. Check if OLLAMA_HOST is set
$env:OLLAMA_HOST

# 4. Check which port Ollama is running on
netstat -ano | findstr :11434
```

@KeepGood2016 commented on GitHub (Nov 19, 2025):

Same thing happens with me. I found rolling back to 0.12.9 solved it. Anything higher than that version will only show the models that come in the default list.


@yin3331 commented on GitHub (Nov 19, 2025):

> Hi @yin3331
>
> I suspect there might be a port mismatch. Could you run a couple of commands in PowerShell and share the output?
>
> ```powershell
> # 1. Check how many models your backend returns directly
> curl http://127.0.0.1:11434/api/tags | ConvertFrom-Json | Select-Object -ExpandProperty models | Measure-Object | Select-Object -ExpandProperty Count
>
> # 2. Check your app logs for proxy configuration
> Select-String -Path "$env:LOCALAPPDATA\Ollama\app.log" -Pattern "configuring ollama proxy|proxy error" | Select-Object -Last 20
>
> # 3. Check if OLLAMA_HOST is set
> $env:OLLAMA_HOST
>
> # 4. Check which port Ollama is running on
> netstat -ano | findstr :11434
> ```

```shell
(base) PS C:\Windows\system32> curl http://127.0.0.1:11434/api/tags | ConvertFrom-Json | Select-Object -ExpandProperty models | Measure-Object | Select-Object -ExpandProperty Count
10
(base) PS C:\Windows\system32> Select-String -Path "$env:LOCALAPPDATA\Ollama\app.log" -Pattern "configuring ollama proxy|proxy error" | Select-Object -Last 20

C:\Users\heros\AppData\Local\Ollama\app.log:4:time=2025-11-19T14:49:10.196+08:00 level=INFO source=ui.go:138 msg="configuring ollama proxy" target=http://0.0.0.0
C:\Users\heros\AppData\Local\Ollama\app.log:8:time=2025-11-19T14:56:19.953+08:00 level=ERROR source=ui.go:150 msg="proxy error" error="dial tcp 0.0.0.0:80: connectex: No connection could be made because the target machine actively refused it." path=/api/tags target=http://0.0.0.0
C:\Users\heros\AppData\Local\Ollama\app.log:14:time=2025-11-19T14:56:20.053+08:00 level=ERROR source=ui.go:150 msg="proxy error" error="dial tcp 0.0.0.0:80: connectex: No connection could be made because the target machine actively refused it." path=/api/show target=http://0.0.0.0
C:\Users\heros\AppData\Local\Ollama\app.log:15:time=2025-11-19T14:56:20.092+08:00 level=ERROR source=ui.go:150 msg="proxy error" error="dial tcp 0.0.0.0:80: connectex: No connection could be made because the target machine actively refused it." path=/api/tags target=http://0.0.0.0
C:\Users\heros\AppData\Local\Ollama\app.log:16:time=2025-11-19T14:56:20.301+08:00 level=ERROR source=ui.go:150 msg="proxy error" error="dial tcp 0.0.0.0:80: connectex: No connection could be made because the target machine actively refused it." path=/api/tags target=http://0.0.0.0
C:\Users\heros\AppData\Local\Ollama\app.log:17:time=2025-11-19T14:56:20.705+08:00 level=ERROR source=ui.go:150 msg="proxy error" error="dial tcp 0.0.0.0:80: connectex: No connection could be made because the target machine actively refused it." path=/api/tags target=http://0.0.0.0
C:\Users\heros\AppData\Local\Ollama\app.log:18:time=2025-11-19T14:56:21.511+08:00 level=ERROR source=ui.go:150 msg="proxy error" error="dial tcp 0.0.0.0:80: connectex: No connection could be made because the target machine actively refused it." path=/api/tags target=http://0.0.0.0

(base) PS C:\Windows\system32> $env:OLLAMA_HOST
0.0.0.0
(base) PS C:\Windows\system32> netstat -ano | findstr :11434
TCP 0.0.0.0:11434 0.0.0.0:0 LISTENING 10964
TCP [::]:11434 [::]:0 LISTENING 10964
TCP [::1]:11434 [::1]:49750 ESTABLISHED 10964
TCP [::1]:11434 [::1]:51304 ESTABLISHED 10964
TCP [::1]:11434 [::1]:52521 ESTABLISHED 10964
TCP [::1]:11434 [::1]:54262 ESTABLISHED 10964
TCP [::1]:11434 [::1]:55300 ESTABLISHED 10964
TCP [::1]:11434 [::1]:55918 ESTABLISHED 10964
TCP [::1]:49750 [::1]:11434 ESTABLISHED 19536
TCP [::1]:51304 [::1]:11434 ESTABLISHED 19536
TCP [::1]:52521 [::1]:11434 ESTABLISHED 19536
TCP [::1]:54262 [::1]:11434 ESTABLISHED 19536
TCP [::1]:55300 [::1]:11434 ESTABLISHED 19536
TCP [::1]:55918 [::1]:11434 ESTABLISHED 19536
```

**I deleted the OLLAMA_HOST=0.0.0.0 environment variable, and the problem was solved! But in previous versions, having OLLAMA_HOST=0.0.0.0 was not an issue.**


@yin3331 commented on GitHub (Nov 19, 2025):

> Same thing happens with me. I found rolling back to 0.12.9 solved it. Anything higher than that version will only show the models that come in the default list.

I deleted the OLLAMA_HOST=0.0.0.0 environment variable, and the problem was solved! But in previous versions, having OLLAMA_HOST=0.0.0.0 was not an issue.


@hoyyeva commented on GitHub (Nov 19, 2025):

hi @yin3331 & @KeepGood2016, thank you for the quick replies! I think I now have a really good picture of why it is broken!! I will update you when the fix is up. Again, thank you so much!


@npelov commented on GitHub (Dec 11, 2025):

Maybe it would be a good idea for the GUI to look for the server on 127.0.0.1 when OLLAMA_HOST is set to 0.0.0.0.


@shuv-amp commented on GitHub (Feb 19, 2026):

Hi @hoyyeva - quick follow-up on this issue.

I reviewed #13159 (merged on December 15, 2025). Since #13105 is still open, and #13377 is still showing related app-side behavior when `OLLAMA_HOST` is set to wildcard bind addresses, I can take a narrow follow-up if useful.

I can send a small patch scoped to `app/ui/ui.go` (proxy target normalization for wildcard addresses like `0.0.0.0` / `::` while keeping normal `host:port` behavior intact) plus a regression test in `app/ui/ui_test.go`.

I do not see an open PR specifically targeting the remaining #13105/#13377 behavior. If you are already handling it, I can step back and pick another issue; otherwise I can open a PR and link both issues.
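*Editor's note:* the normalization described above could look roughly like the following sketch. This is an illustration only, under the assumption that the app derives its proxy target from `OLLAMA_HOST`; `normalizeProxyTarget` is a hypothetical helper, not the actual `app/ui/ui.go` code.

```go
package main

import (
	"fmt"
	"net"
	"strings"
)

// normalizeProxyTarget rewrites wildcard bind addresses (0.0.0.0, ::, or an
// empty host) to a loopback address the app can actually dial, and fills in
// the default Ollama port (11434) when none is given. Hypothetical helper
// for illustration only.
func normalizeProxyTarget(hostEnv string) string {
	host, port := hostEnv, "11434"
	if h, p, err := net.SplitHostPort(hostEnv); err == nil {
		host, port = h, p
	}
	// Strip brackets from a bare bracketed IPv6 value like "[::]".
	host = strings.Trim(host, "[]")
	switch host {
	case "", "0.0.0.0", "::":
		// A wildcard bind address is not a dialable destination;
		// the server listening on it is reachable via loopback.
		host = "127.0.0.1"
	}
	return net.JoinHostPort(host, port)
}

func main() {
	fmt.Println(normalizeProxyTarget("0.0.0.0"))       // wildcard v4, no port
	fmt.Println(normalizeProxyTarget("[::]:11434"))    // wildcard v6 with port
	fmt.Println(normalizeProxyTarget("10.0.0.5:8080")) // normal host:port kept
}
```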

Reference: github-starred/ollama#8676