[GH-ISSUE #11632] Desktop App Does Not Work Offline Despite Local Models #69745

Closed
opened 2026-05-04 19:03:01 -05:00 by GiteaMirror · 23 comments

Originally created by @contactonder on GitHub (Aug 1, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11632

Originally assigned to: @BruceMacD on GitHub.

What is the issue?

The Ollama desktop application (macOS) fails to respond to queries when offline, despite having models downloaded locally. The CLI interface works perfectly offline, but the GUI requires an internet connection to function.
Environment

OS: macOS (version not specified - please add your version)
Ollama Version: (please run ollama --version and add here)
App Type: Official Ollama desktop application
Models: qwen3:8b (5.2 GB, fully downloaded locally)

Expected Behavior
The desktop application should work completely offline once models are downloaded locally, similar to how the CLI interface functions.
Actual Behavior

With WiFi enabled: App works normally and responds to queries
With WiFi disabled: App appears to accept input but does not generate any responses
After re-enabling WiFi: Responses appear, suggesting the queries were queued

Steps to Reproduce

1. Install the Ollama desktop app on macOS
2. Download a model (e.g., ollama pull qwen3:8b)
3. Verify the model is downloaded: ollama list shows the model locally
4. Open the Ollama desktop app
5. Disable WiFi/internet connection
6. Ask any question in the app
7. Observe that no response is generated
8. Re-enable WiFi
9. The response appears

CLI Comparison (Works Correctly)
The CLI interface works perfectly offline:
```bash
# With WiFi disabled
ollama run qwen3:8b "what does fallacy mean?"
# Returns proper response immediately
```

App Settings Checked

Ollama account: Not connected
"Expose Ollama to the network": Tested both enabled/disabled
Models stored locally at: /Users/[username]/ollama/models

Additional Context
This behavior suggests the desktop app has an unnecessary network dependency that prevents it from functioning offline, while the underlying Ollama service (accessible via CLI) works correctly without internet access.
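
For anyone wanting to confirm where the dependency lies, the same query can be sent straight to the local HTTP API that both the CLI and the desktop app sit on top of. Below is a minimal sketch (TypeScript, Node 18+ for built-in fetch; it assumes the default server address http://localhost:11434 and the qwen3:8b model from this report). It should succeed with WiFi off, since the request never leaves the machine:

```typescript
// Query the local Ollama server directly over its HTTP API.
// Assumes the default address (http://localhost:11434); adjust if
// OLLAMA_HOST has been changed. Requires Node 18+ for built-in fetch.
async function askLocally(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "qwen3:8b", prompt, stream: false }),
  });
  if (!res.ok) throw new Error(`Ollama server error: ${res.status}`);
  const data = (await res.json()) as { response: string };
  return data.response;
}

askLocally("what does fallacy mean?").then(console.log).catch(console.error);
```

If this succeeds offline while the GUI hangs, the network dependency is in the app layer, not the server.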

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the app and bug labels 2026-05-04 19:03:02 -05:00

@technovangelist commented on GitHub (Aug 1, 2025):

I wasn't able to reproduce, but there were a couple of folks commenting on my video who said they experienced this.

@BPE3 commented on GitHub (Aug 4, 2025):

I have the same problem; please solve it.

@andreacrema commented on GitHub (Aug 5, 2025):

Same issue on Windows. This is a big no.

@Zer0cool360 commented on GitHub (Aug 5, 2025):

I'm trying to use the new OpenAI oss models and also cannot use them while my MacBook's WiFi is off, even with airplane mode on inside Ollama.

@emaayan commented on GitHub (Aug 6, 2025):

Using Windows 11 with qwen3:4b; watched cports to verify no outside connections are made, and it still works.

(Screenshot: https://github.com/user-attachments/assets/ac6fca14-947b-42da-ae59-61877ff2295b)

@androidguy17 commented on GitHub (Aug 7, 2025):

Same bug. It seems the app needs a network connection for processing or logging the input; however, once it starts generating a response, it works regardless of whether the network is on or off.

@pdurlej commented on GitHub (Aug 11, 2025):

Same issue for me, tried multiple times on the latest macOS.

@pstashp commented on GitHub (Aug 12, 2025):

Rebooted the machine, with wifi still off:

pull model manifest: Get "https://registry.ollama.ai/v2/library/gpt-oss/manifests/20b": dial tcp: lookup registry.ollama.ai: no such host


@neilk17 commented on GitHub (Aug 13, 2025):

@pstashp did this work for you? Why does this problem happen?

@marksoceanofcode commented on GitHub (Aug 14, 2025):

I am having the same issue with the Ollama desktop app v0.11.4 on macOS.

@AdminPablo commented on GitHub (Aug 14, 2025):

Same issue on v0.11.4 macOS, but ollama run model_name works fine.

@dbann commented on GitHub (Aug 15, 2025):

Same issue on v0.11.4 macOS, but ollama run model_name works fine ...

@kaarele commented on GitHub (Aug 15, 2025):

Same on 0.11.4 macOS.

@BruceMacD commented on GitHub (Aug 15, 2025):

Hey everyone, thanks for all the reports. I think there were a few things causing this. The main one is a React Query state going into "offline" mode in some scenarios, which prevented state changes from propagating. I've also added better error display and feedback to get to the bottom of other cases. These changes will be in the new release that will be out shortly.

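For context on the explanation above: TanStack Query's default networkMode is "online", which pauses queries and mutations whenever the browser reports itself offline, and that matches the "queued until WiFi returns" symptom exactly. Below is a minimal sketch of the kind of configuration that sidesteps this for a localhost-only backend; it assumes the app uses TanStack Query, as the comment suggests, and is not the actual fix code:

```typescript
import { QueryClient } from "@tanstack/react-query";

// With the default networkMode ("online"), fetches are paused while the
// browser reports offline, even though the Ollama server lives on
// localhost and is reachable without any network. "always" tells
// TanStack Query to ignore the reported online/offline state entirely.
const queryClient = new QueryClient({
  defaultOptions: {
    queries: { networkMode: "always" },
    mutations: { networkMode: "always" },
  },
});
```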

@PalakSDarji commented on GitHub (Aug 16, 2025):

I am facing the same issue. The app does not work without WiFi even though the model is downloaded.

@BruceMacD commented on GitHub (Aug 20, 2025):

My fix is now available in the new release. It should fix most cases or provide better errors now. Leaving this open to collect any new reports.

@dbann commented on GitHub (Aug 27, 2025):

> My fix is now available in the new release. It should fix most cases or provide better errors now. Leaving this open to collect any new reports.

Thanks. A chat query with the already loaded model now works without WiFi; however, I cannot change the model unless WiFi is on (even for already downloaded models) in the GUI Ollama app (macOS).

@BruceMacD commented on GitHub (Aug 27, 2025):

Thanks for the report @dbann, I'm trying to reproduce this scenario now. If you don't mind, a couple more details might help me.

  1. Which Ollama version? You can see this in About Ollama.
     (Screenshot: https://github.com/user-attachments/assets/82e06a34-07c0-4450-8ee8-64e33c3f50de)
  2. What macOS version are you running?
  3. When you disable your internet connection, are you turning off WiFi, using airplane mode, or something else?

@kaarele commented on GitHub (Aug 28, 2025):

I can confirm the issue that @dbann is having.
Ollama Version 0.11.6 (0.11.6)
macOS 15.6.1 (24G90)

When simply turning off WiFi while using the app, switching the model does not visually indicate that the model has changed, and it keeps using the previously selected model. When switching WiFi back on, the model instantly changes without clicking anywhere. This is probably related to the previous issue: the action seems to wait on some web request in the background before the model change event happens. It doesn't seem to matter which model the switch is attempted from or to; both models are downloaded for offline use.

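The "changes by itself when WiFi returns" detail is consistent with a paused request resuming the moment the library's online manager flips back to online. Here is a small sketch of that mechanism in isolation, again assuming TanStack Query; the workaround shown is illustrative, not the shipped fix:

```typescript
import { onlineManager } from "@tanstack/react-query";

// Under the default networkMode, a request fired while the online
// manager reports offline is paused, then resumes automatically as
// soon as the manager reports online again, which is why the model
// appears to switch on its own when WiFi comes back.
onlineManager.subscribe(() => {
  console.log(`online state changed: ${onlineManager.isOnline()}`);
});

// One blunt workaround for a purely local backend: force the manager
// to always report online (illustrative only).
onlineManager.setOnline(true);
```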

@dbann commented on GitHub (Sep 1, 2025):

> Thanks for the report @dbann, I'm trying to reproduce this scenario now. If you don't mind, a couple more details might help me.
> 1. Which Ollama version? You can see this in About Ollama.
> 2. What macOS version are you running?
> 3. When you disable your internet connection are you turning off WiFi, airplane mode, or something else?

  1. 0.11.8
  2. 15.6.1 (24G90)
  3. Turning off WiFi

In general, I think ensuring that the app works as well offline as it does online is important for local use.

@eleius commented on GitHub (Sep 13, 2025):

I hope this issue can be fixed soon, as I only use LLMs offline. In the meantime I went back to llama.cpp

@rtpHarry commented on GitHub (Sep 15, 2025):

All glory to the hypnotoad! I just updated and am now on 0.11.10. Turned WiFi off and DeepSeek is now responding offline. Maybe I will get something done on this flight...

@kaarele commented on GitHub (Sep 15, 2025):

Perfect! The app now works fully offline with the model switching as well on 0.11.10. Thank you!

Reference: github-starred/ollama#69745