[GH-ISSUE #11690] Add option to Ollama settings to disable auto-launch of the chat UI every time the server is restarted #33495

Open
opened 2026-04-22 16:14:29 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @kullervo-wanona on GitHub (Aug 5, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/11690

Hi Ollama team,

Thank you for the chat UI. It is a useful feature.

I was wondering whether it would be possible to add a setting that disables the auto-launch of the chat UI every time the server is restarted. My application requires the server to be restarted regularly, and the chat UI is a big distraction: it pops up every time this happens and has to be closed manually.

It would be a good option to add for people who are not using the chat UI but using the other pathways.

Thank you very much.

I am developing on a Windows machine if that matters.

GiteaMirror added the feature request label 2026-04-22 16:14:29 -05:00
Author
Owner

@bcgrillo commented on GitHub (Aug 6, 2025):

I would even go further. A UI-less installation mode would be great for all those applications we have created based on Ollama's http API ❤️


@bcgrillo commented on GitHub (Aug 6, 2025):

I think #11604 is the same feature request


@kullervo-wanona commented on GitHub (Aug 6, 2025):

Feel free to close this if it is a duplicate, Ollama team. Please see my comment on that other feature request as well.

Thank you again.


@bcgrillo commented on GitHub (Sep 13, 2025):

@kullervo-wanona and others :)

Returning to this issue after some time away, I have seen that Ollama now offers ways to install and run it without an interface.
For installation, see the instructions at https://github.com/ollama/ollama/blob/main/docs/windows.md, which point directly to a ZIP archive containing the executable (`ollama-windows-amd64.zip`).

To launch the application without a GUI, simply run `ollama serve`; at least for me, it is working without any problems.
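The headless workflow described above can be sketched as a short command sequence. This is a minimal sketch, assuming the standalone ZIP has been extracted and `ollama.exe` is on the PATH; `11434` is Ollama's documented default API port:

```shell
# Start the Ollama server headless: no tray icon or chat UI is launched,
# only the HTTP API server.
ollama serve

# In another terminal, verify the API is reachable on the default port.
# The root endpoint responds with a plain status message.
curl http://localhost:11434/
```

Applications built against the HTTP API can then talk to `localhost:11434` as usual, with no chat window appearing on server restarts.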

Reference: github-starred/ollama#33495