[GH-ISSUE #10507] Fully support Tauri in OLLAMA_ORIGINS (Specifically missing Windows support) #68973

Open
opened 2026-05-04 16:32:05 -05:00 by GiteaMirror · 1 comment

Originally created by @camfulton on GitHub (Apr 30, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10507

Hi,

I see that ollama has support for requests coming from the `tauri://` protocol in the defaults for `OLLAMA_ORIGINS`, which is great! However, because of backwards-compatibility issues with Windows WebView, applications built for Windows using Tauri send requests from `http(s)://tauri.localhost` rather than using the `tauri://` protocol.

This means anyone developing apps with Tauri will either need to change the `OLLAMA_ORIGINS` env var and prompt users to restart ollama, or ask users to make those changes themselves, but only for their Windows builds.

Given that you were willing to add support for the protocol, would the team be open to a PR which fully fleshes out support for Tauri? I'd be happy to open one up if the team would accept it.

Thanks!

GiteaMirror added the feature request label 2026-05-04 16:32:05 -05:00
Author
Owner

@miraculix95 commented on GitHub (May 1, 2026):

Hitting this exact issue today — Voquill (a Tauri 2.x voice dictation app) on Windows fails to connect to Ollama out of the box because `http://tauri.localhost` (and `https://tauri.localhost`) is missing from the default `OLLAMA_ORIGINS` whitelist.

The user-side fix is documented in the [Voquill issue tracker](https://github.com/voquill/voquill/issues/21) and requires:

```powershell
[Environment]::SetEnvironmentVariable("OLLAMA_ORIGINS", "http://tauri.localhost", "User")
```

… plus a Windows logoff/login (because the running Ollama tray process inherits its environment from the original login session). For users not deeply familiar with Windows env-var propagation quirks, this turns into a multi-hour debugging marathon for what should be a zero-config experience.

Tauri 2.x has been stable for over a year now and adoption is growing fast (Voquill is just one of several Tauri-based AI dictation/agent apps showing up). Folding `http://tauri.localhost` and `https://tauri.localhost` into the default whitelist would solve this for everyone with no real security cost — the `.localhost` TLD is reserved by RFC 6761 to resolve only to loopback, so no public site can present that origin.

@camfulton — happy to test a PR if your offer in the original post is still on the table.


Reference: github-starred/ollama#68973