[GH-ISSUE #5834] Windows Client: Provide a way to allow connections to Ollama from web browser origins other than localhost and 0.0.0.0 #50147

Open
opened 2026-04-28 14:25:14 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @Dinkh on GitHub (Jul 21, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/5834

Running my WebApp on my machine works.

```
import ollama from "ollama/browser"

ollama.list().then(...)
// => http://127.0.0.1:11434/api/tags
```

Running it from my web host does not work

```
ollama.list().then(...)
// options => 204
// get     => GET http://127.0.0.1:11434/api/tags net::ERR_FAILED
// Access to fetch at 'http://127.0.0.1:11434/api/tags' from origin 'https://myWebSpace.com' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Private-Network' header was present in the preflight response for this private network request targeting the `local` address space.
```
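The blocked preflight in the error above is Chrome's Private Network Access check: when a page on a public origin targets a `local` address such as 127.0.0.1, the browser adds `Access-Control-Request-Private-Network: true` to the OPTIONS request and only sends the real request if the response carries `Access-Control-Allow-Private-Network: true`. A minimal sketch of that header exchange (an illustrative helper, not Ollama's code):

```javascript
// Sketch: build CORS preflight response headers for a request that may be
// subject to Chrome's Private Network Access check. Illustrative only.
function preflightResponseHeaders(requestHeaders, allowedOrigin) {
  // Ordinary CORS preflight response headers.
  const headers = {
    "Access-Control-Allow-Origin": allowedOrigin,
    "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
  };
  // A public page targeting 127.0.0.1 sends this extra header on the
  // OPTIONS preflight ...
  if (requestHeaders["access-control-request-private-network"] === "true") {
    // ... and the server must opt in explicitly, otherwise the browser
    // blocks the actual GET/POST with net::ERR_FAILED.
    headers["Access-Control-Allow-Private-Network"] = "true";
  }
  return headers;
}
```

This is why the OPTIONS call can return 204 while the follow-up GET still fails: the origin is allow-listed, but the private-network opt-in header is missing from the preflight response.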

Ollama installation:
Windows client 0.2.7

I have the server running as follows (without that the OPTIONS call doesn't work):

```
OLLAMA_ORIGINS=https://myWebSpace.com *
```
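On Windows, the tray app does not pick up variables exported in a terminal session; they need to be set user-wide and the app restarted. A sketch (note: current Ollama documentation appears to treat `OLLAMA_ORIGINS` as a comma-separated list, so the space-separated value above may not parse as intended):

```shell
# Quit Ollama from the taskbar tray first, then set the variable user-wide:
setx OLLAMA_ORIGINS "https://myWebSpace.com"
# Restart the Ollama app (or `ollama serve`) so the new value is read.
```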

It would be great if this would work with Windows too.
(seems to work for other platforms https://github.com/ollama/ollama/issues/300)

After running `ollama serve` I can see the OPTIONS calls, but not the other calls.

```
[GIN] 2024/07/21 - 23:06:59 | 204 |            0s |       127.0.0.1 | OPTIONS  "/api/tags"
```

And the settings from the serve call

```
OLLAMA_DEBUG:true
OLLAMA_FLASH_ATTENTION:false
OLLAMA_HOST:http://127.0.0.1:11434
OLLAMA_INTEL_GPU:false
OLLAMA_KEEP_ALIVE:5m0s
OLLAMA_LLM_LIBRARY:
OLLAMA_MAX_LOADED_MODELS:0
OLLAMA_MAX_QUEUE:512
OLLAMA_MAX_VRAM:0
OLLAMA_MODELS:d:\\ollama\\models
OLLAMA_NOHISTORY:false
OLLAMA_NOPRUNE:false
OLLAMA_NUM_PARALLEL:0
OLLAMA_ORIGINS:[http://myWebSpace.com * http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*]
OLLAMA_RUNNERS_DIR:C:\\Users\\LLM\\AppData\\Local\\Programs\\Ollama\\ollama_runners
OLLAMA_SCHED_SPREAD:false
OLLAMA_TMPDIR:d:\\ollama\\tmpDir
```
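The OLLAMA_ORIGINS dump above mixes exact origins with wildcard patterns such as `http://localhost:*`. As a rough illustration of how such patterns can match a request's Origin header (a sketch, not Ollama's actual matcher):

```javascript
// Illustrative wildcard-origin matcher; not Ollama's implementation.
function originMatches(pattern, origin) {
  // Escape regex metacharacters in the literal parts of the pattern, then
  // join them with ".*" so each "*" matches any run of characters.
  const literalParts = pattern
    .split("*")
    .map((part) => part.replace(/[.+?^${}()|[\]\\]/g, "\\$&"));
  const re = new RegExp("^" + literalParts.join(".*") + "$");
  return re.test(origin);
}
```

Under such a scheme, `http://localhost:*` would match `http://localhost:5173` but an exact entry like `https://myWebSpace.com` would match only itself.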
GiteaMirror added the feature request and api labels 2026-04-28 14:25:17 -05:00
@lsalazarm99 commented on GitHub (Jul 21, 2024):

I had to set the environment variable `OLLAMA_HOST` to `0.0.0.0:11434` and restart the Ollama app.

It allowed my web server instance (Traefik), running on another machine, to access the Ollama REST API on my Windows machine through its 192.168.x.x IP address.

@Dinkh commented on GitHub (Jul 22, 2024):

I changed `OLLAMA_HOST` to `0.0.0.0:11434` and restarted. Still the same issue.

@louis030195 commented on GitHub (Aug 3, 2024):

Same issue using https://tauri.app, which uses the http://tauri.localhost origin.
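For Tauri apps on Windows, the webview's origin (http://tauri.localhost) is a distinct hostname, so the default `localhost` patterns do not cover it; it would need to be allow-listed explicitly. A sketch (assuming `OLLAMA_ORIGINS` takes a comma-separated list, as current documentation suggests):

```shell
# Quit Ollama from the tray, allow the Tauri webview origin, then restart:
setx OLLAMA_ORIGINS "http://tauri.localhost,https://tauri.localhost"
```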

@noisecode3 commented on GitHub (Sep 22, 2024):

Defaulting to 0.0.0.0 is terrible for security, but okay :) I see that my VPN has a DROP rule for 0.0.0.0... I'm on Linux with a VPN and I have this problem too, lol. We can change it in the Docker image here (https://github.com/ollama/ollama/blob/ad935f45ac19a8ba090db32580f3a6469e9858bb/Dockerfile#L244), right? I think I'll give up on this application for now...

```
docker: Error response from daemon: driver failed programming external connectivity on endpoint ollama (5cc79b59027a233a6755c6c72994b366b6707a98680181b7ae31005cf3bc6e45): failed to bind port 0.0.0.0:11434/tcp: Error starting userland proxy:.
```
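A bind failure like the one above usually means host port 11434 is already taken (for example by a native Ollama install) or the bind address is filtered. One option is to publish the container port on a specific interface instead of 0.0.0.0; a sketch, assuming the official `ollama/ollama` image:

```shell
# Stop any native Ollama that already holds port 11434, then publish the
# container port only on the loopback interface instead of all interfaces:
docker run -d -p 127.0.0.1:11434:11434 -v ollama:/root/.ollama --name ollama ollama/ollama
```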
Reference: github-starred/ollama#50147