[GH-ISSUE #7880] Add a CORS permissions model into the Ollama UI ("Allow example.com to use Ollama? [Yes] [No]") #51551

Open
opened 2026-04-28 20:34:10 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @blixt on GitHub (Nov 29, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7880

Lots of AI apps out there solve access to LLMs in a few different ways:

  • Directly use a hosted model and foot the bill for the user
  • Ask the user to provide their own hosted model API key (😬)
  • Let the user host the app themselves, providing the API key this way
  • Connect with a local model provider like Ollama, but this has several issues today¹

I think Ollama is in wide enough circulation that it could create a permissions standard around local model access from the browser. An initial draft of this could be very simple:

The first time a request comes in with an Origin value that's never been seen before, hold the request and ask the user with a system notification: "Allow example.com to use Ollama?" If the user chooses to allow, the domain gets added to an allow list, which is used to send a valid CORS header to the incoming request. If the user chooses to deny, add the domain to a deny list which just means the CORS header will not be sent. If the user makes no choice, then time out the request and ask again next time.
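The flow above can be sketched in a few lines. This is a minimal illustration, not Ollama's implementation: the class name `OriginGate` and the `prompt` callback are hypothetical, and in the real UI the prompt would be a blocking system notification rather than a Python callable.

```python
class OriginGate:
    """Sketch of the proposed per-origin permission model.

    prompt(origin) stands in for the system notification and returns:
      True  -> user allowed the origin
      False -> user denied the origin
      None  -> user made no choice before the timeout
    """

    def __init__(self, prompt):
        self.allow = set()
        self.deny = set()
        self.prompt = prompt

    def cors_header(self, origin):
        # Previously allowed: echo the origin back in the CORS header.
        if origin in self.allow:
            return {"Access-Control-Allow-Origin": origin}
        # Previously denied: send no CORS header, so the browser blocks it.
        if origin in self.deny:
            return None
        # Never seen before: hold the request and ask the user.
        decision = self.prompt(origin)
        if decision is True:
            self.allow.add(origin)
            return {"Access-Control-Allow-Origin": origin}
        if decision is False:
            self.deny.add(origin)
            return None
        # Timed out: fail this request but leave the origin undecided,
        # so the user is asked again on the next request.
        return None
```

Note that a timeout deliberately records nothing, which is what makes "ask again next time" fall out of the allow/deny bookkeeping for free.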

This can still be combined with the existing OLLAMA_ORIGINS setting, so anything listed there is automatically allowed (except for *).
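Seeding the allow list from the environment could look like the sketch below. It assumes OLLAMA_ORIGINS is a comma-separated list (its documented format), and the function name `seed_allow_list` is hypothetical:

```python
import os

def seed_allow_list(env=None):
    """Pre-populate the permission model's allow list from OLLAMA_ORIGINS.

    '*' is deliberately skipped: under this proposal the wildcard would no
    longer grant blanket access, so unknown origins still get prompted.
    """
    env = os.environ if env is None else env
    raw = env.get("OLLAMA_ORIGINS", "")
    return {o.strip() for o in raw.split(",") if o.strip() and o.strip() != "*"}
```

Origins seeded this way would never trigger a prompt, preserving today's behavior for users who already configure the variable.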


  1. The first issue is that the user must run terminal commands to enable CORS. The second is that unless they use *, enabling access for one app will remove access for another (unless they know how to read and combine the list of domains). And the third is that the user will be lazy, pick *, and then any site in the world can use their local model.

GiteaMirror added the feature request label 2026-04-28 20:34:10 -05:00
Author
Owner

@j05hau commented on GitHub (Dec 12, 2024):

So much this. I would love to host OpenWebUI in the cloud but have the inference done on the local Ollama server of the user who is interacting with the cloud-hosted interface, among other interesting uses this would complement.


Reference: github-starred/ollama#51551