Mirror of https://github.com/open-webui/open-webui.git, synced 2026-05-06 19:08:59 -05:00
[GH-ISSUE #16412] issue: no connection to ollama 'Turbo' model due to CORS violation #56561
Originally created by @virusa78 on GitHub (Aug 9, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/16412
Check Existing Issues
Installation Method
Pip Install
Open WebUI Version
0.6.18
Ollama Version (if applicable)
0.11.13
Operating System
Ubuntu 24.04
Browser (if applicable)
Chrome, Brave
Confirmation
Expected Behavior
'Turbo' model should be added as 'External'
Actual Behavior
Steps to Reproduce
Open WebUI
Go to settings → Admin settings → Connections
Click +
For the URL put https://ollama.com
For the API key, create an API key on https://ollama.com/settings/keys and add it.
Click Save
Click Refresh
Logs & Screenshots
Access to fetch at 'https://ollama.com/models' from origin 'http://localhost:8080' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: Redirect is not allowed for a preflight request.
ollama.com/models:1 Failed to load resource: net::ERR_FAILED
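The log above shows the browser's CORS preflight (an OPTIONS request) being rejected because ollama.com answers it with a redirect, which the Fetch specification forbids for preflights. A minimal sketch of that rule (an illustrative classifier based on the spec's status semantics, not Open WebUI code):

```python
def preflight_allowed(status: int) -> bool:
    """Return True if an HTTP status could pass a CORS preflight check."""
    # Redirects are never followed for preflight requests, so any 3xx
    # produces exactly the "Redirect is not allowed" error in the log.
    if 300 <= status < 400:
        return False
    # Only an "ok status" (2xx) can pass the access control check,
    # assuming the response also carries the right CORS headers.
    return 200 <= status < 300

print(preflight_allowed(200))  # a normal OK response can pass
print(preflight_allowed(301))  # the redirect seen above cannot
```

This is why the error appears regardless of whether the API key is valid: the request fails before it is ever sent.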
Additional Information
No response
@rgaricano commented on GitHub (Aug 9, 2025):
ollama.com is for downloading models to run in your local installation of Ollama; it doesn't serve models itself.
You need to download and install Ollama: https://ollama.com/download
@tjbck commented on GitHub (Aug 9, 2025):
You should add them as admin connections, NOT direct connections.
@virusa78 commented on GitHub (Aug 9, 2025):
Thanks! But then somehow, somebody should limit the freedom to add any kind of direct connection :)
@rgaricano commented on GitHub (Aug 9, 2025):
(I think Tim has misunderstood your message, which is somewhat confusing, by the way.)
No: you can also add it as a direct connection, but then you have to add it as an OpenAI-API-compatible endpoint (Ollama serves OpenAI-compatible endpoints; just add /v1 as a suffix to your Ollama server address, e.g. http://myollamaserver:11434/v1). It must also be publicly accessible (a public IP if used outside the local network).
But first, as I mentioned, you need to install Ollama in order to serve the models that you can download from ollama.com.
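The /v1 suffix rule described above can be sketched as a tiny helper (hypothetical function, not part of Open WebUI; the server address is the example from the comment):

```python
def openai_base_url(ollama_url: str) -> str:
    """Append the /v1 suffix under which Ollama exposes its
    OpenAI-compatible API on the same host and port."""
    return ollama_url.rstrip("/") + "/v1"

# e.g. the example address from the comment above:
print(openai_base_url("http://myollamaserver:11434"))  # → http://myollamaserver:11434/v1
```

The resulting URL is what you would enter as an OpenAI-type connection in Open WebUI instead of the bare Ollama address.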
@tjbck commented on GitHub (Aug 9, 2025):
FYI, CORS settings are set by Ollama.com, hence we cannot make API calls directly from the browser. If you set the connection from the admin settings, the API calls are made from the backend, which should effectively resolve this issue. Hope that clarifies!
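The backend-vs-browser distinction can be illustrated with a plain server-side request (a sketch: /api/tags is Ollama's model-list endpoint, and whether ollama.com's Turbo service answers it with this key header is an assumption; the key itself is a placeholder):

```python
import urllib.request

# A backend process builds and sends ordinary HTTP requests; there is no
# browser involved, so no preflight is issued and no CORS policy applies.
req = urllib.request.Request(
    "https://ollama.com/api/tags",  # URL from the issue + assumed list endpoint
    headers={"Authorization": "Bearer <your-api-key>"},  # placeholder key
)

# The request is only prepared here, not sent; a backend would send it with
# urllib.request.urlopen(req) and never hit a CORS check.
print(req.get_method(), req.full_url)
```

This is exactly what the admin-connection path does: the Open WebUI backend performs the fetch, then hands the result to the browser from its own origin.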
@rgaricano commented on GitHub (Aug 9, 2025):
@tjbck,
Yes Tim, but the main problem is that @virusa78 is trying to add
http://ollama.com as the Ollama model provider, not his own Ollama server. And of course, ollama.com responds: no way!
@BreakNRetest commented on GitHub (Aug 16, 2025):
I believe he is trying to use the Ollama Turbo (cloud) models, not the local models. I couldn't figure it out either.
@onelonemonkey commented on GitHub (Aug 23, 2025):
I'm keen on this too; I'm going to look into the admin connection.
@bluekeybo commented on GitHub (Aug 29, 2025):
How can the Ollama Turbo models be accessed via OpenWebUI?
@rgaricano commented on GitHub (Aug 29, 2025):
https://github.com/ollama/ollama/blob/main/docs/turbo.md#community-integrations
@bluekeybo commented on GitHub (Aug 29, 2025):
Worked perfectly! Thank you!