[GH-ISSUE #16412] issue: no connection to ollama 'Turbo' model due to CORS violation #33424

Closed
opened 2026-04-25 07:19:59 -05:00 by GiteaMirror · 11 comments
Owner

Originally created by @virusa78 on GitHub (Aug 9, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/16412

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Pip Install

Open WebUI Version

0.6.18

Ollama Version (if applicable)

0.11.13

Operating System

Ubuntu 24.04

Browser (if applicable)

Chrome, Brave

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

'Turbo' model should be added as 'External'

Actual Behavior

[Screenshot 1](https://github.com/user-attachments/assets/1a38be3f-9fac-45ac-8981-9fe3776d68c2)
[Screenshot 2](https://github.com/user-attachments/assets/b976d90d-4570-4442-b1a5-cb37879fb002)

Steps to Reproduce

1. Open Open WebUI
2. Go to Settings → Admin Settings → Connections
3. Click +
4. For the URL, put https://ollama.com
5. For the API key, create an API key on https://ollama.com/settings/keys and add it
6. Click Save
7. Click Refresh

Logs & Screenshots

Access to fetch at 'https://ollama.com/models' from origin 'http://localhost:8080' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: Redirect is not allowed for a preflight request.
ollama.com/models:1 Failed to load resource: net::ERR_FAILED
d1dc362c-5e10-4c52-9ed6-85ec3e4dff84:1 Access to fetch at 'https://ollama.com/models' from origin 'http://localhost:8080' has been blocked by CORS policy: Response to preflight request doesn't pass access control check: Redirect is not allowed for a preflight request.
ollama.com/models:1 Failed to load resource: net::ERR_FAILED

Additional Information

No response

GiteaMirror added the bug label 2026-04-25 07:19:59 -05:00

@rgaricano commented on GitHub (Aug 9, 2025):

ollama.com is for downloading models to run in your local installation of Ollama; it doesn't serve models itself.
You need to download and install Ollama: https://ollama.com/download


@tjbck commented on GitHub (Aug 9, 2025):

You should add them as admin connections NOT direct connections.


@virusa78 commented on GitHub (Aug 9, 2025):

You should add them as admin connections NOT direct connections.

Thanks! But then somebody should somehow limit the freedom to add any kind of direct connection :)


@rgaricano commented on GitHub (Aug 9, 2025):

(I think Tim has misunderstood your message, which is somewhat confusing, by the way)

Actually, you can also add it as a direct connection, but then you have to add it as OpenAI-API-compatible (Ollama serves OpenAI-compatible endpoints; just add /v1 as a suffix to your Ollama server address, e.g. http://myollamaserver:11434/v1). It must also be reachable from the browser (a public IP if used outside the local network).

But first, as I mentioned, you need to install Ollama in order to serve the models that you can download from ollama.com.
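The URL rule from the comment above can be sketched as a tiny helper (a minimal illustration; `myollamaserver` is the hypothetical host from the comment, not a real server):

```python
# Ollama exposes OpenAI-compatible endpoints under /v1 on the same
# host and port as its native API; the native base URL just gains a
# /v1 suffix.
def openai_compatible_url(ollama_url: str) -> str:
    """Turn a native Ollama base URL into its OpenAI-compatible form."""
    return ollama_url.rstrip("/") + "/v1"

print(openai_compatible_url("http://myollamaserver:11434"))
# http://myollamaserver:11434/v1
```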


@tjbck commented on GitHub (Aug 9, 2025):

FYI, CORS settings are set by ollama.com, hence we cannot make API calls directly from the browser. If you set the connection from the admin settings, the API calls are made from the backend, which should effectively resolve this issue. Hope that clarifies!


@rgaricano commented on GitHub (Aug 9, 2025):

@tjbck,
Yes Tim, but the main problem is that @virusa78 is trying to add http://ollama.com as the Ollama model provider, not his own Ollama server. And of course, ollama.com responds: no way!


@BreakNRetest commented on GitHub (Aug 16, 2025):

I believe he is trying to use the Ollama Turbo (cloud) models, not local models. I couldn't figure it out either.


@onelonemonkey commented on GitHub (Aug 23, 2025):

I'm keen on this too; I'm going to look into the admin connection.


@bluekeybo commented on GitHub (Aug 29, 2025):

How can the Ollama Turbo models be accessed via OpenWebUI?


@rgaricano commented on GitHub (Aug 29, 2025):

How can the Ollama Turbo models be accessed via OpenWebUI?

https://github.com/ollama/ollama/blob/main/docs/turbo.md#community-integrations
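The working setup matches the earlier advice in this thread: the connection must be made server-side so the browser CORS check never runs. As a hedged sketch, this is roughly the request shape a backend would send (the `/api/tags` model-listing path is Ollama's native endpoint and is an assumption here; `YOUR_API_KEY` is a placeholder for a key from https://ollama.com/settings/keys — the request is built but deliberately not sent):

```python
# Build (but do not send) a backend-style request to the Ollama cloud
# host with a Bearer key, the pattern a server-side connection uses.
from urllib.request import Request

req = Request(
    "https://ollama.com/api/tags",  # assumed model-listing path
    headers={"Authorization": "Bearer YOUR_API_KEY"},  # placeholder key
)
print(req.full_url, req.get_header("Authorization"))
```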


@bluekeybo commented on GitHub (Aug 29, 2025):

How can the Ollama Turbo models be accessed via OpenWebUI?

https://github.com/ollama/ollama/blob/main/docs/turbo.md#community-integrations

Worked perfectly! Thank you!


Reference: github-starred/open-webui#33424