[GH-ISSUE #6489] Error 403 occurs when I call ollama's api #4083

Closed
opened 2026-04-12 14:59:15 -05:00 by GiteaMirror · 23 comments
Owner

Originally created by @brownplayer on GitHub (Aug 24, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6489

Originally assigned to: @jmorganca on GitHub.

What is the issue?

Prerequisite: use the C++ interface of ipex-llm as ollama's acceleration backend, then start the ollama server (listening on 127.0.0.1:11434). When an Edge browser plug-in accesses ollama's API, error 403 occurs.
![image](https://github.com/user-attachments/assets/b9d4eee0-0293-41e7-9437-23ae4dc660af)

OS

Windows

GPU

Intel

CPU

Intel

Ollama version

No response

GiteaMirror added the bug label 2026-04-12 14:59:15 -05:00

@rick-github commented on GitHub (Aug 24, 2024):

The body of the 403 response might provide a reason for the failure; is it possible for you to capture it? Perhaps in the logs for ipex-llm?


@rick-github commented on GitHub (Aug 24, 2024):

So it seems ipex-llm is a shim that enables Intel GPU use in ollama, so it's not responsible for the 403s. What does the edge browser plug-in show when the 403 error is returned?

What does the following return:

curl http://localhost:11434/api/generate -d "
{
   \"model\": \"llama3.1\",
   \"prompt\": \"Why is the sky blue?\",
   \"stream\": false
}"
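
The same request can be built with the Python standard library. This is a hedged sketch: it assumes an Ollama server listening on `localhost:11434` with the `llama3.1` model pulled, as in the curl command above; the helper name `build_generate_payload` is my own.

```python
import json
import urllib.request

def build_generate_payload(model, prompt, stream=False):
    """Build the JSON body for POST /api/generate (model, prompt, stream)."""
    return json.dumps({"model": model, "prompt": prompt, "stream": stream}).encode()

# Construct the request without sending it; sending requires a live server.
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=build_generate_payload("llama3.1", "Why is the sky blue?"),
    headers={"Content-Type": "application/json"},
)
# To actually send it (requires a running ollama server):
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp).get("response"))
```

Unlike the curl-in-cmd version, no quote escaping is needed because `json.dumps` produces the body.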

@brownplayer commented on GitHub (Aug 24, 2024):

The command output is as follows: [403:fetchError] The service returns an error, and no permission is granted to access the service


@rick-github commented on GitHub (Aug 24, 2024):

That looks like a generic error message from the plug-in. What is the plug-in? What happens when you run the curl command in a cmd window?


@brownplayer commented on GitHub (Aug 24, 2024):

Hold on. I'm trying


@brownplayer commented on GitHub (Aug 24, 2024):

When I start the ollama server (ollama serve), I also open a terminal to call it. Everything is normal, and the server returns a message indicating that it is running properly. The name of the web plug-in is Immersive Translate.
![image](https://github.com/user-attachments/assets/64dfd3bf-3fde-40ff-bbb6-5bf23b6453c5)
![image](https://github.com/user-attachments/assets/feb8ffc3-f3d7-4d3d-928c-6b473eaf3554)
The first picture shows my local call to the ollama server (everything is fine); the second shows the name of the plug-in.
I don't know if the reason is that the ollama I'm using is not the community edition.


@brownplayer commented on GitHub (Aug 24, 2024):

Hi, I tried connecting AnythingLLM to the ollama server's API, and it worked perfectly. Does this mean that there is no problem with the ollama server?


@rick-github commented on GitHub (Aug 24, 2024):

I originally thought that this might be the CORS issue reported in #5838. However, the problem in this case is that the Immersive Translate extension is sending an `Origin: chrome-extension://amkbndfnliijdhojkpoglbnaaahippg` header, which is not in the list of allowed origins. You can fix this by [adding the environment variable](https://github.com/ollama/ollama/blob/main/docs/faq.md#setting-environment-variables-on-windows) `OLLAMA_ORIGINS=chrome-extension://*` to the server config.
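
To make the mechanism concrete, here is a simplified illustration (an assumption on my part, not Ollama's actual implementation) of how a glob-style allow-list like `OLLAMA_ORIGINS` can match or reject an extension's `Origin` header; the extension ID is the one quoted above.

```python
# Simplified sketch of origin allow-list matching; NOT Ollama's real code.
from fnmatch import fnmatch

def origin_allowed(allowed_patterns, origin):
    """Return True if `origin` matches any glob-style pattern in the list."""
    return any(fnmatch(origin, pattern) for pattern in allowed_patterns)

ext_origin = "chrome-extension://amkbndfnliijdhojkpoglbnaaahippg"

# A localhost-only allow-list rejects the extension's origin...
print(origin_allowed(["http://localhost:*", "http://127.0.0.1:*"], ext_origin))  # False
# ...while the pattern chrome-extension://* accepts it.
print(origin_allowed(["chrome-extension://*"], ext_origin))  # True
```

This is why adding `chrome-extension://*` to `OLLAMA_ORIGINS` stops the 403: the extension's origin now matches an allowed pattern.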


@brownplayer commented on GitHub (Aug 25, 2024):

Great. Problem solved. Thank you for your contribution


@brownplayer commented on GitHub (Aug 25, 2024):

I will close this issue and thank you again for your help


@Wayne-Kim commented on GitHub (Mar 24, 2025):

For Mac users, good luck!

launchctl setenv OLLAMA_ORIGINS "chrome-extension://*"

You should quit ollama first (including any instance running in a terminal), and then restart it.


@deepanjan0604 commented on GitHub (May 22, 2025):

[GIN] 2025/05/22 - 14:31:50 | 403 | 38.883µs | 10.244.4.1 | POST "/api/generate"

Getting a 403 when I try to hit the ollama pod, hosted on an Azure Kubernetes Service (AKS) cluster, from Postman using https://test.llm.com/llm/api/generate through the ingress.

Kindly suggest a fix

Thanks in advance!


@rick-github commented on GitHub (May 22, 2025):

@deepanjan0604 https://github.com/ollama/ollama/issues/6489#issuecomment-2308355334


@deepanjan0604 commented on GitHub (May 22, 2025):

Is there any way we can run ollama serve in verbose mode? @rick-github


@rick-github commented on GitHub (May 22, 2025):

OLLAMA_DEBUG=1 in the server environment.


@deepanjan0604 commented on GitHub (May 22, 2025):

@rick-github still getting: [GIN] 2025/05/22 - 14:51:26 | 403 | 26.68µs | 10.173.177.85 | POST "/api/generate"

Debug mode is not coming up


@rick-github commented on GitHub (May 22, 2025):

Are you trying to get the 403 response? Look in the client logs, not ollama logs.


@deepanjan0604 commented on GitHub (May 22, 2025):

No. When I try to hit the /api/generate API from Postman, which goes through the Azure AKS ingress, it gives me a 403 Forbidden error, and the ollama pod logs also show a 403.


@thinkrivan commented on GitHub (Jun 13, 2025):

I'm having the same problem; it used to work just fine though.


@Yann0s commented on GitHub (Oct 28, 2025):

> Is there any way we can run ollama serve in verbose mode? @rick-github

Yes: ollama run model-name --verbose


@rick-github commented on GitHub (Oct 28, 2025):

That's client verbose mode, not server verbose mode.


@ssy341 commented on GitHub (Dec 1, 2025):

On Windows, I created a new environment variable OLLAMA_ORIGINS with the value "*", and it works for me.


@luckylinux commented on GitHub (Feb 8, 2026):

A minor note for those who set the port explicitly in the URL for CORS_ALLOW_ORIGIN.

The URL https://host:443 is treated differently than https://host, even though, by default, browsers (and the vast majority of HTTPS servers) treat them as the same thing.

Therefore, make sure to list BOTH of them in the parameter:

CORS_ALLOW_ORIGIN=https://ollama.MYDOMAIN.TLD:443;https://ollama.MYDOMAIN.TLD
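
A small sketch of the point above, assuming (as the comment reports) that origins are compared as exact strings, so `https://host:443` and `https://host` never match each other even though 443 is the default HTTPS port. The helper name and the `ollama.example.com` domain are placeholders of my own.

```python
# Assumption: exact-string origin comparison, so :443 must be listed separately.
from urllib.parse import urlsplit

def https_origin_variants(origin):
    """Return the HTTPS origin both with and without an explicit :443 port."""
    host = urlsplit(origin).hostname
    return {f"https://{host}", f"https://{host}:443"}

# As plain strings, the two spellings of the same origin are unequal.
print("https://ollama.example.com:443" == "https://ollama.example.com")  # False
# Listing both variants covers whichever form the browser sends.
print(https_origin_variants("https://ollama.example.com"))
```

Generating both spellings from one hostname avoids forgetting either form in the allow-list.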
Reference: github-starred/ollama#4083