[GH-ISSUE #3449] CORS Error in Blower #48638

Closed
opened 2026-04-28 08:58:22 -05:00 by GiteaMirror · 2 comments

Originally created by @ghost on GitHub (Apr 2, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3449

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

Thank you for your great work. The issue I am encountering can be seen in the screenshot below.
![image](https://github.com/ollama/ollama/assets/71435435/cada7bd2-8326-41e3-a2c8-0ca8b72dccb3)
I have modified the configuration within ollama.service, but the problem still exists.
![image](https://github.com/ollama/ollama/assets/71435435/0918a5c4-309d-44e1-b6ee-066983a5054c)
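For reference, the systemd route is a drop-in override on the ollama.service unit. The values below are an illustrative sketch (listening on all interfaces and allowing every web origin), not the reporter's actual configuration:

```ini
# Illustrative drop-in, e.g. /etc/systemd/system/ollama.service.d/override.conf
[Service]
# Make the API reachable from other machines, not just localhost
Environment="OLLAMA_HOST=0.0.0.0"
# Allow cross-origin browser requests from any web origin
Environment="OLLAMA_ORIGINS=*"
```

After editing, `systemctl daemon-reload` followed by `systemctl restart ollama` is needed for the change to take effect.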

I built the blower on my computer, and I am trying to use Ollama running on another server.

```javascript
addMessageToChat('You', message);

// Replace this URL with the actual endpoint where the backend API is hosted
const apiURL = 'http://100.67.xxx.xxx:11434/api/generate';

// Prepare the data to be sent in the POST request
const requestData = {
    model: "llama2",
    prompt: message,
    options: {
        num_ctx: 4096
    }
};
```
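For context, here is a minimal sketch of how this payload would typically be sent; the fetch call is an assumption about the surrounding code, not part of the original report. A cross-origin POST like this is exactly what triggers the browser's CORS check:

```javascript
// Minimal sketch (assumed, not from the original report): POSTing
// requestData cross-origin is what triggers the browser's CORS preflight.
fetch(apiURL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    // stream: false makes /api/generate return a single JSON object
    // instead of its default newline-delimited stream
    body: JSON.stringify({ ...requestData, stream: false })
})
    .then(response => response.json())
    .then(data => addMessageToChat('Ollama', data.response))
    .catch(error => console.error('Request failed:', error));
```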

Could you please tell me how I can solve this issue?

What did you expect to see?

I expect to be able to call the API on the server from another computer.

Steps to reproduce

No response

Are there any recent changes that introduced the issue?

No response

OS

No response

Architecture

No response

Platform

No response

Ollama version

latest

GPU

No response

GPU info

No response

CPU

No response

Other software

JavaScript, HTML, nginx

GiteaMirror added the question label 2026-04-28 08:58:22 -05:00

@YiuChoi commented on GitHub (Apr 13, 2024):

I am seeing the same error.


@dhiltgen commented on GitHub (Jun 1, 2024):

CORS settings are described here - https://github.com/ollama/ollama/blob/main/docs/faq.md#how-can-i-allow-additional-web-origins-to-access-ollama
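Per that FAQ, the variable to set is `OLLAMA_ORIGINS`; a minimal example when running the server directly (the origin shown is a placeholder):

```shell
# Allow a specific web origin (placeholder) -- use "*" to allow all origins
OLLAMA_ORIGINS="http://example.com" ollama serve
```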


Reference: github-starred/ollama#48638