[GH-ISSUE #7132] Getting Error with OpenAI compatibility #4530

Closed
opened 2026-04-12 15:28:08 -05:00 by GiteaMirror · 6 comments

Originally created by @php10xdev on GitHub (Oct 8, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7132

### What is the issue?

```js
import { NextApiRequest } from 'next';
import { OpenAIStream, StreamingTextResponse } from 'ai';
import OpenAI from 'openai';

const openai = new OpenAI({
    baseURL: 'http://localhost:11434/v1',
    apiKey: 'ollama', // required but unused
  });

export async function POST(req: NextApiRequest) {
    const body = await req.json();
    console.log("messages", body);

    try {
        const response = await openai.chat.completions.create({
            model: 'llama3',
            messages: body.messages,
          });

        const stream = OpenAIStream(response);

        return new StreamingTextResponse(stream);
    } catch (error) {
        console.error("error", error);
    }
}
```

Log before Error:

```
messages { messages: [ { role: 'user', content: "What is today's date?" } ] }
```

Getting this error:

```
error APIConnectionError: Connection error.
    at OpenAI.makeRequest (webpack-internal:///(rsc)/./node_modules/openai/core.mjs:321:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async POST (webpack-internal:///(rsc)/./src/app/api/chat/route.ts:20:26)
```
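An `APIConnectionError` from the OpenAI client means the HTTP request never reached a server at all, so the first thing to rule out is reachability of the configured `baseURL`. A minimal check from the machine running the Next.js app, assuming the default port and that the `llama3` model from the snippet above is pulled:

```sh
# Is anything listening at the configured base URL?
curl http://localhost:11434/v1/models

# Does a non-streaming chat completion work outside the app?
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "messages": [{"role": "user", "content": "hi"}]}'
```

If both commands succeed, the problem is in the app or its environment rather than in the Ollama server.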

### OS

macOS

### GPU

Apple

### CPU

Apple

### Ollama version

0.3.0

GiteaMirror added the bug, api labels 2026-04-12 15:28:08 -05:00

@rick-github commented on GitHub (Oct 8, 2024):

Thanks for the script, but it's incomplete. Where is `Request` defined? What's the argument to `POST`?


@php10xdev commented on GitHub (Oct 21, 2024):

@rick-github Updated the code; it's still not working. `Request` and `POST` are as in the official [Next.js 14](https://nextjs.org/docs/pages/building-your-application/routing/api-routes) docs.


@hecksadecimal commented on GitHub (Dec 10, 2024):

After updating to Ollama 0.5.1 (I'm not sure which version introduced this problem), I also seem to be getting errors when connecting to the OpenAI endpoint.

My code was working before this update. All of the libraries I depend on are up to date, and were before I updated Ollama. Ollama is the only variable here.

Here's a [Gist](https://gist.github.com/hecksadecimal/fef51f0358ab08c4fbb33774d84a85fa) of my error, and the [accompanying code](https://gist.github.com/hecksadecimal/6bd42a9dbfe13225111146c21c693a9f).

Relevant environment variables:

```sh
OLLAMA_BASE_URL=http://192.168.2.15:11434/v1/
```

This is another computer on my network running Ollama. I can confirm that Ollama is listening on port 11434, and that I can reach the machine over the network by other means, such as ssh.
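Reachability over ssh only proves the host is up; it doesn't prove the Ollama port is bound to a network-facing interface. A quick sketch of a check against the endpoint itself, using the address from the environment variable above:

```sh
# Check the OpenAI-compatible endpoint directly from the client machine.
# If Ollama is bound to 127.0.0.1 on the server, this fails even though
# ssh to the same host works.
curl -v http://192.168.2.15:11434/v1/models
```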


@rick-github commented on GitHub (Dec 10, 2024):

Ollama server logs?
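On a systemd-based Linux install, the server logs being asked for here can usually be pulled with `journalctl`; a sketch, assuming the default service name `ollama`:

```sh
# Show the most recent Ollama server log entries.
journalctl -e -u ollama
```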


@hecksadecimal commented on GitHub (Dec 10, 2024):

I'm progressively downgrading to see if I can find a version that works again. I'll get that for you in a moment.


@hecksadecimal commented on GitHub (Dec 10, 2024):

> Ollama server logs?

[Here](https://gist.github.com/hecksadecimal/9ed8587a485c748f9b4adf72f5049890) are the logs from `ollama serve`. Of note is line 5: `Listening on 127.0.0.1:11434`. I have not changed anything about my configuration, so I'm assuming that in some previous version it defaulted to listening on `0.0.0.0`.

After setting `OLLAMA_HOST=0.0.0.0` in my service file, it's back to working as expected.
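For reference, a sketch of that kind of fix on a systemd-based install, assuming the service is named `ollama` (the exact unit name and file layout may differ):

```sh
# Add a drop-in override so the server binds to all interfaces,
# then restart to pick up the new environment.
sudo systemctl edit ollama
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl restart ollama
```

Note that binding to `0.0.0.0` exposes the API to the whole network, so the port is worth firewalling if the network isn't trusted.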
