[GH-ISSUE #3027] /v1/completions OpenAI compatible api #48372

Closed
opened 2026-04-28 07:57:36 -05:00 by GiteaMirror · 3 comments
Owner

Originally created by @Kreijstal on GitHub (Mar 9, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3027

This is more flexible than the chat-based endpoint, in case you want to do completions and not chatting, or you want more fine-grained control. Is that okay?

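For context, the legacy OpenAI `/v1/completions` endpoint takes a single `prompt` string rather than a `messages` array. A minimal sketch of the request body this issue asks Ollama to accept (the model name and parameter values below are illustrative, not Ollama defaults):

```python
import json

# Sketch of an OpenAI-style /v1/completions request body.
# "llama2" is a placeholder; any locally pulled model name would go here.
payload = {
    "model": "llama2",
    "prompt": "Once upon a time",  # a plain string, not a messages array
    "max_tokens": 100,
    "temperature": 0.9,
}

# This JSON would be POSTed to http://localhost:11434/v1/completions
# once the endpoint is supported.
body = json.dumps(payload)
```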
GiteaMirror added the compatibility and feature request labels 2026-04-28 07:57:36 -05:00

@lks-ai commented on GitHub (Mar 21, 2024):

BUMP ... I really need this!


@niyazed commented on GitHub (Jun 11, 2024):

I am getting the error `openai.NotFoundError: 404 page not found` when using:

```
return (
    self.client.completions.create(
        user=self.name,
        prompt=messages,
        model=self.model_name,
        max_tokens=self.kwargs.get("max_tokens", 100),
        temperature=self.kwargs.get("temperature", 0.9),
        extra_body=self.kwargs.get("extra_body", {}),
    )
    .choices[0]
    .message.content
)
```

When I call this function, it is triggering this endpoint: `/v1/completions`


@sammcj commented on GitHub (Jun 22, 2024):

@niyazed I believe that error is because Ollama doesn't support the OpenAI-compatible API parameter `prompt`.

I believe at the moment it's limited to accepting an array of messages - https://github.com/ollama/ollama/blob/main/docs/openai.md#curl

I think this PR would resolve that: https://github.com/ollama/ollama/pull/5209

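Until `prompt` support lands, one workaround is to wrap the prompt in a single user message and call the chat endpoint instead. A hedged sketch (the helper below is hypothetical, not part of Ollama or the OpenAI client library):

```python
def prompt_to_messages(prompt: str) -> list[dict]:
    """Adapt a plain completion prompt into a chat-style messages array.

    Hypothetical helper for illustration only.
    """
    return [{"role": "user", "content": prompt}]

# With the official openai client pointed at Ollama's /v1 base URL, the
# chat call would then look like:
#   client.chat.completions.create(model=..., messages=prompt_to_messages(p))
# and the text comes back in .choices[0].message.content (not .text).
messages = prompt_to_messages("Once upon a time")
```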

Reference: github-starred/ollama#48372