[GH-ISSUE #1366] Support openai api #47229

Closed
opened 2026-04-28 03:26:48 -05:00 by GiteaMirror · 3 comments

Originally created by @thawkins on GitHub (Dec 3, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1366

Would it be possible to add a "serveOpenAI" command that changes the REST API schema to match that of the OpenAI API? This would open up a wide range of tools that could then be connected to ollama via this API.

Authentication could be ignored for now, even if credentials are provided, until ollama needs that capability.

The API could be served under a different URL structure.
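To illustrate the request shape such an endpoint would accept, here is a minimal sketch: the `/v1/chat/completions` path and the payload fields follow OpenAI's published chat-completions schema, while the host, port, and model name are assumptions for illustration only.

```python
import json

# Hypothetical base URL if ollama served an OpenAI-compatible schema
# (the path convention is OpenAI's; the host/port are assumptions).
BASE_URL = "http://localhost:11434/v1"

# Request body in the OpenAI chat-completions shape.
payload = {
    "model": "llama2",  # model name is an assumption
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    "stream": False,
}

body = json.dumps(payload)
print(BASE_URL + "/chat/completions")
print(body)
```

Any OpenAI-client tooling that lets you override the base URL could then be pointed at such an endpoint without code changes.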


@phalexo commented on GitHub (Dec 4, 2023):

You can use litellm to wrap ollama and litellm will give you an OpenAI API.

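As a rough sketch of that workaround, under the assumption that LiteLLM's proxy CLI works as documented around that time (the `--model` flag, the `ollama/llama2` model tag, and the default port may differ by version):

```shell
# Install the proxy and point it at a local ollama model
# (model tag "ollama/llama2" is an assumption).
pip install litellm

# Start an OpenAI-compatible proxy in front of ollama.
litellm --model ollama/llama2

# In another shell: call the proxy with the standard OpenAI request shape.
curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "ollama/llama2", "messages": [{"role": "user", "content": "Hello!"}]}'
```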

@Thelouras58 commented on GitHub (Dec 4, 2023):

@phalexo is right. You can check https://www.youtube.com/watch?v=y7wMTwJN7rA


@technovangelist commented on GitHub (Dec 4, 2023):

Thanks for the issue, but it looks like it’s a duplicate of #305, plus this one mentions another workaround using litellm. I will go ahead and close it now. If you think there is anything we left out, reopen and we can address. Thanks for being part of this great community.

