feat: support for Azure OpenAI Responses API (i.e. to use OpenAI o3-pro model) #5732

Closed
opened 2025-11-11 16:31:59 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @kaeferpsd on GitHub (Jul 10, 2025).

Check Existing Issues

  • I have searched the existing issues and discussions.

Problem Description

Open WebUI currently supports Azure OpenAI, including the GPT-4o model (gpt-4o), but OpenAI's "o3-pro" model is not available via Azure OpenAI. This issue is to:

  • Document this limitation clearly.
  • Plan support for o3-pro via Azure when/if Microsoft exposes it.

What works:

✅ Azure OpenAI integration (using /chat/completions)
✅ Models like:

  • gpt-4
  • gpt-4-32k
  • gpt-4o (regular Azure version, not o3-pro)
  • gpt-35-turbo

What doesn’t work:

❌ The "o3-pro" model variant listed under model: "gpt-4o" on api.openai.com
ℹ️ Azure currently maps gpt-4o to the May 2024 release, not to o3-pro with its function-calling and JSON-mode enhancements.
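To make the difference concrete: Azure OpenAI addresses a *deployment* in the URL path, whereas api.openai.com addresses the model in the request body, and o3-pro is served through the newer /v1/responses endpoint rather than /chat/completions. A minimal sketch of the two URL shapes (resource, deployment, and api-version values are placeholders, not a real configuration):

```python
# Sketch, not Open WebUI code: contrasts the two backends' URL shapes.
# Azure OpenAI scopes requests to a deployment in the path; the OpenAI
# platform takes the model name in the JSON body instead.

def azure_chat_completions_url(resource: str, deployment: str, api_version: str) -> str:
    """Build the deployment-scoped Azure OpenAI Chat Completions URL."""
    return (
        f"https://{resource}.openai.azure.com/openai/deployments/"
        f"{deployment}/chat/completions?api-version={api_version}"
    )

def openai_responses_url() -> str:
    """OpenAI platform Responses API endpoint; the model is sent in the body."""
    return "https://api.openai.com/v1/responses"

if __name__ == "__main__":
    # Placeholder values for illustration only.
    print(azure_chat_completions_url("my-resource", "gpt-4o", "2024-06-01"))
    print(openai_responses_url())
```

Because the model identity is baked into the Azure URL via the deployment name, simply typing "o3-pro" as a model name cannot work until a matching Azure deployment exists.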

Why this matters:

  • Users expect the same behavior from gpt-4o on Azure as from api.openai.com.
  • o3-pro adds enhanced tool support and function-call reliability, which is important for Open WebUI workflows.
  • If/when Azure exposes o3-pro, Open WebUI should allow specifying this model version (possibly via deployment name or tag).
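The deployment-name idea could be a small mapping layer between logical model names and Azure deployments. A hypothetical sketch (the `AZURE_DEPLOYMENTS` table and all deployment names are illustrative placeholders, not Open WebUI configuration):

```python
# Hypothetical model-to-deployment mapping. Deployment names are
# placeholders; "o3-pro" is commented out because, per this issue,
# Azure has not exposed it yet.

AZURE_DEPLOYMENTS = {
    "gpt-4o": "gpt-4o-may-2024",
    # "o3-pro": "o3-pro-deployment",  # enable once/if Microsoft ships it
}

def resolve_deployment(model: str) -> str:
    """Map a logical model name to the Azure deployment that serves it."""
    try:
        return AZURE_DEPLOYMENTS[model]
    except KeyError:
        raise ValueError(f"No Azure deployment configured for {model!r}")
```

With such a table, adding o3-pro later would be a one-line configuration change rather than a code change.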

Desired Solution you'd like

Suggestions:

  • Add a warning or note in the Azure OpenAI setup if a user tries to use gpt-4o expecting o3-pro capabilities.
  • Track the availability of o3-pro in Azure and document the required deployment setup once it is released.

Alternatives Considered

No response

Additional Context

No response

Reference: github-starred/open-webui#5732