Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-07 11:28:35 -05:00)
[PR #21685] [CLOSED] feat: add native Anthropic provider integration #65059
📋 Pull Request Information
Original PR: https://github.com/open-webui/open-webui/pull/21685
Author: @MaderHatt3r
Created: 2/21/2026
Status: ❌ Closed
Base: `dev` ← Head: `feat/native-anthropic-provider`

📝 Commits (2)

- `740badf` feat: add native Anthropic provider integration
- `549a8ae` style: run prettier and black formatters

📊 Changes
11 files changed (+1596 additions, -13 deletions)
- 📝 `backend/open_webui/config.py` (+42 -0)
- 📝 `backend/open_webui/main.py` (+20 -1)
- ➕ `backend/open_webui/routers/anthropic.py` (+469 -0)
- ➕ `backend/open_webui/utils/anthropic_payload.py` (+375 -0)
- ➕ `backend/open_webui/utils/anthropic_response.py` (+326 -0)
- 📝 `backend/open_webui/utils/chat.py` (+12 -0)
- 📝 `backend/open_webui/utils/models.py` (+14 -4)
- ➕ `src/lib/apis/anthropic/index.ts` (+107 -0)
- 📝 `src/lib/components/admin/Settings/Connections.svelte` (+144 -8)
- ➕ `src/lib/components/admin/Settings/Connections/AnthropicConnection.svelte` (+86 -0)
- 📝 `src/lib/constants.ts` (+1 -0)

📄 Description
Adds a native Anthropic (Claude) API integration that bypasses the OpenAI-compatible layer, giving direct access to Anthropic-specific features: extended thinking, native streaming events, native tool use, and accurate token usage reporting.
Backend: router, payload converter, response/stream converter, SDK-based client. Frontend: admin connection UI, API client, connection component.
Pull Request Checklist
Note to first-time contributors: Please open a discussion post in Discussions to discuss your idea/fix with the community before creating a pull request, and describe your changes before submitting a pull request.
This is to ensure large feature PRs are discussed with the community first, before starting work on it. If the community does not want this feature or it is not relevant for Open WebUI as a project, it can be identified in the discussion before working on the feature and submitting the PR.
Before submitting, make sure you've checked the following:
- Target the `dev` branch. PRs targeting `main` will be immediately closed.
- Base your branch on `dev` to ensure no unrelated commits (e.g. from `main`) are included.
- Push updates to the existing PR branch instead of closing and reopening.

Changelog Entry
Description
Adds a native Anthropic (Claude) provider integration as a first-class connection type alongside OpenAI and Ollama. Rather than routing Claude requests through the OpenAI-compatible shim, this integration communicates directly with the Anthropic Messages API via the official `anthropic` Python SDK, enabling access to Anthropic-specific features that are lost in translation through the OpenAI compatibility layer.

The integration follows the same architectural patterns as the existing OpenAI and Ollama providers: a dedicated FastAPI router handles configuration, model discovery, and chat completions; payload and response converters translate between Open WebUI's internal OpenAI-compatible format and Anthropic's native format; and the frontend provides an admin connection UI consistent with the existing connection management experience.
Motivation: Users connecting to Anthropic models through OpenAI-compatible proxies lose access to features like extended thinking (reasoning blocks), accurate token usage reporting, and native tool use. This integration provides direct access to those capabilities while maintaining full compatibility with Open WebUI's middleware pipeline, model management, and access control systems.
Added

- `backend/open_webui/routers/anthropic.py` (~470 lines) — FastAPI router with endpoints for admin configuration (GET/POST `/anthropic/config`), model discovery (GET `/anthropic/models`), and chat completions (POST `/anthropic/chat/completions`). Supports multiple API keys/URLs with parallel model fetching.
- `backend/open_webui/utils/anthropic_payload.py` (~370 lines) — Converts OpenAI-format chat completion payloads to the Anthropic Messages API format, including system prompt extraction, image content conversion (base64 data URIs and URLs), tool/function definition conversion, tool choice mapping, and assistant message tool call formatting.
- `backend/open_webui/utils/anthropic_response.py` (~330 lines) — Converts Anthropic's event-based SSE streaming protocol (`message_start` → `content_block_start` → `content_block_delta` → `message_delta` → `message_stop`) into OpenAI-compatible SSE format. Handles text content, tool call deltas, reasoning/thinking blocks, and token usage reporting.
- `src/lib/apis/anthropic/index.ts` (~110 lines) — TypeScript client for the Anthropic admin endpoints (get/update config, fetch models).
- `src/lib/components/admin/Settings/Connections/AnthropicConnection.svelte` (~90 lines) — Svelte component for managing individual Anthropic connections with URL display, configuration modal, enable/disable toggle, and delete confirmation.
- New environment variables:
  - `ENABLE_ANTHROPIC_API` — enable/disable the Anthropic provider (default: `true`)
  - `ANTHROPIC_API_KEY` / `ANTHROPIC_API_KEYS` — API key(s), semicolon-separated for multiple connections
  - `ANTHROPIC_API_BASE_URL` / `ANTHROPIC_API_BASE_URLS` — base URL(s) (default: `https://api.anthropic.com`)
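To make the payload converter's job concrete, here is a minimal sketch of the OpenAI-to-Anthropic conversion described above. This is not the PR's code: the function name `openai_to_anthropic` and the 4096 default for `max_tokens` are assumptions, and only system-prompt extraction and image handling are shown.

```python
# Illustrative sketch of the kind of conversion anthropic_payload.py performs.
# NOT the PR's code: names and defaults here are assumptions.
import re


def openai_to_anthropic(payload: dict) -> dict:
    system = None
    messages = []
    for msg in payload.get("messages", []):
        if msg["role"] == "system":
            # Anthropic takes the system prompt as a top-level field, not a message
            system = msg["content"]
            continue
        content = msg["content"]
        if isinstance(content, list):  # multimodal OpenAI-style content parts
            blocks = []
            for part in content:
                if part["type"] == "text":
                    blocks.append({"type": "text", "text": part["text"]})
                elif part["type"] == "image_url":
                    url = part["image_url"]["url"]
                    m = re.match(r"data:(image/[\w.+-]+);base64,(.+)", url)
                    if m:  # base64 data URI -> Anthropic base64 image source
                        blocks.append({
                            "type": "image",
                            "source": {
                                "type": "base64",
                                "media_type": m.group(1),
                                "data": m.group(2),
                            },
                        })
                    else:  # plain URL -> Anthropic URL image source
                        blocks.append({"type": "image", "source": {"type": "url", "url": url}})
            content = blocks
        messages.append({"role": msg["role"], "content": content})
    result = {
        "model": payload["model"],
        "messages": messages,
        # max_tokens is required by the Messages API, unlike OpenAI's API
        "max_tokens": payload.get("max_tokens") or 4096,
    }
    if system is not None:
        result["system"] = system
    return result
```

The real converter additionally handles tool definitions, tool choice, and assistant tool-call messages, which are omitted here for brevity.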
Changed

- `backend/open_webui/utils/chat.py` — Added routing for models with `owned_by == "anthropic"` to the Anthropic chat completion handler, alongside the existing OpenAI and Ollama routes.
- `backend/open_webui/utils/models.py` — Added Anthropic models to the unified model registry fetched by `get_all_base_models()`.
- `backend/open_webui/main.py` — Registered the Anthropic router at the `/anthropic` prefix, added config state initialization, and added middleware support for `x-api-key` header authentication (Anthropic Messages API compatibility).
- `src/lib/constants.ts` — Added the `ANTHROPIC_API_BASE_URL` constant.
- `src/lib/components/admin/Settings/Connections.svelte` — Added an Anthropic configuration section with state management, handlers, and UI.
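The `owned_by`-based dispatch in `utils/chat.py` can be sketched as follows. The stub handlers and exact function names are stand-ins, not the PR's code:

```python
# Hypothetical sketch of owned_by-based routing; handler bodies are stubs.
import asyncio


async def generate_openai_chat_completion(payload: dict) -> str:
    return "openai"  # stand-in for the existing OpenAI handler


async def generate_ollama_chat_completion(payload: dict) -> str:
    return "ollama"  # stand-in for the existing Ollama handler


async def generate_anthropic_chat_completion(payload: dict) -> str:
    return "anthropic"  # stand-in for the new Anthropic handler


async def generate_chat_completion(model: dict, payload: dict) -> str:
    # Route on the model's owner; Anthropic joins the existing routes
    if model.get("owned_by") == "anthropic":
        return await generate_anthropic_chat_completion(payload)
    if model.get("owned_by") == "ollama":
        return await generate_ollama_chat_completion(payload)
    return await generate_openai_chat_completion(payload)
```

Because the dispatch keys off the model registry's `owned_by` field, no caller above this function needs to know which provider serves a given model.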
Deprecated

Removed
Fixed
Security
- API keys are stored via `PersistentConfig` (the same mechanism as OpenAI keys) and are never exposed to non-admin users.
- Model access is governed by the existing `AccessGrants` system.

Breaking Changes
Additional Information
Architecture
The integration follows a three-layer conversion pattern:
1. `anthropic_payload.py` converts the OpenAI payload to Anthropic's Messages API format — extracting system prompts, converting content blocks, mapping tool definitions, and handling parameter differences.
2. The `AsyncAnthropic` client sends the converted request to the Messages API.
3. `anthropic_response.py` converts the Anthropic streaming response back to OpenAI-compatible SSE events, which flow through the existing middleware pipeline for post-processing (usage extraction, reasoning tag detection, etc.).

This approach means all of Open WebUI's existing features (chat controls, system prompts, tool calling, file attachments, access control, usage tracking) work with Anthropic models without modification.
Per-Connection Configuration
Each Anthropic connection supports:
- `enable` — toggle the connection on or off
- `model_ids` — restrict the connection to specific model IDs
- `prefix_id` — namespace prefix applied to model IDs (e.g. `work.claude-sonnet-4-20250514`)
- `tags` — tags attached to the connection's models
- `connection_type` — `"external"` or `"internal"`
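How these fields might be applied when building the model list can be sketched as follows; the function and the exact field semantics are assumptions inferred from the field names, not the PR's code:

```python
# Hypothetical application of per-connection config to a fetched model list.
def apply_connection_config(models: list[dict], conn: dict) -> list[dict]:
    if not conn.get("enable", True):
        return []  # a disabled connection contributes no models
    out = []
    for m in models:
        if conn.get("model_ids") and m["id"] not in conn["model_ids"]:
            continue  # connection limited to an explicit allow-list
        # prefix_id namespaces the model ID, e.g. "work.claude-sonnet-4-20250514"
        mid = f"{conn['prefix_id']}.{m['id']}" if conn.get("prefix_id") else m["id"]
        out.append({**m, "id": mid, "tags": conn.get("tags", [])})
    return out


models = [{"id": "claude-sonnet-4-20250514"}]
print(apply_connection_config(models, {"prefix_id": "work"})[0]["id"])
# work.claude-sonnet-4-20250514
```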
Dependencies

- `anthropic` Python SDK (latest) — used for the `AsyncAnthropic` client, `models.list()` for discovery, and `messages.create()` for chat completions. The SDK handles authentication, retries, and Anthropic-specific error types.
Model Discovery

Models are fetched via the SDK's `models.list()` API with automatic pagination. If the call fails (e.g., the endpoint doesn't support model listing), the router falls back to a hardcoded list of known models (Claude Opus 4, Sonnet 4, Sonnet 4.5, Haiku 3.5) with accurate context window and max output token metadata.
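The fetch-with-fallback pattern is simple enough to sketch without the SDK. Here `fetch_models` stands in for the `models.list()` call, and the fallback entries are illustrative examples rather than the PR's exact hardcoded list:

```python
# Sketch of live model discovery with a static fallback. The fetcher is a
# stand-in for the anthropic SDK's models.list(); fallback IDs are examples.
import asyncio

FALLBACK_MODELS = [
    {"id": "claude-opus-4-20250514", "context_length": 200_000},
    {"id": "claude-sonnet-4-20250514", "context_length": 200_000},
]


async def list_models(fetch_models) -> list[dict]:
    try:
        return await fetch_models()  # live discovery via the API
    except Exception:
        return FALLBACK_MODELS  # listing unsupported or endpoint unreachable


async def broken_fetch():
    raise RuntimeError("model listing not supported by this endpoint")
```

Falling back rather than failing keeps connections to Anthropic-compatible proxies usable even when they don't implement the model-listing endpoint.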
Error Handling

All Anthropic SDK exceptions are caught and mapped to appropriate HTTP status codes:
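One way such a mapping can look is sketched below. The stub classes mirror the SDK's exception hierarchy (`APITimeoutError` is a connection error; `AuthenticationError` and friends are status errors carrying a `status_code`), but the specific status choices here are assumptions, not taken from the PR:

```python
# Illustrative exception-to-HTTP mapping. Stub classes stand in for the
# anthropic SDK's exception types; status choices here are assumptions.
class APIConnectionError(Exception):
    pass


class APITimeoutError(APIConnectionError):
    pass


class APIStatusError(Exception):
    def __init__(self, status_code: int):
        self.status_code = status_code


class AuthenticationError(APIStatusError):
    pass


class RateLimitError(APIStatusError):
    pass


def to_http_status(exc: Exception) -> int:
    if isinstance(exc, APITimeoutError):
        return 504  # upstream timed out
    if isinstance(exc, APIConnectionError):
        return 502  # could not reach the Anthropic API
    if isinstance(exc, APIStatusError):
        return exc.status_code  # 401 / 403 / 404 / 429 pass straight through
    return 500  # anything unexpected
```

Checking `APITimeoutError` before its parent `APIConnectionError` matters: the subclass would otherwise be swallowed by the broader connection-error branch.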
- `AuthenticationError`
- `PermissionDeniedError`
- `NotFoundError`
- `RateLimitError`
- `APIStatusError`
- `APIConnectionError`
- `APITimeoutError`

Testing
Manually tested with:
Screenshots or Videos

- Admin Connections page showing the Anthropic section
- Model selector showing Anthropic models with prefix/tags
- Chat conversation with token usage tooltip
Contributor License Agreement
By submitting this pull request, I confirm that I have read and fully agree to the Contributor License Agreement (CLA), and I am providing my contributions under its terms.
🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.