[PR #21685] [CLOSED] feat: add native Anthropic provider integration #49251

Closed
opened 2026-04-30 01:34:28 -05:00 by GiteaMirror · 0 comments

📋 Pull Request Information

Original PR: https://github.com/open-webui/open-webui/pull/21685
Author: @MaderHatt3r
Created: 2/21/2026
Status: Closed

Base: dev ← Head: feat/native-anthropic-provider


📝 Commits (2)

  • 740badf feat: add native Anthropic provider integration
  • 549a8ae style: run prettier and black formatters

📊 Changes

11 files changed (+1596 additions, -13 deletions)

View changed files

📝 backend/open_webui/config.py (+42 -0)
📝 backend/open_webui/main.py (+20 -1)
backend/open_webui/routers/anthropic.py (+469 -0)
backend/open_webui/utils/anthropic_payload.py (+375 -0)
backend/open_webui/utils/anthropic_response.py (+326 -0)
📝 backend/open_webui/utils/chat.py (+12 -0)
📝 backend/open_webui/utils/models.py (+14 -4)
src/lib/apis/anthropic/index.ts (+107 -0)
📝 src/lib/components/admin/Settings/Connections.svelte (+144 -8)
src/lib/components/admin/Settings/Connections/AnthropicConnection.svelte (+86 -0)
📝 src/lib/constants.ts (+1 -0)

📄 Description

Add a native Anthropic (Claude) API integration that bypasses the OpenAI-compatible layer, giving direct access to Anthropic-specific features: extended thinking, native streaming events, native tool use, and accurate token usage reporting.

Backend: router, payload converter, response/stream converter, SDK-based client. Frontend: admin connection UI, API client, connection component.

Pull Request Checklist

Note to first-time contributors: Please open a discussion post in Discussions to discuss your idea/fix with the community before creating a pull request, and describe your changes before submitting a pull request.

This is to ensure large feature PRs are discussed with the community first, before starting work on it. If the community does not want this feature or it is not relevant for Open WebUI as a project, it can be identified in the discussion before working on the feature and submitting the PR.

Before submitting, make sure you've checked the following:

  • Target branch: Verify that the pull request targets the dev branch. PRs targeting main will be immediately closed.
  • Description: Provide a concise description of the changes made in this pull request down below.
  • Changelog: Ensure a changelog entry following the format of Keep a Changelog is added at the bottom of the PR description.
  • Documentation: Add docs in Open WebUI Docs Repository. Document user-facing behavior, environment variables, public APIs/interfaces, or deployment steps.
    • Docs PR: https://github.com/open-webui/docs/pull/1096
  • Dependencies: Are there any new or upgraded dependencies? If so, explain why, update the changelog/docs, and include any compatibility notes. Actually run the code/function that uses updated library to ensure it doesn't crash.
  • Testing: Perform manual tests to verify the implemented fix/feature works as intended AND does not break any other functionality. Include reproducible steps to demonstrate the issue before the fix. Test edge cases (URL encoding, HTML entities, types). Take this as an opportunity to make screenshots of the feature/fix and include them in the PR description.
  • Agentic AI Code: Confirm this Pull Request is not written by any AI Agent or has at least gone through additional human review AND manual testing. If any AI Agent is the co-author of this PR, it may lead to immediate closure of the PR.
    • Generative AI acknowledgement: I heavily use Cursor to understand the existing architecture, design, and implementation. I recognize that I am responsible for code quality, reviews, and testing. I am the final author of my code.
  • Code review: Have you performed a self-review of your code, addressing any coding standard issues and ensuring adherence to the project's coding standards?
  • Design & Architecture: Prefer smart defaults over adding new settings; use local state for ephemeral UI logic. Open a Discussion for major architectural or UX changes.
  • Git Hygiene: Keep PRs atomic (one logical change). Clean up commits and rebase on dev to ensure no unrelated commits (e.g. from main) are included. Push updates to the existing PR branch instead of closing and reopening.
  • Title Prefix: To clearly categorize this pull request, prefix the pull request title using one of the following:
    • BREAKING CHANGE: Significant changes that may affect compatibility
    • build: Changes that affect the build system or external dependencies
    • ci: Changes to our continuous integration processes or workflows
    • chore: Refactor, cleanup, or other non-functional code changes
    • docs: Documentation update or addition
    • feat: Introduces a new feature or enhancement to the codebase
    • fix: Bug fix or error correction
    • i18n: Internationalization or localization changes
    • perf: Performance improvement
    • refactor: Code restructuring for better maintainability, readability, or scalability
    • style: Changes that do not affect the meaning of the code (white space, formatting, missing semi-colons, etc.)
    • test: Adding missing tests or correcting existing tests
    • WIP: Work in progress, a temporary label for incomplete or ongoing work

Changelog Entry

Description

Adds a native Anthropic (Claude) provider integration as a first-class connection type alongside OpenAI and Ollama. Rather than routing Claude requests through the OpenAI-compatible shim, this integration communicates directly with the Anthropic Messages API via the official anthropic Python SDK, enabling access to Anthropic-specific features that are lost in translation through the OpenAI compatibility layer.

The integration follows the same architectural patterns as the existing OpenAI and Ollama providers: a dedicated FastAPI router handles configuration, model discovery, and chat completions; payload and response converters translate between Open WebUI's internal OpenAI-compatible format and Anthropic's native format; and the frontend provides an admin connection UI consistent with the existing connection management experience.

Motivation: Users connecting to Anthropic models through OpenAI-compatible proxies lose access to features like extended thinking (reasoning blocks), accurate token usage reporting, and native tool use. This integration provides direct access to those capabilities while maintaining full compatibility with Open WebUI's middleware pipeline, model management, and access control systems.

Added

  • Native Anthropic router (backend/open_webui/routers/anthropic.py, ~470 lines) — FastAPI router with endpoints for admin configuration (GET/POST /anthropic/config), model discovery (GET /anthropic/models), and chat completions (POST /anthropic/chat/completions). Supports multiple API keys/URLs with parallel model fetching.
  • Payload converter (backend/open_webui/utils/anthropic_payload.py, ~370 lines) — Converts OpenAI-format chat completion payloads to Anthropic Messages API format, including system prompt extraction, image content conversion (base64 data URIs and URLs), tool/function definition conversion, tool choice mapping, and assistant message tool call formatting.
  • Response/stream converter (backend/open_webui/utils/anthropic_response.py, ~330 lines) — Converts Anthropic's event-based SSE streaming protocol (message_start → content_block_start → content_block_delta → message_delta → message_stop) into OpenAI-compatible SSE format. Handles text content, tool call deltas, reasoning/thinking blocks, and token usage reporting.
  • Frontend API client (src/lib/apis/anthropic/index.ts, ~110 lines) — TypeScript client for Anthropic admin endpoints (get/update config, fetch models).
  • Connection component (src/lib/components/admin/Settings/Connections/AnthropicConnection.svelte, ~90 lines) — Svelte component for managing individual Anthropic connections with URL display, configuration modal, enable/disable toggle, and delete confirmation.
  • Admin UI — Anthropic section in the Connections settings page with enable/disable toggle, connection list, and add-connection modal. Follows the same UX patterns as the existing OpenAI connection management.
  • Environment variables for configuration:
    • ENABLE_ANTHROPIC_API — Enable/disable the Anthropic provider (default: true)
    • ANTHROPIC_API_KEY / ANTHROPIC_API_KEYS — API key(s), semicolon-separated for multiple connections
    • ANTHROPIC_API_BASE_URL / ANTHROPIC_API_BASE_URLS — Base URL(s) (default: https://api.anthropic.com)
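
The semicolon-separated multi-connection variables imply pairing each key with a base URL. A minimal sketch of that parsing, under the assumption that missing URLs fall back to the default endpoint (`parse_connections` is a hypothetical helper, not the PR's actual code):

```python
import os

def parse_connections(default_url: str = "https://api.anthropic.com"):
    # Hypothetical helper: pair semicolon-separated API keys with base URLs,
    # padding with the default URL when fewer URLs than keys are given.
    keys = [k for k in os.environ.get("ANTHROPIC_API_KEYS", "").split(";") if k]
    urls = [u for u in os.environ.get("ANTHROPIC_API_BASE_URLS", "").split(";") if u]
    urls += [default_url] * (len(keys) - len(urls))
    return list(zip(keys, urls))

os.environ["ANTHROPIC_API_KEYS"] = "sk-ant-one;sk-ant-two"
os.environ["ANTHROPIC_API_BASE_URLS"] = "https://api.anthropic.com"
print(parse_connections())
# → [('sk-ant-one', 'https://api.anthropic.com'), ('sk-ant-two', 'https://api.anthropic.com')]
```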

Changed

  • Model routing (backend/open_webui/utils/chat.py) — Added routing for models with owned_by == "anthropic" to the Anthropic chat completion handler, alongside existing OpenAI and Ollama routes.
  • Model discovery (backend/open_webui/utils/models.py) — Added Anthropic models to the unified model registry fetched by get_all_base_models().
  • Main application (backend/open_webui/main.py) — Registered Anthropic router at /anthropic prefix, added config state initialization, and added middleware support for x-api-key header authentication (Anthropic Messages API compatibility).
  • Frontend constants (src/lib/constants.ts) — Added ANTHROPIC_API_BASE_URL constant.
  • Admin Connections UI (src/lib/components/admin/Settings/Connections.svelte) — Added Anthropic configuration section with state management, handlers, and UI.
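
The owned_by dispatch added to chat.py can be pictured as a lookup on the model's owner. A toy sketch (the handler names here are placeholders, not the PR's real functions):

```python
def pick_handler(model: dict) -> str:
    # Toy dispatch mirroring the owned_by routing described above;
    # handler names are illustrative placeholders.
    return {
        "anthropic": "generate_anthropic_chat_completion",
        "ollama": "generate_ollama_chat_completion",
    }.get(model.get("owned_by"), "generate_openai_chat_completion")

print(pick_handler({"owned_by": "anthropic"}))
# → generate_anthropic_chat_completion
```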

Deprecated

  • N/A

Removed

  • N/A

Fixed

  • N/A

Security

  • API keys are stored using PersistentConfig (same mechanism as OpenAI keys) and are never exposed to non-admin users.
  • Model access control is enforced through Open WebUI's existing AccessGrants system.

Breaking Changes

  • None. This is a purely additive change. Existing OpenAI, Ollama, and other provider connections are unaffected.

Additional Information

Architecture

The integration follows a three-layer conversion pattern:

  1. Inbound: Open WebUI's middleware processes the request normally (system prompts, tool injection, file handling, etc.) and produces an OpenAI-format payload.
  2. Conversion: anthropic_payload.py converts the OpenAI payload to Anthropic's Messages API format — extracting system prompts, converting content blocks, mapping tool definitions, and handling parameter differences.
  3. Outbound: anthropic_response.py converts the Anthropic streaming response back to OpenAI-compatible SSE events, which flow through the existing middleware pipeline for post-processing (usage extraction, reasoning tag detection, etc.).

This approach means all of Open WebUI's existing features (chat controls, system prompts, tool calling, file attachments, access control, usage tracking) work with Anthropic models without modification.
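
As a rough illustration of step 2, here is a stripped-down converter assuming text-only messages (the PR's anthropic_payload.py additionally handles images, tools, and tool calls; this sketch is not its actual code):

```python
def to_anthropic_payload(openai_payload: dict) -> dict:
    # Sketch only: text-only messages, no tools or images.
    messages = openai_payload["messages"]
    # Anthropic's Messages API takes the system prompt as a top-level
    # field rather than as a message with role "system".
    system = "\n".join(m["content"] for m in messages if m["role"] == "system")
    payload = {
        "model": openai_payload["model"],
        "messages": [m for m in messages if m["role"] in ("user", "assistant")],
        # max_tokens is required by the Messages API, unlike OpenAI's API.
        "max_tokens": openai_payload.get("max_tokens", 1024),
    }
    if system:
        payload["system"] = system
    return payload

result = to_anthropic_payload({
    "model": "claude-sonnet-4-20250514",
    "messages": [
        {"role": "system", "content": "Be terse."},
        {"role": "user", "content": "Hi"},
    ],
})
print(result["system"])  # → Be terse.
```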

Per-Connection Configuration

Each Anthropic connection supports:

  • enable: Enable/disable the connection
  • model_ids: Restrict which models are exposed (empty = all)
  • prefix_id: Custom prefix for model IDs and names (e.g., work.claude-sonnet-4-20250514)
  • tags: Custom tags shown in the model selector
  • connection_type: Mark the connection as "external" or "internal"
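
Concretely, a connection using these options might be stored as a dict like the following (field names taken from the list above; the exact stored shape in the PR may differ):

```python
# Illustrative per-connection configuration; not the PR's actual schema.
connection = {
    "enable": True,
    "model_ids": [],                  # empty list = expose all discovered models
    "prefix_id": "work",              # yields IDs like "work.claude-sonnet-4-20250514"
    "tags": ["anthropic", "claude"],  # shown in the model selector
    "connection_type": "external",
}
print(f'{connection["prefix_id"]}.claude-sonnet-4-20250514')
# → work.claude-sonnet-4-20250514
```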

Dependencies

  • anthropic Python SDK (latest) — Used for AsyncAnthropic client, models.list() for discovery, and messages.create() for chat completions. The SDK handles authentication, retries, and Anthropic-specific error types.

Model Discovery

Models are fetched via the SDK's models.list() API with automatic pagination. If the API call fails (e.g., the endpoint doesn't support model listing), the router falls back to a hardcoded list of known models (Claude Opus 4, Sonnet 4, Sonnet 4.5, Haiku 3.5) with accurate context window and max output token metadata.
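
The fallback behavior can be sketched as follows, where `fetch` stands in for a wrapper around the SDK's models.list() call and the fallback entries are illustrative rather than the PR's actual metadata:

```python
FALLBACK_MODELS = [
    # Illustrative entries only; the PR ships its own hardcoded list.
    {"id": "claude-sonnet-4-20250514", "context_length": 200000},
    {"id": "claude-3-5-haiku-20241022", "context_length": 200000},
]

def discover_models(fetch):
    # fetch: hypothetical callable wrapping the SDK's models.list().
    try:
        return fetch()
    except Exception:
        # The endpoint may not support model listing (e.g. behind a
        # proxy); fall back to the hardcoded list of known models.
        return FALLBACK_MODELS

def failing_fetch():
    raise RuntimeError("listing not supported")

print(len(discover_models(failing_fetch)))  # → 2
```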

Error Handling

All Anthropic SDK exceptions are caught and mapped to appropriate HTTP status codes:

  • AuthenticationError: 401
  • PermissionDeniedError: 403
  • NotFoundError: 404
  • RateLimitError: 429
  • APIStatusError: Passthrough
  • APIConnectionError: 502
  • APITimeoutError: 504
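
One way to implement this mapping without a long try/except chain is a name-keyed table, with APIStatusError's own status code passed through. A sketch under that assumption (not the PR's actual code; the stand-in exception class below is local, not imported from the SDK):

```python
STATUS_MAP = {
    "AuthenticationError": 401,
    "PermissionDeniedError": 403,
    "NotFoundError": 404,
    "RateLimitError": 429,
    "APIConnectionError": 502,
    "APITimeoutError": 504,
}

def http_status_for(exc: Exception) -> int:
    # APIStatusError instances carry a status_code attribute and are
    # passed through; everything else maps by class name, with 500 as
    # a conservative default.
    status = getattr(exc, "status_code", None)
    if isinstance(status, int):
        return status
    return STATUS_MAP.get(type(exc).__name__, 500)

class RateLimitError(Exception):  # local stand-in for anthropic.RateLimitError
    pass

print(http_status_for(RateLimitError()))  # → 429
```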

Testing

Manually tested with:

  • Single and multi-turn conversations with Claude Sonnet 4
  • Streaming and non-streaming responses
  • System prompt propagation from chat controls
  • Token usage display via the existing info tooltip
  • Model discovery with prefix ID and tagging
  • Multiple connections with different configurations
  • Admin UI: add, configure, enable/disable, and delete connections

Screenshots or Videos

Admin Connections page showing Anthropic section

[screenshots]

Model selector showing Anthropic models with prefix/tags

[screenshot]

Chat conversation with token usage tooltip

[screenshot]

Contributor License Agreement

By submitting this pull request, I confirm that I have read and fully agree to the Contributor License Agreement (CLA), and I am providing my contributions under its terms.

Note

Deleting the CLA section will lead to immediate closure of your PR and it will not be merged in.


🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.
