feat: Add support for Anthropic API (Claude etc) #1315
Originally created by @moodler on GitHub (Jun 19, 2024).
I often use Claude from Anthropic for some use cases, as it has some advantages. But I'd love to do it in Open WebUI so I can keep all my chat records in one place.
I'd like to propose support for the Anthropic API in the same way that OpenAI API is supported. The API is extremely similar: https://docs.anthropic.com/en/api/getting-started
And really, if we do this, why stop there? There could easily be a way for admins to define new external services that work in similar ways. Perhaps these definitions could be shareable as JSON files.
@justinh-rahb commented on GitHub (Jun 19, 2024):
Supported natively via Functions: https://openwebui.com/f/justinrahb/anthropic/
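For anyone new to Functions: an Open WebUI manifold function is a small Python class pasted into the Workspace → Functions page. Below is a stripped-down sketch of the general shape only; the model IDs, field names, and request handling here are illustrative assumptions, and the linked function is the real, maintained implementation.

```python
from pydantic import BaseModel, Field
import requests


class Pipe:
    class Valves(BaseModel):
        # "Valves" are the per-function settings shown in the admin UI;
        # this is where the Anthropic API key is entered.
        ANTHROPIC_API_KEY: str = Field(default="")

    def __init__(self):
        self.type = "manifold"  # one function exposing several models
        self.valves = self.Valves()

    def pipes(self) -> list:
        # models that appear in Open WebUI's model picker (IDs are examples)
        return [{"id": "claude-3-5-sonnet-20241022", "name": "claude-3.5-sonnet"}]

    def pipe(self, body: dict):
        # Open WebUI prefixes the model with the function id ("funcid.model"),
        # so strip that prefix before calling Anthropic's Messages API.
        model = body["model"].split(".", 1)[-1]
        payload = {
            "model": model,
            "max_tokens": body.get("max_tokens", 4096),
            # the Messages API takes the system prompt as a top-level field,
            # so keep only user/assistant turns here (simplified)
            "messages": [m for m in body["messages"] if m["role"] != "system"],
        }
        headers = {
            "x-api-key": self.valves.ANTHROPIC_API_KEY,
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        }
        r = requests.post("https://api.anthropic.com/v1/messages",
                          json=payload, headers=headers, timeout=60)
        r.raise_for_status()
        return r.json()["content"][0]["text"]
```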
@zaptrem commented on GitHub (Jun 27, 2024):
The process for actually setting this up is very unclear and unwieldy. For anyone else who doesn't want to lose the time I lost:
@justinh-rahb commented on GitHub (Jun 27, 2024):
Docs PRs always welcomed 👍
@moodler commented on GitHub (Jul 5, 2024):
Thanks to @justinh-rahb for the pipelines project (though it feels like overkill making this a whole separate service). Thanks @zaptrem for the extra guidance.
I had all this working for a while, but then it just broke ... the containers all seem fine, but I'm getting random errors like:
What are Valves? No idea. Will keep playing and trying to make this work ...
@moblangeois commented on GitHub (Jul 5, 2024):
You can now integrate the Anthropic manifold through Functions without installing Pipelines: https://openwebui.com/f/justinrahb/anthropic/
Works well for me.
@moodler commented on GitHub (Jul 10, 2024):
Fantastic, that is exactly what I'd asked for in the original post. Confirmed it works great!
Full install instructions:
@justinh-rahb commented on GitHub (Jul 10, 2024):
Glad that I could be of assistance 🫡
@Piste commented on GitHub (Jul 14, 2024):
Thank you, everyone! Amazing. The function works really well.
@theultimatetestings commented on GitHub (Jul 26, 2024):
Is anyone else encountering a network error 400 stating you have an insufficient credit balance? I thought Claude 3.5 Sonnet was free to use.
@justinh-rahb commented on GitHub (Jul 26, 2024):
Their chat app at claude.ai has a limited free tier, the API does not.
@ddobrinskiy commented on GitHub (Aug 7, 2024):
Worked like a charm, thanks everyone!
For those confused about where to set the API key exactly, it's here: http://localhost:3000/workspace/functions
(took me some time to find where exactly the Anthropic API key should go)
@zuli12-dev commented on GitHub (Aug 13, 2024):
Sad we can't do this via Connections; I'd love to have a connection rather than an installed function :(
@darkvertex commented on GitHub (Aug 19, 2024):
FYI you can configure litellm as a proxy on the side and use that. It presents as an OpenAI-style API, so you just use it instead of the OpenAI connection in OpenWebUI, and it exposes all the providers you could ever want, all at once.
@darkvertex commented on GitHub (Aug 21, 2024):
@justinh-rahb hey, I'm on the newest stable OpenWebUI (v0.3.14) and now suddenly your Anthropic manifold pipe script is throwing errors:
I think maybe there was a breaking API change?
@darkvertex commented on GitHub (Aug 21, 2024):
Seems it's not the only pipe that broke: https://github.com/open-webui/open-webui/issues/4791
@justinh-rahb commented on GitHub (Aug 21, 2024):
@darkvertex investigating.
@sanjaymaniam commented on GitHub (Sep 4, 2024):
Title generation does not work when Anthropic models are installed as a function. Are there any known ways to make this work?
@justinh-rahb commented on GitHub (Sep 4, 2024):
@sanjaymaniam works fine on latest version for me.
@nnnnicholas commented on GitHub (Sep 6, 2024):
Would it be possible to add support for the Claude API in the Connections menu? The above discussion suggests that the Pipes solution is fragile. I'm not crazy about handing my API key directly to a third-party extension just to make basic Claude requests. It's a lot of friction as a new user of OpenWebUI. I would like to migrate from LibreChat but have tried and given up a few times now.
Thanks for building OWUI... very excited to eventually get it up and running!
@heltonteixeira commented on GitHub (Sep 22, 2024):
I'm running it in my local environment, but when I try to install it using the function method, this error shows:
Has anyone been able to make it work on the latest version?
@sanebg commented on GitHub (Sep 26, 2024):
Gosh, why not just add the ability to enter the Claude API endpoint via the Connections menu? If I were able to install functions and scripts and whatnot, I wouldn't need a WebUI chat in the first place; I was going to use Ollama. I thought the idea was a nice USER EXPERIENCE.
@moblangeois commented on GitHub (Sep 26, 2024):
Thanks for the feedback – truly inspiring.
@FedeCuci commented on GitHub (Oct 3, 2024):
@darkvertex Can you expand on the litellm proxy? I tried setting it up to include Anthropic support but am having issues.
@eugrus commented on GitHub (Nov 11, 2024):
Is out-of-the-box support planned, though?
@MarioIshac commented on GitHub (Dec 6, 2024):
+1 for native support. It would be great to have it as easy to set up as OpenAI in the Connections menu.
@darkvertex commented on GitHub (Dec 11, 2024):
@FedeCuci Ok, so...
Make a config.yml for litellm with the OpenAI and Claude models you wish to expose. It might look something like the sketch below (I'm using env vars to avoid hardcoding any API keys in the config).
Then you can run the ghcr.io/berriai/litellm:main-latest container with said file, and go check http://localhost:4000 to see if LiteLLM's API is alive.
If it's up, in theory you can go edit your OpenWebUI Connections OpenAI API URL to point to your host instead of https://api.openai.com, and then it should work. ✨
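A rough sketch of what that config and container invocation might look like; the model names, env-var names, and mount paths are illustrative, so check the LiteLLM docs for the current schema:

```yaml
# config.yml -- example LiteLLM proxy config (model IDs are illustrative)
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude-3-5-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY
```

```sh
# Run the proxy with the config mounted; the env vars supply the API keys.
docker run -d -p 4000:4000 \
  -e OPENAI_API_KEY -e ANTHROPIC_API_KEY \
  -v $(pwd)/config.yml:/app/config.yaml \
  ghcr.io/berriai/litellm:main-latest --config /app/config.yaml
```

In Connections, the OpenAI API URL would then be something like http://<your-host>:4000/v1.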
@pavoltravnik commented on GitHub (Feb 15, 2025):
Thanks, this worked perfectly fine. However, on the latest version (v0.5.10) I cannot see the Anthropic models in the model selector.
@darkBuddha commented on GitHub (Feb 24, 2025):
Why is it not supported?
@sarzixon commented on GitHub (Feb 25, 2025):
This works perfectly, thanks!
Is the Extended Thinking feature supported?
@arty-hlr commented on GitHub (Feb 25, 2025):
@sarzixon It was added a few hours ago; see the relevant PR (not yet merged), but it's not working perfectly yet.
@i0ntempest commented on GitHub (Mar 4, 2025):
Is there still no native support for the Anthropic API as a connection?
Also, the Anthropic API has an OpenAI compatibility layer, but I cannot get it to work when adding it as a connection.
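For anyone debugging that route, a minimal check of the compatibility layer from outside Open WebUI; the base URL is the one Anthropic documents for its OpenAI SDK compatibility, and the model ID is an example that may need updating:

```python
# Send one chat request to Anthropic's OpenAI-compatible endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.anthropic.com/v1/",
    api_key="sk-ant-...",  # your Anthropic API key
)

resp = client.chat.completions.create(
    model="claude-3-5-sonnet-20241022",
    messages=[{"role": "user", "content": "Say hello"}],
)
print(resp.choices[0].message.content)
```

If this works but the same URL and key fail in the Connections menu, the likely culprit is the model-listing call Open WebUI makes against the connection, which may be why manually specifying the model name (as mentioned later in the thread) helps.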
@ineiti commented on GitHub (Mar 13, 2025):
I have the same problem here - the function is installed, but I cannot choose the models :(
I see some SSL connection errors to localhost:11434; not sure if this has to do with the function I imported or not. And the valves are empty:
@nobretere commented on GitHub (Mar 26, 2025):
Thank You
@eg-mattl commented on GitHub (Jun 12, 2025):
I was able to get it working by manually specifying the model name.
@arty-hlr commented on GitHub (Jun 12, 2025):
@eg-mattl They say it's only for testing and comparing models, though; it won't have the full capabilities of their API. I recommend setting up a litellm proxy instead.
@dom6770 commented on GitHub (Jun 12, 2025):
I mean, to this day the Anthropic function hasn't been updated to include the Claude 4 models, so I am really puzzled why the Anthropic API isn't officially supported through 'Connections' like OpenAI is.
@arty-hlr commented on GitHub (Jun 13, 2025):
@dom6770 Because the open-webui team has made the decision to only support the OpenAI API. I don't trust random user functions that, as you say, are not maintained or updated, hence litellm-proxy.
@miversen33 commented on GitHub (Jun 26, 2025):
@dom6770 You have the code literally right there in your reply. Simply add it to the end of the array in the get_anthropic_models method. That said, it is a strange decision to be so locked to OpenAI here.
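For anyone wanting to try that, a hypothetical sketch of the edit, assuming the function keeps a hard-coded list of model descriptors in get_anthropic_models; the exact structure and current model IDs should be checked against the function's source and Anthropic's docs:

```python
# Hypothetical shape of the hard-coded model list inside the function;
# only the last entry is new, and the IDs are examples, not authoritative.
def get_anthropic_models(self):
    return [
        {"id": "claude-3-5-sonnet-20241022", "name": "claude-3.5-sonnet"},
        {"id": "claude-3-opus-20240229", "name": "claude-3-opus"},
        # appended entry for a Claude 4 model:
        {"id": "claude-sonnet-4-20250514", "name": "claude-sonnet-4"},
    ]
```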
@wkbaran commented on GitHub (Jul 22, 2025):
Claude Code is dominating, yet still no improvement here?
You can't use 'natively' and 'Functions' in the same sentence, especially when the Function isn't maintained by Open WebUI.
@prodigy commented on GitHub (Aug 27, 2025):
In case anyone finds this while trying to use Opus 4.1: you cannot set top_p and temperature in the same request.
I changed the payload generation to this to make it work:
And this is for the model:
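For reference, a sketch of payload generation that respects that constraint; the function and parameter names are assumptions about how the pipe builds its request body, not the actual code from the comment above:

```python
# Illustrative payload builder: send temperature OR top_p, never both,
# since Opus 4.1 rejects requests that set both parameters.
def build_payload(model: str, messages: list, max_tokens: int = 4096,
                  temperature: float | None = None,
                  top_p: float | None = None) -> dict:
    payload = {
        "model": model,
        "messages": messages,
        "max_tokens": max_tokens,
    }
    if temperature is not None:
        payload["temperature"] = temperature
    elif top_p is not None:
        # fall back to top_p only when temperature isn't set
        payload["top_p"] = top_p
    return payload
```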
@AumCoin commented on GitHub (Sep 6, 2025):
I installed the function and set the API key, but I still don't see any Anthropic models listed when I try to set a model. The Pipelines method also does not work for me.
@arty-hlr commented on GitHub (Sep 8, 2025):
@AumCoin Use litellm-proxy instead. User-written functions or pipelines are not reliable.
@dbrans commented on GitHub (Nov 6, 2025):
I'm fine with doing an initial setup to support Claude models, but there should be a safe "update" process for payload changes and new model releases; admins shouldn't need to hand-edit the function each time.
@arty-hlr commented on GitHub (Nov 10, 2025):
@dbrans That's why litellm-proxy is recommended instead of using user-made functions.