feat: Add support for Anthropic API (Claude etc) #1315

Closed
opened 2025-11-11 14:42:27 -06:00 by GiteaMirror · 44 comments

Originally created by @moodler on GitHub (Jun 19, 2024).

I often use Claude from Anthropic for some use cases, as it has some advantages. But I'd love to do it in Open WebUI so I can keep all my chat records in one place.

I'd like to propose support for the Anthropic API in the same way that OpenAI API is supported. The API is extremely similar: https://docs.anthropic.com/en/api/getting-started

And really, if we do this, why stop there, there could easily be a way that admins can define new external services out there that work in similar ways. Perhaps these definitions are shareable as JSON files.


@justinh-rahb commented on GitHub (Jun 19, 2024):

Supported natively via Functions: https://openwebui.com/f/justinrahb/anthropic/


@zaptrem commented on GitHub (Jun 27, 2024):

The process for actually setting this up is very unclear and unwieldy. For anyone else who doesn't want to lose the time I lost:

  1. Set up the main Open-WebUI Docker container following repo instructions (there goes half your RAM)
  2. Set up Pipelines Docker container following the instructions in the readme (there goes the other half of your RAM)
  3. In Open-WebUI click your Profile Picture > Admin Panel > Gear Icon (Admin Settings) > Pipelines > Paste this URL: https://github.com/open-webui/pipelines/blob/main/examples/pipelines/providers/anthropic_manifold_pipeline.py
  4. Enter your Anthropic API key.
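For reference, steps 1 and 2 above can be run from a single Compose file. A minimal sketch, using the image names and ports documented in the two projects' READMEs at the time (adjust tags, ports, and volumes to your setup):

```yaml
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"   # UI served on host port 3000
    volumes:
      - open-webui:/app/backend/data

  pipelines:
    image: ghcr.io/open-webui/pipelines:main
    ports:
      - "9099:9099"   # Pipelines API endpoint
    volumes:
      - pipelines:/app/pipelines

volumes:
  open-webui:
  pipelines:
```

You then point Open WebUI's Pipelines connection at `http://pipelines:9099` (or `http://localhost:9099` outside the Compose network).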

@justinh-rahb commented on GitHub (Jun 27, 2024):

Docs PRs always welcomed 👍


@moodler commented on GitHub (Jul 5, 2024):

Thanks to @justinh-rahb for the pipelines project (though it feels overkill making this a whole separate service). Thanks @zaptrem for the extra guidance.

I had all this working for a while, but then it just broke ... containers all seem fine but I'm getting random errors like:

  • Uh-oh! There was an issue connecting to anthropic/claude-3.5-sonnet. Load failed
  • OpenAI: Could not fetch models from OpenAI, please update the API key in the valves.

What are Valves? No idea. Will keep playing and trying to make this work ...
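(For anyone else wondering: "Valves" are the per-function settings fields that a Function or Pipeline declares; the UI renders them as the function's settings form. In the real functions these are pydantic models; a simplified stdlib-only sketch of the idea, with an illustrative field name:)

```python
from dataclasses import dataclass


class Pipe:
    # In Open WebUI the Valves class is a pydantic model; a plain
    # dataclass is used here so the sketch stays stdlib-only.
    @dataclass
    class Valves:
        # Each field declared here becomes a setting in the function's
        # "Valves" form (the gear icon next to the installed function).
        ANTHROPIC_API_KEY: str = ""

    def __init__(self):
        self.valves = self.Valves()


pipe = Pipe()
print(pipe.valves.ANTHROPIC_API_KEY)  # empty until filled in via the UI
```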


@moblangeois commented on GitHub (Jul 5, 2024):

You can now integrate anthropic manifold through functions without installing Pipelines : https://openwebui.com/f/justinrahb/anthropic/

Works well for me.


@moodler commented on GitHub (Jul 10, 2024):

Fantastic, that is **exactly** what I'd asked for in the original post. Confirmed it works great!

Full install instructions:

  1. Go to https://openwebui.com/f/justinrahb/anthropic/
  2. Press **Get** and specify your OpenWebUI URL to install the script into your Workspace as a Function.
  3. Once installed, click the settings icon for that new function (called Valves for some reason) and enter your Anthropic key.
  4. Enjoy Claude working perfectly without needing an entire new Pipelines docker image.

@justinh-rahb commented on GitHub (Jul 10, 2024):

Glad that I could be of assistance 🫡


@Piste commented on GitHub (Jul 14, 2024):

Thank you, everyone! Amazing. The function works really well.


@theultimatetestings commented on GitHub (Jul 26, 2024):

Is anyone else encountering a network error 400 stating you have insufficient credit balance? I thought Claude 3.5-sonnet was free use.


@justinh-rahb commented on GitHub (Jul 26, 2024):

> Is anyone else encountering a network error 400 stating you have insufficient credit balance? I thought Claude 3.5-sonnet was free use.

Their chat app at claude.ai has a limited free tier, the API does not.


@ddobrinskiy commented on GitHub (Aug 7, 2024):

Worked like a charm, thanks everyone!

For those confused about where to set the API key exactly, it's here: http://localhost:3000/workspace/functions

(took me some time to find where exactly the Anthropic API key should go)

![image](https://github.com/user-attachments/assets/a210386b-7cd2-4d03-913e-5b95b981f815)


@zuli12-dev commented on GitHub (Aug 13, 2024):

sad we cannot use the connections way, i would love to have the connection rather than a function installed :(


@darkvertex commented on GitHub (Aug 19, 2024):

> sad we cannot use the connections way, i would love to have the connection rather than an function installed :(

FYI you can configure [litellm](https://docs.litellm.ai/docs/providers) as a proxy on the side and use that. It presents as an OpenAI-style API and you just use it instead of the OpenAI connection in OpenWebUI, and it exposes _[all the providers](https://docs.litellm.ai/docs/providers)_ you would ever want all at once.
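To illustrate what "presents as an OpenAI-style API" means: any OpenAI-compatible client can talk to the proxy by pointing its base URL at it. A minimal sketch of the request shape, where the proxy URL and model alias are assumptions about your litellm config:

```python
import json
from urllib import request


def chat_request(base_url: str, model: str, prompt: str) -> request.Request:
    # Build a /v1/chat/completions request in the OpenAI wire format;
    # the litellm proxy routes it to Anthropic based on the model name.
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )


# 4000 is litellm's default proxy port; the alias must match your config.yml.
req = chat_request("http://localhost:4000", "claude-3-5-sonnet", "Hello")
print(req.full_url)
```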


@darkvertex commented on GitHub (Aug 21, 2024):

@justinh-rahb hey, I'm on the newest stable OpenWebUI (v0.3.14) and now suddenly your Anthropic manifold pipe script is throwing errors:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 401, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    with collapse_excgroups():
  File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    response = await self.dispatch_func(request, call_next)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/main.py", line 792, in update_embedding_function
    response = await call_next(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 165, in call_next
    raise app_exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    with collapse_excgroups():
  File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    response = await self.dispatch_func(request, call_next)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/main.py", line 783, in check_url
    response = await call_next(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 165, in call_next
    raise app_exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    with collapse_excgroups():
  File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    response = await self.dispatch_func(request, call_next)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/main.py", line 769, in commit_session_after_request
    response = await call_next(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 165, in call_next
    raise app_exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 93, in __call__
    await self.simple_response(scope, receive, send, request_headers=headers)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 148, in simple_response
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    with collapse_excgroups():
  File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    response = await self.dispatch_func(request, call_next)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/main.py", line 748, in dispatch
    response = await call_next(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 165, in call_next
    raise app_exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
    with collapse_excgroups():
  File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
    response = await self.dispatch_func(request, call_next)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/main.py", line 613, in dispatch
    response = await call_next(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 165, in call_next
    raise app_exc
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
               ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 278, in app
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/main.py", line 993, in generate_chat_completions
    return await generate_function_chat_completion(form_data, user=user)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/apps/webui/main.py", line 304, in generate_function_chat_completion
    configured_tools = get_tools(app, tool_ids, user, tools_params)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/backend/utils/tools.py", line 37, in get_tools
    for tool_id in tool_ids:
TypeError: 'NoneType' object is not iterable
generate_title
anthropic.claude-3-5-sonnet-20240620
anthropic
anthropic

I think maybe there was a breaking API change?
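The traceback bottoms out in `get_tools` iterating `tool_ids` when it is `None` (no tools selected for the chat). A defensive sketch of the kind of fix, illustrative only and not the actual patch:

```python
def get_tools_safe(tool_ids):
    # `tool_ids` arrives as None when the request selects no tools;
    # iterating None raises TypeError: 'NoneType' object is not iterable.
    configured = []
    for tool_id in tool_ids or []:  # treat None as "no tools"
        configured.append(tool_id)
    return configured


print(get_tools_safe(None))   # []
print(get_tools_safe(["a"]))  # ['a']
```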


@darkvertex commented on GitHub (Aug 21, 2024):

Seems it's not the only pipe that broke: https://github.com/open-webui/open-webui/issues/4791


@justinh-rahb commented on GitHub (Aug 21, 2024):

@darkvertex investigating.


@sanjaymaniam commented on GitHub (Sep 4, 2024):

Title generation does not work when Anthropic models are installed as a function. Are there any known ways to make this work?


@justinh-rahb commented on GitHub (Sep 4, 2024):

> Title generation does not work when Anthropic models are installed as a function. Are there any known ways to make this work?

@sanjaymaniam works fine on latest version for me.


@nnnnicholas commented on GitHub (Sep 6, 2024):

> sad we cannot use the connections way, i would love to have the connection rather than an function installed :(

Would it be possible to add support for Claude API in the Connections menu? The above discussion suggests that the Pipes solution is fragile. I'm not crazy about handing my API key directly to a third party extension just to make basic Claude requests. It's a lot of friction as a new user of OpenWebUI. I would like to migrate from LibreChat but have tried and given up a few times now.

Thanks for building OWUI.. very excited to eventually get it up and running!


@heltonteixeira commented on GitHub (Sep 22, 2024):

I'm running it in my local environment, but when I try to install using the function method this error shows:

```console
Something went wrong :/ [WinError 32] The process cannot access the file because it is being used by another process: 'C:\\Users\\...\\AppData\\Local\\Temp\\tmp8jzg9yzu'
```

Could anyone make it work on the latest version?
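WinError 32 is a Windows-specific tempfile gotcha rather than anything Anthropic-related: Windows refuses to let a second handle rename, delete, or reopen a temporary file while the creating handle is still open. A sketch of the portable pattern (write, close, then hand off the path), unrelated to Open WebUI's actual code:

```python
import os
import tempfile

# mkstemp returns an already-open descriptor plus a path. On Windows the
# file cannot be reopened by name until that descriptor is closed, so we
# write, close, and only then touch the file through its path.
fd, path = tempfile.mkstemp()
try:
    with os.fdopen(fd, "w") as f:
        f.write("function source")
    # Descriptor is closed here, so reopening by path is safe everywhere.
    with open(path) as f:
        content = f.read()
finally:
    os.remove(path)

print(content)
```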


@sanebg commented on GitHub (Sep 26, 2024):

Gosh, why not just add the ability to enter the Claude API endpoint via the Connections menu? If I were able to install functions and scripts and whatnot, I wouldn't need a WebUI chat in the first place; I was going to use Ollama. I thought the idea was a nice USER EXPERIENCE.


@moblangeois commented on GitHub (Sep 26, 2024):

Thanks for the feedback – truly inspiring.


@FedeCuci commented on GitHub (Oct 3, 2024):

@darkvertex Can you expand on the litellm proxy? I tried setting it up to include Anthropic support but am having issues.


@eugrus commented on GitHub (Nov 11, 2024):

> Our Pipelines sister-project already supports many providers that are often requested, including Anthropic:

Is out-of-the-box support planned, though?


@MarioIshac commented on GitHub (Dec 6, 2024):

+1 for native support. It would be great to have Anthropic as easy to set up as OpenAI in the Connections menu.


@darkvertex commented on GitHub (Dec 11, 2024):

> @darkvertex Can you expand on the litellm proxy? I tried setting it up to include Anthropic support but am having issues.

@FedeCuci Ok, so...

Make a config.yaml for litellm with the OpenAI and Claude models you wish to expose. It might look something like:

model_list:

  # OpenAI API models:
  # https://platform.openai.com/docs/models
  - model_name: gpt-3.5-turbo-16k
    litellm_params:
      model: gpt-3.5-turbo-16k
      api_key: "os.environ/OPENAI_API_KEY"
  - model_name: gpt-3.5-turbo-instruct
    litellm_params:
      model: gpt-3.5-turbo-instruct
      api_key: "os.environ/OPENAI_API_KEY"
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo
      api_key: "os.environ/OPENAI_API_KEY"
  - model_name: gpt-4-0125-preview
    litellm_params:
      model: gpt-4-0125-preview
      api_key: "os.environ/OPENAI_API_KEY"
  - model_name: gpt-4-0613
    litellm_params:
      model: gpt-4-0613
      api_key: "os.environ/OPENAI_API_KEY"
  - model_name: gpt-4-1106-preview
    litellm_params:
      model: gpt-4-1106-preview
      api_key: "os.environ/OPENAI_API_KEY"
  - model_name: gpt-4-1106-vision-preview
    litellm_params:
      model: gpt-4-1106-vision-preview
      api_key: "os.environ/OPENAI_API_KEY"
  - model_name: gpt-4-turbo-preview
    litellm_params:
      model: gpt-4-turbo-preview
      api_key: "os.environ/OPENAI_API_KEY"
  - model_name: gpt-4-turbo
    litellm_params:
      model: gpt-4-turbo
      api_key: "os.environ/OPENAI_API_KEY"
  - model_name: gpt-4-vision-preview
    litellm_params:
      model: gpt-4-vision-preview
      api_key: "os.environ/OPENAI_API_KEY"
  - model_name: gpt-4
    litellm_params:
      model: gpt-4
      api_key: "os.environ/OPENAI_API_KEY"
  - model_name: gpt-4o
    litellm_params:
      model: gpt-4o
      api_key: "os.environ/OPENAI_API_KEY"

  # Anthropic (Claude) API models:
  # https://docs.anthropic.com/en/docs/about-claude/models#model-names
  - model_name: claude-3-5-sonnet
    litellm_params:
      model: claude-3-5-sonnet-20240620
      api_key: "os.environ/ANTHROPIC_API_KEY"
  - model_name: claude-3-5-sonnet-v2
    litellm_params:
      model: claude-3-5-sonnet-latest
      api_key: "os.environ/ANTHROPIC_API_KEY" 
  - model_name: claude-3-7-sonnet
    litellm_params:
      model: claude-3-7-sonnet-latest
      api_key: "os.environ/ANTHROPIC_API_KEY"
  - model_name: claude-3-opus
    litellm_params:
      model: claude-3-opus-latest
      api_key: "os.environ/ANTHROPIC_API_KEY"
  - model_name: claude-3-sonnet
    litellm_params:
      model: claude-3-sonnet-20240229
      api_key: "os.environ/ANTHROPIC_API_KEY"
  - model_name: claude-3-haiku
    litellm_params:
      model: claude-3-haiku-20240307
      api_key: "os.environ/ANTHROPIC_API_KEY"
  - model_name: claude-3-5-haiku
    litellm_params:
      model: claude-3-5-haiku-latest
      api_key: "os.environ/ANTHROPIC_API_KEY"

(I'm using env vars to not hardcode any API keys in the config.)
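Since every entry above repeats the same litellm_params boilerplate, the config can also be generated from a short script. This is just a sketch, assuming alias and upstream model names like the ones in the config above; since JSON is valid YAML, the printed output can be dropped straight into config.yaml:

```python
import json

# Models to expose; trim or extend to taste.
OPENAI_MODELS = ["gpt-4o", "gpt-4-turbo", "gpt-3.5-turbo"]
ANTHROPIC_MODELS = {
    # public alias -> upstream model id
    "claude-3-5-sonnet": "claude-3-5-sonnet-20240620",
    "claude-3-opus": "claude-3-opus-latest",
    "claude-3-5-haiku": "claude-3-5-haiku-latest",
}

def entry(name, model, key_var):
    # One model_list entry, with the API key read from an env var
    # (litellm's "os.environ/<VAR>" convention).
    return {
        "model_name": name,
        "litellm_params": {"model": model, "api_key": f"os.environ/{key_var}"},
    }

model_list = (
    [entry(m, m, "OPENAI_API_KEY") for m in OPENAI_MODELS]
    + [entry(alias, mid, "ANTHROPIC_API_KEY")
       for alias, mid in ANTHROPIC_MODELS.items()]
)

print(json.dumps({"model_list": model_list}, indent=2))
```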

Then you can use the container ghcr.io/berriai/litellm:main-latest with that config file, e.g.:

docker run --rm -it \
-v ./config.yaml:/app/config.yaml \
-e OPENAI_API_KEY=1234 \
-e ANTHROPIC_API_KEY=5678 \
-e LITELLM_MASTER_KEY=putsomethingveryrandomhere \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-latest --config /app/config.yaml --detailed_debug

Then go check http://localhost:4000 to see if LiteLLM's API is alive.
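A quick programmatic health check is a request to the proxy's OpenAI-compatible /v1/models endpoint, authorized with the master key. The sketch below assumes the port and LITELLM_MASTER_KEY from the docker run example; the actual network call is left commented out so the snippet is safe to run without the proxy up:

```python
import json
import urllib.request

LITELLM_BASE = "http://localhost:4000"      # matches -p 4000:4000 above
MASTER_KEY = "putsomethingveryrandomhere"   # matches LITELLM_MASTER_KEY above

def build_models_request(base=LITELLM_BASE, key=MASTER_KEY):
    # LiteLLM exposes an OpenAI-compatible /v1/models endpoint.
    return urllib.request.Request(
        f"{base}/v1/models",
        headers={"Authorization": f"Bearer {key}"},
    )

def list_models():
    # Returns the model_name aliases defined in config.yaml.
    with urllib.request.urlopen(build_models_request(), timeout=5) as resp:
        return [m["id"] for m in json.load(resp)["data"]]

req = build_models_request()
# print(list_models())  # uncomment once the proxy container is running
```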

If it's up, in theory you can go edit your OpenWebUI's Connections OpenAI API URL to point to your host instead of https://api.openai.com:
[screenshot: Connections settings with the OpenAI API URL pointed at the LiteLLM proxy]
...and then it should work.


@pavoltravnik commented on GitHub (Feb 15, 2025):

Fantastic, that is exactly what I'd asked for in the original post. Confirmed it works great!

Full install instructions:

  1. Go to https://openwebui.com/f/justinrahb/anthropic/
  2. Press Get and specify your OpenWebUI URL to install the script into your Workspace as a Function.
  3. Once installed, click the setting icon for that new function (called Valves for some reason) and enter your Anthropic key.
  4. Enjoy Claude working perfectly without an entire new Pipelines docker image needed.

Thanks, this worked perfectly fine; however, on the latest version v0.5.10 I cannot see the Anthropic models in the model selector.


@darkBuddha commented on GitHub (Feb 24, 2025):

Why is it not supported?


@sarzixon commented on GitHub (Feb 25, 2025):

The process for actually setting this up is very unclear and unwieldy. For anyone else that doesn't want to lose the time I lost:

  1. Set up the main Open-WebUI Docker container following repo instructions (there goes half your RAM)
  2. Set up Pipelines Docker container following the instructions in the readme (there goes the other half of your RAM)
  3. In Open-WebUI click your Profile Picture > Admin Panel > Gear Icon (Admin Settings) > Pipelines > Paste this URL: https://github.com/open-webui/pipelines/blob/main/examples/pipelines/providers/anthropic_manifold_pipeline.py
  4. Enter your Anthropic API key.

This works perfectly, thanks!
Is the Extended Thinking feature supported?


@arty-hlr commented on GitHub (Feb 25, 2025):

@sarzixon It has been added a few hours ago, see the relevant PR (not yet merged), but it's not working perfectly yet.


@i0ntempest commented on GitHub (Mar 4, 2025):

Is there still no native support for the Anthropic API as a connection?
Also, the Anthropic API does have an OpenAI compatibility layer, but I cannot get it to work when adding a connection.
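For reference, the compatibility layer is used by pointing an OpenAI-style client at Anthropic's base URL and sending the Anthropic key as a bearer token. A minimal sketch; the base URL and endpoint path are taken from Anthropic's docs but should be treated as assumptions to verify, and no request is actually sent here:

```python
import json
import os
import urllib.request

# Assumed base URL of Anthropic's OpenAI compatibility layer.
ANTHROPIC_OPENAI_BASE = "https://api.anthropic.com/v1"

def chat_request(model, messages, key=None):
    # Build (but do not send) an OpenAI-style chat completion request.
    key = key or os.environ.get("ANTHROPIC_API_KEY", "")
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{ANTHROPIC_OPENAI_BASE}/chat/completions",
        data=payload,
        headers={"Authorization": f"Bearer {key}",
                 "Content-Type": "application/json"},
    )

req = chat_request("claude-3-5-sonnet-latest",
                   [{"role": "user", "content": "hello"}], key="sk-test")
```

If this is what the Connections menu sends, a failure here usually points at the base URL or auth header rather than the model name.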


@ineiti commented on GitHub (Mar 13, 2025):

Thanks, this worked perfectly fine, however on latest version v0.5.10 i can not see models of anthropic in the select model.

I have the same problem here - the function is installed, but I cannot choose the models :(

I see some ssl connection errors to localhost:11434 - not sure if this has to do with the function I imported or not.

And the valves are empty:

[screenshot: the function's Valves dialog, showing empty fields]

@nobretere commented on GitHub (Mar 26, 2025):

Thank You


@eg-mattl commented on GitHub (Jun 12, 2025):

I was able to get it working by manually specifying the model name.

[screenshot: a Connection with the model name specified manually]


@arty-hlr commented on GitHub (Jun 12, 2025):

@eg-mattl They say it's only for testing and comparing models though, it won't have the full capabilities of their API. I recommend setting up a litellm proxy instead.


@dom6770 commented on GitHub (Jun 12, 2025):

I mean, to this day the Anthropic function hasn't been updated to include the Claude 4 models, so I am really puzzled why the Anthropic API isn't officially supported through 'Connections' like OpenAI is.


@arty-hlr commented on GitHub (Jun 13, 2025):

@dom6770 Because the Open WebUI project has made the decision to only support the OpenAI API. I don't trust random user functions that, as you say, are not maintained or updated; hence litellm-proxy.


@miversen33 commented on GitHub (Jun 26, 2025):

@dom6770 you have the code literally right there in your reply. Simply add

            {"id": "claude-sonnet-4-20250514", "name": "claude-4-sonnet"},
            {"id": "claude-opus-4-20250514", "name": "claude-4-opus"},

to the end of the array in the get_anthropic_models method.

That said, it is a strange decision to be so locked to openai here.
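In context, the patched method would look roughly like this. The first two entries are stand-ins for whatever the published function already lists (the real list is longer); only the last two lines are the addition suggested above:

```python
def get_anthropic_models():
    # Hypothetical trimmed-down version of the function's model list;
    # the published function contains more entries than shown here.
    return [
        {"id": "claude-3-5-sonnet-latest", "name": "claude-3.5-sonnet"},
        {"id": "claude-3-5-haiku-latest", "name": "claude-3.5-haiku"},
        # the two entries suggested above:
        {"id": "claude-sonnet-4-20250514", "name": "claude-4-sonnet"},
        {"id": "claude-opus-4-20250514", "name": "claude-4-opus"},
    ]
```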


@wkbaran commented on GitHub (Jul 22, 2025):

Claude Code is dominating, yet there's still no improvement here?
You can't use 'natively' and 'Functions' in the same sentence, especially when the Function isn't maintained by Open WebUI.


@prodigy commented on GitHub (Aug 27, 2025):

In case anyone finds this while trying to use Opus 4.1: you cannot set top_p and temperature in the same request.

I changed the payload generation to this to make it work:

        payload = {
            "model": body["model"][body["model"].find(".") + 1 :],
            "messages": processed_messages,
            "max_tokens": body.get("max_tokens", 4096),
            "top_k": body.get("top_k", 40),
            "stop_sequences": body.get("stop", []),
            **({"system": str(system_message)} if system_message else {}),
            "stream": body.get("stream", False),
        }

        temperature = body.get("temperature", 0.8)
        if temperature:
            payload["temperature"] = temperature

        top_p = body.get("top_p", None)
        if top_p:
            payload["top_p"] = top_p

And this is for the model:

            {"id": "claude-opus-4-1-20250805", "name": "claude-4.1-opus"},
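Note that the snippet above can still send both parameters: the temperature default of 0.8 is always truthy, so an explicit top_p gets added alongside it. A stricter sketch, using a hypothetical helper that emits at most one of the two (temperature wins when both are present):

```python
def sampling_params(body):
    # Anthropic rejects Opus 4.1 requests that set both temperature and
    # top_p (per the comment above), so emit at most one of them.
    params = {}
    if body.get("temperature") is not None:
        params["temperature"] = body["temperature"]
    elif body.get("top_p") is not None:
        params["top_p"] = body["top_p"]
    return params

extra = sampling_params({"temperature": 0.7, "top_p": 0.9})
```

Merging with payload.update(sampling_params(body)) keeps the rest of the payload construction unchanged.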

@AumCoin commented on GitHub (Sep 6, 2025):

Fantastic, that is exactly what I'd asked for in the original post. Confirmed it works great!

Full install instructions:

1. Go to https://openwebui.com/f/justinrahb/anthropic/

2. Press **Get** and specify your OpenWebUI URL to install the script into your Workspace as a Function.

3. Once installed, click the setting icon for that new function (called Valves for some reason) and enter your Anthropic key.

4. Enjoy Claude working perfectly without an entire new Pipelines docker image needed.

I installed the function and set the API key but I still don't see any Anthropic models listed when I try to set a model. The Pipelines method also does not work for me.


@arty-hlr commented on GitHub (Sep 8, 2025):

@AumCoin Use litellm-proxy instead. User-written functions or pipelines are not reliable.


@dbrans commented on GitHub (Nov 6, 2025):

I’m fine with doing an initial setup to support Claude models, but there should be a safe "update" process for payload changes and new model releases; admins shouldn’t need to hand-edit the function each time.


@arty-hlr commented on GitHub (Nov 10, 2025):

@dbrans That's why litellm-proxy is recommended instead of using user-made functions.

Reference: github-starred/open-webui#1315