[GH-ISSUE #1094] enable max tokens for anthropic/claude-3-sonnet-20240229 #12335

Closed
opened 2026-04-19 19:14:23 -05:00 by GiteaMirror · 15 comments
Owner

Originally created by @bjornjorgensen on GitHub (Mar 7, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/1094

Bug Report

Can't change max tokens when running anthropic/claude-3-sonnet-20240229,
so the text is cut off.

Description

![image](https://github.com/open-webui/open-webui/assets/47577197/94669a54-8554-481a-9095-eb86cd2f1931)

Bug Summary:
[Provide a brief but clear summary of the bug]

Steps to Reproduce:
[Outline the steps to reproduce the bug. Be as detailed as possible.]
add anthropic/claude-3-sonnet-20240229

![image](https://github.com/open-webui/open-webui/assets/47577197/0cf02c5d-d37f-46a4-923b-bedd62025925)

ask for something bigger than 2 + 2?

Expected Behavior:
[Describe what you expected to happen.]

Actual Behavior:
[Describe what actually happened.]

Environment

  • Operating System: [e.g., Windows 10, macOS Big Sur, Ubuntu 20.04]
  • Kubernetes (k8s); I am using the dev image ghcr.io/open-webui/open-webui:dev
  • Browser (if applicable): [e.g., Chrome 100.0, Firefox 98.0]
    Brave

Reproduction Details

Confirmation:

  • [ ] I have read and followed all the instructions provided in the README.md.
  • [ ] I am on the latest version of both Open WebUI and Ollama.
  • [ ] I have included the browser console logs.
  • [ ] I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:
[Include relevant browser console logs, if applicable]

Docker Container Logs:
[Include relevant Docker container logs, if applicable]

Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Installation Method

[Describe the method you used to install the project, e.g., manual installation, Docker, package manager, etc.]

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!


@tjbck commented on GitHub (Mar 8, 2024):

I have a suspicion that it's an issue on the LiteLLM-side, could you try isolating the issue by trying with LiteLLM only? Keep us updated!


@justinh-rahb commented on GitHub (Mar 8, 2024):

> I have a suspicion that it's an issue on the LiteLLM-side, could you try isolating the issue by trying with LiteLLM only? Keep us updated!

I'm not so sure that LiteLLM is to blame here; I've now tried with an older version of it and the same behaviour is happening on Claude 2 as well, from clients other than WebUI. I believe this behaviour started the day Claude 3 was released. By chance, @bjornjorgensen, are you using an Anthropic developer account like mine? I am beginning to wonder if they simply limit `max_tokens` now for free dev keys.


@justinh-rahb commented on GitHub (Mar 8, 2024):

Update: now I'm not so sure where the blame lies. Testing in another chat app works fine with Claude 3 endpoints, so it could indeed be a LiteLLM issue, but then it also wasn't working with older versions of it that previously did work. We really need someone with actual paid API keys to test this, I think.


@bjornjorgensen commented on GitHub (Mar 8, 2024):

It is a free key that I'm using:
![image](https://github.com/open-webui/open-webui/assets/47577197/a576eedc-3ad6-47ec-9736-0e8ac274dbdf)

It does, however, work here:
![image](https://github.com/open-webui/open-webui/assets/47577197/53ebc826-7eeb-4c91-befc-11665272bc67)

So it's not the key that's the problem.


@justinh-rahb commented on GitHub (Mar 8, 2024):

Yes, I just tried it with LobeChat as well and got a full response. So the ball seems to be back in LiteLLM's court, but I need to do further testing with components in isolation to be really certain of this.


@justinh-rahb commented on GitHub (Mar 8, 2024):

So after further testing... I can only observe this happening when WebUI is involved. So it seems it may be something to do with our code; I just cannot at the moment nail down what it could possibly be. It seemingly only affects the Claude API via LiteLLM in Open WebUI.


@bjornjorgensen commented on GitHub (Mar 8, 2024):

Are there any configs that override the number of tokens it can print in outputs?


@justinh-rahb commented on GitHub (Mar 8, 2024):

@bjornjorgensen nah, in our testing we've checked that we're not sending anything that would limit the max tokens, but nonetheless the API response says the stop reason is `length`, which would indicate that it's been given one and reached it... very strange. Still being investigated, and I hope we'll have an answer soon!
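The truncation can be spotted programmatically. A minimal sketch (our own helper, not Open WebUI code) assuming an OpenAI-style chat completion response dict, which is the shape LiteLLM returns:

```python
# Hedged sketch: detect token-limit truncation in an OpenAI-style chat
# completion response. The helper name and example dicts are illustrative.

def is_truncated(response: dict) -> bool:
    """True if the first choice stopped because it hit a token limit."""
    finish_reason = response["choices"][0].get("finish_reason")
    # OpenAI-compatible layers report "length" when max_tokens was reached;
    # Anthropic's native API uses the stop reason "max_tokens".
    return finish_reason in ("length", "max_tokens")

cut_off = {"choices": [{"finish_reason": "length",
                        "message": {"content": "2 + 2 = 4. To explain furth"}}]}
complete = {"choices": [{"finish_reason": "stop",
                         "message": {"content": "2 + 2 = 4."}}]}

print(is_truncated(cut_off))   # → True
print(is_truncated(complete))  # → False
```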


@justinh-rahb commented on GitHub (Mar 8, 2024):

_Ladies and gentlemen, we got em._ Claude's API now **requires** that the `max_tokens` param be sent in the payload, and LiteLLM will set a default of 256 tokens if you don't specify this. Currently the WebUI does not send a `max_tokens` param when using external APIs, so the proposed fix would be to add that feature, or to allow this parameter override to be set in the LiteLLM configuration UI. For now, it can be worked around by mounting and modifying the `config.yaml` file as such:

```yaml
- litellm_params:
    api_key: your_api_key
    model: anthropic/claude-3-sonnet-20240229
    max_tokens: 4096
  model_info:
    id: 810226a0-61e2-4d97-9de0-822bd4300fcd
  model_name: claude-3-sonnet
```

Note: the maximum value is 4096; you'll get an error from Anthropic's API if you request more.


@tjbck commented on GitHub (Mar 8, 2024):

https://docs.anthropic.com/claude/docs/models-overview


@justinh-rahb commented on GitHub (Mar 8, 2024):

**v0.1.111** (not merged to `:main` yet) has a new field in the LiteLLM UI to configure the `max_tokens` parameter override, which will make modifying your `config.yaml` by hand unnecessary. This can be tested now in the `:dev` branch.


@tjbck commented on GitHub (Mar 10, 2024):

`max_tokens: 4096` should be explicitly set from the settings!


@bjornjorgensen commented on GitHub (Mar 11, 2024):

Yes, I had to delete the old one and add it back... but now it works :)
Thanks!


@bjornjorgensen commented on GitHub (Mar 22, 2024):

Hmm... I have some issues today with the dev images. I can't see what's wrong there, but when I try main it works.
However, I have deleted my storage for openchat, and now I have to set everything up again. I added Claude 3 Opus without setting anything other than the model name and API key:
![image](https://github.com/open-webui/open-webui/assets/47577197/5bc100f7-2b43-4464-95bc-04f7cf65ff6f)

So must I set max_tokens: 4096 when I use Claude 3 models? If so, it should be documented in a README somewhere.


@justinh-rahb commented on GitHub (Mar 22, 2024):

@bjornjorgensen I haven't migrated this to the docs site yet; there's a thread:

  • #1038

`max_tokens` must be `4096` to get the most out of the Claude API, as noted there.

Reference: github-starred/open-webui#12335