Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-06 19:08:59 -05:00)
[GH-ISSUE #1094] enable max tolkens for anthropic/claude-3-sonnet-20240229 #12335
Originally created by @bjornjorgensen on GitHub (Mar 7, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/1094
Bug Report
Can't change max tokens when running anthropic/claude-3-sonnet-20240229, so the text is cut off.
Description
Bug Summary:
[Provide a brief but clear summary of the bug]
Steps to Reproduce:
[Outline the steps to reproduce the bug. Be as detailed as possible.]
Add anthropic/claude-3-sonnet-20240229,
then ask for something bigger than 2 + 2.
Expected Behavior:
[Describe what you expected to happen.]
Actual Behavior:
[Describe what actually happened.]
Environment
Brave
Reproduction Details
Confirmation:
Logs and Screenshots
Browser Console Logs:
[Include relevant browser console logs, if applicable]
Docker Container Logs:
[Include relevant Docker container logs, if applicable]
Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]
Installation Method
[Describe the method you used to install the project, e.g., manual installation, Docker, package manager, etc.]
Additional Information
[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!
@tjbck commented on GitHub (Mar 8, 2024):
I have a suspicion that it's an issue on the LiteLLM side. Could you try isolating the issue by testing with LiteLLM only? Keep us updated!
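When isolating truncation issues like this, one telltale signal is the finish reason on the response. The helper below is a hypothetical sketch (not code from this thread), assuming the OpenAI-compatible chat-completions response shape that a LiteLLM proxy returns, where a finish_reason of "length" means the reply was cut off by a token limit:

```python
def was_truncated(response: dict) -> bool:
    """Return True if any choice in an OpenAI-style chat-completions
    response was cut off by a token limit (finish_reason == "length")."""
    return any(
        choice.get("finish_reason") == "length"
        for choice in response.get("choices", [])
    )

# A reply stopped by a token limit vs. one that finished naturally.
cut_off = {"choices": [{"finish_reason": "length"}]}
complete = {"choices": [{"finish_reason": "stop"}]}
print(was_truncated(cut_off), was_truncated(complete))  # True False
```

Checking this field on a response fetched straight from the LiteLLM proxy (bypassing the WebUI) is one way to tell which component is imposing the limit.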
@justinh-rahb commented on GitHub (Mar 8, 2024):
I'm not so sure that it's LiteLLM to blame here. I've tried now with an older version of it, and the same behaviour is happening on Claude 2 as well, from clients other than WebUI. I believe this behavior started the day Claude 3 was released. By chance, @bjornjorgensen, are you using an Anthropic developer account like mine? I am beginning to wonder if they simply limit max_tokens now for free dev keys.
@justinh-rahb commented on GitHub (Mar 8, 2024):
Update: now I'm not so sure where the blame lies. Testing in another chat app works fine with Claude 3 endpoints. So it could indeed be a LiteLLM issue, but it also wasn't working with older versions that previously did work. Really need someone with actual paid API keys to test this, I think.
@bjornjorgensen commented on GitHub (Mar 8, 2024):
It is a free key that I'm using.

it does however work on

So it's not the key that is the problem.
@justinh-rahb commented on GitHub (Mar 8, 2024):
Yes, I just tried it with Lobechat as well and got a full response. So the ball seems to be back in LiteLLM's court, but I need to do further testing with components in isolation to be really certain of this.
@justinh-rahb commented on GitHub (Mar 8, 2024):
So after further testing... I can only observe this happening when WebUI is involved. So it seems it may be something to do with our code; I just cannot at the moment nail down what it could possibly be. It seemingly only affects the Claude API via LiteLLM in Open WebUI.
@bjornjorgensen commented on GitHub (Mar 8, 2024):
Are there any configs that override the number of tokens it can print in outputs?
@justinh-rahb commented on GitHub (Mar 8, 2024):
@bjornjorgensen nah, in our testing we've checked that we're not sending anything that would limit the max tokens, but nonetheless the API response says the stop reason is length, which would indicate that it's been given a limit and reached it... very strange. Still being investigated, and I hope we'll have an answer soon!
@justinh-rahb commented on GitHub (Mar 8, 2024):
Ladies and gentlemen, we got 'em. Claude's API now requires that the max_tokens param be sent in the payload, and LiteLLM will set a default of 256 tokens if you don't specify it. Currently the WebUI does not send a max_tokens param when using external APIs, so the proposed fix would be to add that feature, or to allow this parameter override to be set in the LiteLLM configuration UI. For now, it can be worked around by mounting and modifying the config.yaml file. Note: the maximum value is 4096; you'll get an error from Anthropic's API if you request more.
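The original config.yaml snippet did not survive this mirror. As a hedged sketch of the kind of LiteLLM config override the comment describes (the model entry name and the environment-variable key reference are illustrative, not taken from the thread), it might look like:

```yaml
# Illustrative LiteLLM config.yaml fragment: pin an explicit max_tokens
# for a Claude 3 model so LiteLLM's 256-token default never applies.
model_list:
  - model_name: claude-3-sonnet-20240229
    litellm_params:
      model: anthropic/claude-3-sonnet-20240229
      api_key: os.environ/ANTHROPIC_API_KEY  # placeholder key reference
      max_tokens: 4096  # 4096 is the maximum Anthropic accepts here
```

The mounted file would then be picked up by the LiteLLM proxy on restart.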
@tjbck commented on GitHub (Mar 8, 2024):
https://docs.anthropic.com/claude/docs/models-overview
@justinh-rahb commented on GitHub (Mar 8, 2024):
v0.1.111 (not merged to :main yet) has a new field in the LiteLLM UI to configure the max_tokens parameter override, which will make modifying your config.yaml by hand unnecessary. This can be tested now in the :dev branch.
@tjbck commented on GitHub (Mar 10, 2024):
max_tokens: 4096 should be explicitly set from the settings!
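Setting max_tokens explicitly, as suggested above, can be sketched as a small payload helper. This is a hypothetical illustration (not Open WebUI's actual code) of ensuring an OpenAI-style chat payload carries an explicit max_tokens before it reaches LiteLLM:

```python
def ensure_max_tokens(payload: dict, default: int = 4096) -> dict:
    """Return a copy of an OpenAI-style chat payload with max_tokens
    set explicitly, so the Anthropic-required parameter is never left
    to LiteLLM's low fallback default (256)."""
    fixed = dict(payload)  # shallow copy; don't mutate the caller's dict
    fixed.setdefault("max_tokens", default)
    return fixed

request = {
    "model": "anthropic/claude-3-sonnet-20240229",
    "messages": [{"role": "user", "content": "What is 2 + 2?"}],
}
print(ensure_max_tokens(request)["max_tokens"])  # 4096
```

An explicitly supplied value is left untouched, so a user-chosen limit still wins over the 4096 default.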
@bjornjorgensen commented on GitHub (Mar 11, 2024):
Yes, I had to delete the old one and add it back... but now it works :)
Thanks!
@bjornjorgensen commented on GitHub (Mar 22, 2024):
Hmm... having some issues today with the dev images. I can't see what's wrong there, but when I try main, it works.

But I have deleted my storage for openchat, and now I have to set everything up again. I added claude-3 opus without setting anything other than the model name and API key.
So must I set max_tokens: 4096 when I use claude-3 models? If so, then it must be in a readme somewhere.
@justinh-rahb commented on GitHub (Mar 22, 2024):
@bjornjorgensen I haven't migrated this to the docs site yet; there's a thread: max_tokens must be 4096 to get the most out of the Claude API, as noted there.