Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-06 10:58:17 -05:00)
[GH-ISSUE #16351] generate chat title button should use TASK_MODEL instead of current chat model #17870
Originally created by @tandav on GitHub (Aug 7, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/16351
Check Existing Issues
Installation Method
Docker
Open WebUI Version
v0.6.18
Ollama Version (if applicable)
No response
Operating System
Docker, Ubuntu
Browser (if applicable)
Chrome
Confirmation
I have read and followed all instructions in README.md.
Expected Behavior
As far as I understand, chat titles should always be generated using the TASK_MODEL model (gpt-4.1-mini in my case).
Actual Behavior
When pressing the Generate chat title button (see picture), the request is made using the current chat model.
Steps to Reproduce
Set the task model to gpt-4.1-mini from the UI or using an environment variable.
Logs & Screenshots
Here is the JSON payload of the request; o4-mini-deep-research (the chat's model) is used in my case:
{
  "model": "o4-mini-deep-research",
  "messages": [
    ...
Additional Information
No response
@tjbck commented on GitHub (Aug 7, 2025):
it does.
@ajitam commented on GitHub (Sep 23, 2025):
No, it doesn't.
I tried setting it up in the UI and in ENV.
In either case the "current model" is used.
@druellan commented on GitHub (Oct 3, 2025):
I also found this issue recently.
My test:
I get an error that the model can't be found and the title does not change; this aligns with the idea that OWUI is trying to use the current model (Model-B) and not the task model (Model-A).
@tjbck commented on GitHub (Oct 3, 2025):
@ajitam did you check the code? it always overwrites the model from the backend.
@ajitam commented on GitHub (Oct 3, 2025):
@tjbck no, I haven't checked the code; I'm relying on my QA tests :)
In any case, I solved my problem by setting a fallback model, because I was getting 429 from Bedrock.
I have a feeling that it has something to do with how (in which order) the settings are applied.
But currently all I can offer is my test:
@druellan commented on GitHub (Oct 5, 2025):
I was taking a look, and it seems the frontend is the one sending the wrong model to the endpoint.
The problem is in src\lib\components\layout\Sidebar\ChatItem.svelte, around line 290:
const model = chat.chat.models.at(0) ?? chat.models.at(0) ?? '';
The code is looking up the chat model, not the task model.
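The selection logic quoted above can be reproduced in isolation to see why the chat's own model always wins. This is a minimal sketch with hypothetical sample data and simplified types, not the real Open WebUI structures:

```typescript
// Simplified stand-ins for the chat shapes involved; hypothetical, for
// illustration only.
interface ChatEntry {
  models: string[];              // models listed on the chat record
  chat: { models: string[] };    // models embedded in the chat body
}

function pickModel(chat: ChatEntry): string {
  // Same fallback chain as the reported line: the chat's model is always
  // preferred, and the configured task model is never consulted.
  return chat.chat.models.at(0) ?? chat.models.at(0) ?? '';
}

const sample: ChatEntry = {
  models: ['gpt-4.1-mini'],
  chat: { models: ['o4-mini-deep-research'] },
};

console.log(pickModel(sample)); // → "o4-mini-deep-research"
```

With data like the reporter's, this always yields the chat model (o4-mini-deep-research), which matches the payload shown in the issue.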
My fix is to do something like this:
But I'm using a call to the backend; I'm not sure if there is a way to get the task model without the extra call.
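One possible shape for such a frontend-side fix is sketched below. The fetchTaskModel helper is hypothetical and stands in for whatever backend call ends up supplying the configured TASK_MODEL; it is not an existing Open WebUI function:

```typescript
// Sketch only: `fetchTaskModel` is a hypothetical helper standing in for a
// backend call that returns the configured TASK_MODEL (or null if unset).
async function fetchTaskModel(): Promise<string | null> {
  // Stubbed here so the selection logic below can be exercised on its own.
  return 'gpt-4.1-mini';
}

async function pickTitleModel(chatModels: string[]): Promise<string> {
  // Prefer the configured task model; fall back to the chat's own model.
  const taskModel = await fetchTaskModel();
  return taskModel ?? chatModels.at(0) ?? '';
}

pickTitleModel(['o4-mini-deep-research']).then((m) => console.log(m)); // → "gpt-4.1-mini"
```

As the commenter notes, this costs an extra round trip per click; resolving the task model on the backend instead would avoid that.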
I confirmed this by looking at the form_data variable in backend\open_webui\routers\tasks.py, around line 159.
As far as I can see there is no override there; form_data["model"] contains the same model passed from the frontend.