mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-06 19:08:59 -05:00
[GH-ISSUE #8393] Advanced params not shown/used #53773
Originally created by @arty-hlr on GitHub (Jan 8, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/8393
Bug Report
Installation Method
Docker
Environment
Open WebUI Version: v0.5.4
Ollama (if applicable): 0.3.9
Operating System: Ubuntu 24.0
Browser (if applicable): Firefox 133.0.3
Confirmation:
I have read and followed all the instructions provided in the README.md.
I am on the latest version of both Open WebUI and Ollama.
I have included the browser console logs.
I have included the Docker container logs.
I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.
Expected Behavior:
The advanced parameters set in the model settings page should be used (yes, they were saved).

Actual Behavior:
The settings set previously are not shown when opening Controls in the chat:

and they don't seem to get sent, either

`params` is empty, or it has `num_ctx` but it's set to `null`:

Description
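One hypothetical way to see what actually reaches the endpoint (this is a debugging aid I'm suggesting, not part of Open WebUI): run a tiny OpenAI-compatible stand-in that prints each request body, point Open WebUI's OpenAI API base URL at it, and check which advanced params (if any) arrive.

```python
# Hypothetical debugging aid (not part of Open WebUI): a minimal
# OpenAI-compatible stand-in that prints the body of each POST it
# receives, so you can inspect the advanced params Open WebUI sends.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def sent_params(body: dict) -> dict:
    """Keep only advanced-param keys that arrived with non-null values."""
    keys = ("temperature", "top_p", "top_k", "max_tokens", "num_ctx", "seed")
    return {k: body[k] for k in keys if body.get(k) is not None}

class LoggingHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        print("advanced params received:", sent_params(body))
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(b'{"choices": []}')

# To listen on port 8081 (then add it as an OpenAI-compatible connection):
# HTTPServer(("127.0.0.1", 8081), LoggingHandler).serve_forever()
```

If `sent_params` comes back empty (or with `num_ctx: null` filtered out), the backend isn't forwarding the model's advanced parameters.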
Bug Summary:
The advanced parameters set per model don't seem to be respected or sent to the API endpoint (Groq in this case, OpenAI-compatible).
Reproduction Details
Steps to Reproduce:
Set `context` in the model's advanced parameters, for example.
Logs and Screenshots
Included above
Additional Information
@tjbck commented on GitHub (Jan 8, 2025):
Model params are applied from the backend.
@arty-hlr commented on GitHub (Jan 8, 2025):
@tjbck I'm not sure what you mean by that or why this closes this issue...
@tjbck commented on GitHub (Jan 8, 2025):
https://docs.openwebui.com/features/chat-features/chat-params
@arty-hlr commented on GitHub (Jan 8, 2025):
@tjbck This is about the hierarchy of parameters and is irrelevant to the issue or to your comment. The problem is that the parameters set for the model are neither shown nor used in the request to the endpoint.
@tjbck commented on GitHub (Jan 8, 2025):
Like I mentioned, the params set from the models page are applied from the backend, before the request gets sent off to its respective model provider, and they're not supposed to be shown in the chat controls. Hope that helps!
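The merge described here can be sketched roughly as follows (an assumed illustration of the stated behavior, not Open WebUI's actual code):

```python
def apply_model_params(model_params: dict, request_body: dict) -> dict:
    """Sketch (assumed, not Open WebUI's actual implementation) of the
    behavior described above: per-model params are folded into the
    outgoing request on the backend, skipping null values, and never
    surface in the chat Controls UI."""
    merged = dict(request_body)
    for key, value in model_params.items():
        if value is not None and key not in merged:
            merged[key] = value  # request-level settings take precedence
    return merged
```

Under this reading, a model-level `num_ctx` would be added to the outgoing request, while a `null` value would be dropped rather than forwarded.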
@arty-hlr commented on GitHub (Jan 8, 2025):
Ok, I understand that it won't be shown to the user, which could be confusing though, as it shows "default", but alright. About the parameters being applied: I think I showed that they weren't sent at all... Could this be reopened please @tjbck?
@tandav commented on GitHub (Feb 7, 2025):
Same here, very confusing. I thought you set default params for a model (as admin) and those parameters would then appear as the params in a new chat (but the user could change them to something they like).
@HeyItsDaddy commented on GitHub (Feb 12, 2025):
I'm having the same issue. The container was started with --gpus all, and running nvidia-smi in the container shows it's working correctly. But no matter where I set the num_gpu setting (whether directly in Advanced Params in the chat itself, or in Admin Panel | Settings | Models | | Advanced Params), it never seems to "stick", and it's running purely on CPU, not GPU.
I'm open to anything you can suggest to run / prove that the parameter is or is not being sent through, or checking the backend logs to see if it's actually being sent or not. As of right now it does not seem to be respecting it.
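One way to prove whether the parameter matters at all, independent of Open WebUI, is to hit Ollama's `/api/generate` endpoint directly with `num_gpu` in `options` and watch nvidia-smi or the Ollama logs. A minimal sketch (the model name and port are example assumptions):

```python
# Sketch: build a direct Ollama /api/generate call with num_gpu set in
# options, bypassing Open WebUI entirely. Model name "llama3" and the
# default port 11434 are assumptions; adjust for your setup.
import json
import urllib.request

def ollama_generate_request(model: str, prompt: str, num_gpu: int):
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "options": {"num_gpu": num_gpu},
    }
    return urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

# With Ollama running, send it and watch nvidia-smi:
# urllib.request.urlopen(ollama_generate_request("llama3", "hi", 99))
```

If GPU usage changes with this direct call but not through Open WebUI, that would point at the parameter not being forwarded by the WebUI backend.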