Mirror of https://github.com/open-webui/open-webui.git (synced 2026-03-22 06:02:06 -05:00)
Issue #5541: Custom Models (from base Ollama model) ignore the `think` (Ollama) advanced parameter (and possibly more)
Originally created by @silentoplayz on GitHub (Jun 14, 2025).
Originally assigned to: @tjbck on GitHub.
Check Existing Issues
Installation Method
Docker
Open WebUI Version
v0.6.14
Ollama Version (if applicable)
v0.9.0
Operating System
Edition: Windows 11 Pro | Version: 24H2 | OS Build: 26100.4351 | Windows Feature Experience Pack: 1000.26100.107.0
Browser (if applicable)
LibreWolf v135.0.1-1 (Firefox)
Confirmation
I have read and followed the instructions in the README.md.

Expected Behavior
When the `think` (Ollama) advanced parameter is toggled off for a custom model derived from an Ollama base model, the model's responses should not contain thinking/thought tags. The parameter should effectively disable the model's internal thinking process via Ollama's model parameter.

Actual Behavior
Despite `think` (Ollama) being toggled off in the advanced parameters for a custom model (created from an Ollama base model), the model continues to generate thinking/thought tags within its responses. This indicates that the `think` (Ollama) parameter is being ignored, or is not functioning correctly, for custom models when set to disabled. Other Ollama advanced parameters for models should be examined and tested thoroughly.

Steps to Reproduce
1. Set up Open WebUI (`main` Docker image) and an Ollama installation (`http://localhost:11434`).
2. Pull a base model: `ollama pull qwen:8b`.
3. Open the Open WebUI interface (`http://localhost:8080`).
4. Create a custom model (e.g., `MyQwenCustom`) from the base Ollama model (`qwen:8b`).
5. Start a new chat with the custom model (`MyQwenCustom`).
6. Click the `Controls` icon at the top right side of a chat.
7. Find the `think` (Ollama) parameter and toggle it off.
8. Send a prompt; the response still contains thinking/thought tags despite the `think` (Ollama) advanced parameter being toggled off.

For comparison (demonstrates expected behavior with base model):
1. Select the base model (`qwen:8b`) from the model dropdown.
2. Toggle `think` (Ollama) off; the response contains no thinking/thought tags.

Logs & Screenshots
To be clear, I have not modified the custom model's or the base model's (Qwen 3 8B) advanced parameters.
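For reference, the bug amounts to the toggle not reaching Ollama. A minimal sketch of the request body that the toggle should produce for Ollama's `/api/chat` endpoint (the top-level `think` boolean is the switch Ollama added in v0.9.0; the model name and prompt are illustrative, taken from this report):

```python
import json

# Sketch of the request body Open WebUI would need to forward to Ollama's
# /api/chat endpoint for the "think (Ollama)" toggle to take effect.
# "think" is the top-level boolean introduced in Ollama v0.9.0.
payload = {
    "model": "qwen:8b",
    "messages": [{"role": "user", "content": "Why is the sky blue?"}],
    "think": False,  # what the disabled "think (Ollama)" toggle should map to
    "stream": False,
}
body = json.dumps(payload)
print(body)
# e.g. POST it with: requests.post("http://localhost:11434/api/chat", data=body)
```

If the custom-model code path drops `think` before the request is built, the toggle has no effect regardless of the UI state.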




Custom model (from base model):
Base model: (I had to zoom out by 40% for this screenshot)
When using only the base model, the `think` (Ollama) advanced parameter works as expected.

Additional Information
Initially, I suspected this issue might be related to knowledge collections, but further testing revealed that knowledge collections or files have no bearing on the problem. The issue consistently occurs with custom models derived from Ollama base models, regardless of whether a knowledge collection is attached.
This behavior is specific to custom models; the `think` (Ollama) parameter functions as expected (i.e., it successfully disables think tags) when used directly with the base Ollama models. I am unsure whether other advanced parameters might be similarly affected for custom models.

@rgaricano commented on GitHub (Jun 14, 2025):
Could you try adding `model_config = ConfigDict(extra="allow")` at `backend/open_webui/models/knowledge.py` line 100 (commit `63256136ef`)?
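For context, a minimal Pydantic sketch of what `extra="allow"` changes (the model names here are hypothetical, not open-webui's actual classes): by default, Pydantic v2 silently drops fields that are not declared on the model, which would make a forwarded `think` parameter disappear during validation.

```python
from pydantic import BaseModel, ConfigDict

# Hypothetical parameter models illustrating Pydantic's "extra" behavior.
# Default (extra="ignore"): unknown fields such as "think" are silently
# dropped during validation. extra="allow": they are kept.

class StrictParams(BaseModel):
    temperature: float = 0.8

class LooseParams(BaseModel):
    model_config = ConfigDict(extra="allow")
    temperature: float = 0.8

strict = StrictParams(temperature=0.5, think=False)
loose = LooseParams(temperature=0.5, think=False)

print("think" in strict.model_dump())  # False: field was dropped
print("think" in loose.model_dump())   # True: field was preserved
```

If a strict model sits anywhere in the path between the UI toggle and the Ollama request, the `think` value would be lost exactly as described in this issue.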
@silentoplayz commented on GitHub (Jun 14, 2025):
That didn't appear to make a difference.

@silentoplayz commented on GitHub (Jun 14, 2025):
Update!
There is a very real possibility that the Merge Responses button in multi-response scenarios also ignores the `think` (Ollama) advanced parameter, as shown in the screenshots attached below. (Tested on two different Open WebUI instances on my PC.)

Please let me know if I should open a separate, new issue for this.
@tjbck commented on GitHub (Jun 16, 2025):
I can't seem to reproduce this on my end; I was able to correctly create a custom model with the think param enabled.
@tjbck commented on GitHub (Jun 16, 2025):
This should be addressed with commit ab877e1d7ea77f6a5c5db63bda72041a92d96f7a!
@silentoplayz commented on GitHub (Jun 16, 2025):
This solves it! Thank you for the speedy fix! https://github.com/open-webui/open-webui/issues/14975#issuecomment-2973265418 is still relevant, though, as it is a separate issue on its own.