mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-07 03:18:23 -05:00
[GH-ISSUE #16744] Harmony Format Support Issue with gpt-oss Models via LM Studio #18029
Originally created by @Pixellevel on GitHub (Aug 20, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/16744
Check Existing Issues
Installation Method
Pip Install
Open WebUI Version
v0.6.22
Ollama Version (if applicable)
No response
Operating System
Windows 10
Browser (if applicable)
Chrome 128.0.6613.85
Confirmation
Expected Behavior
When using gpt-oss models (e.g., gpt-oss-20b) through LM Studio, adding "Reasoning: high" to the system prompt in Open WebUI should:
Activate the model's high-level reasoning mode
Display detailed chain-of-thought processing in responses
Show enhanced analytical capabilities as designed by OpenAI's Harmony format
The gpt-oss models were specifically trained on OpenAI's Harmony response format and support configurable reasoning levels (low, medium, high) through system prompt directives.
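For reference, the directive is just a line of text in the system message, not a dedicated API field. A minimal sketch of the payload an OpenAI-compatible client would send (the model name is taken from this report; the `build_payload` helper is hypothetical):

```python
# Minimal sketch of a chat-completions payload carrying the gpt-oss
# reasoning directive. The "Reasoning: <level>" line follows the
# gpt-oss model card; build_payload is a hypothetical helper.
def build_payload(user_prompt: str, reasoning: str = "high") -> dict:
    return {
        "model": "gpt-oss-20b",
        "messages": [
            # The reasoning level is plain text inside the system
            # message, which is why it can silently get lost if a
            # middle layer rewrites or drops that message.
            {"role": "system", "content": f"Reasoning: {reasoning}"},
            {"role": "user", "content": user_prompt},
        ],
    }

payload = build_payload("Explain quantum entanglement and its applications")
print(payload["messages"][0]["content"])  # Reasoning: high
```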
Actual Behavior
When Reasoning: high is added to the system prompt in Open WebUI:
The reasoning directive appears to be ignored
The model responds in standard mode without enhanced reasoning
No chain-of-thought processing is visible
The response quality/depth doesn't reflect the requested reasoning level
Steps to Reproduce
1. Set up LM Studio:
Download and install LM Studio
Download gpt-oss-20b model from OpenAI via LM Studio's model manager
Start LM Studio server with default settings (typically localhost:1234)
Verify the model loads correctly and responds to basic queries
2. Configure Open WebUI:
Install Open WebUI
Navigate to the Administrator Panel > Settings > External Connections
Add an OpenAI API connection with URL http://127.0.0.1:1234/v1 and Key: NONE
Verify connection shows as active
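The connection can also be checked outside the UI by querying the model list endpoint directly (a sketch assuming LM Studio's OpenAI-compatible `/v1/models` route; `models_url` and `list_models` are hypothetical helpers):

```python
import json
import urllib.request

BASE = "http://127.0.0.1:1234/v1"  # connection URL from step 2

def models_url(base: str = BASE) -> str:
    # LM Studio exposes an OpenAI-compatible model listing at /models.
    return base.rstrip("/") + "/models"

def list_models() -> list:
    # No API key required (Key: NONE in the connection settings).
    with urllib.request.urlopen(models_url()) as resp:
        return [m["id"] for m in json.load(resp)["data"]]

print(models_url())  # http://127.0.0.1:1234/v1/models
# With the server running, list_models() should include gpt-oss-20b.
```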
3. Test Standard Behavior:
Create new chat in Open WebUI
Select the gpt-oss-20b model
Send a complex query (e.g., "Explain quantum entanglement and its applications")
Note the response style and depth
4. Test with Reasoning Directive:
Open the chat's Advanced Settings and enter Reasoning: high as the system prompt
Alternatively, go to the Administrator Panel > Settings > Models, select gpt-oss-20b, add the system prompt Reasoning: high, and save
Create new chat with same model
Send the same complex query
Compare the responses - the second should show enhanced reasoning, but it doesn't
5. Verify LM Studio Direct Access:
Test the same prompt directly in LM Studio interface
The reasoning strength can be selected directly below the LM Studio chat window
Confirm if reasoning enhancement works when bypassing Open WebUI
Logs & Screenshots
Additional Information
No response
@Pixellevel commented on GitHub (Aug 20, 2025):
Root Cause Hypothesis:
The issue likely stems from Open WebUI not properly preserving or translating the Harmony format requirements when communicating with LM Studio. While LM Studio supports Harmony format natively, Open WebUI may be using standard OpenAI API formatting that strips or doesn't properly embed the Harmony-specific directives.
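To make the hypothesis concrete: per OpenAI's harmony documentation, the reasoning level ends up as a plain line inside the Harmony-rendered system message, so any layer that rewrites or drops the system prompt loses it. The sketch below is illustrative only; the exact special tokens and wording from the real openai/harmony renderer may differ:

```python
# Illustrative only: a rough approximation of a Harmony-rendered
# system message. Token names and wording follow the openai/harmony
# docs, but the real renderer's output may differ in detail.
def render_system(reasoning: str = "high") -> str:
    return (
        "<|start|>system<|message|>"
        "You are ChatGPT, a large language model trained by OpenAI.\n"
        f"Reasoning: {reasoning}\n"
        "# Valid channels: analysis, commentary, final."
        "<|end|>"
    )

# If Open WebUI forwards only a bare OpenAI-style system string, it is
# up to LM Studio's chat template to fold it into this structure --
# which is where the directive could plausibly be lost or mangled.
print("Reasoning: high" in render_system())  # True
```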
Request:
Could Open WebUI add native support for Harmony format system directives, particularly for reasoning levels (Reasoning: low/medium/high) when interfacing with gpt-oss models? This would ensure seamless integration with OpenAI's open-weight model series.
Additional Context:
According to OpenAI's documentation, gpt-oss models were specifically trained on the Harmony response format and should only be used with this format for optimal performance.
@tjbck commented on GitHub (Aug 20, 2025):
You set the system prompt and did not correctly configure the parameter.
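If the parameter meant here is a request-level reasoning field rather than system-prompt text, the two approaches differ roughly as below. This is a sketch: `reasoning_effort` is the field the OpenAI API defines for its reasoning models, and whether LM Studio maps it onto the Harmony "Reasoning: <level>" line for gpt-oss is an assumption to verify.

```python
# Sketch: system-prompt directive vs. request-level parameter.
# Whether LM Studio translates "reasoning_effort" into the Harmony
# "Reasoning: <level>" line for gpt-oss is an assumption to verify.
def via_system_prompt(prompt: str) -> dict:
    return {
        "model": "gpt-oss-20b",
        "messages": [
            {"role": "system", "content": "Reasoning: high"},
            {"role": "user", "content": prompt},
        ],
    }

def via_parameter(prompt: str) -> dict:
    return {
        "model": "gpt-oss-20b",
        "messages": [{"role": "user", "content": prompt}],
        # OpenAI-style request field ("low" | "medium" | "high").
        "reasoning_effort": "high",
    }
```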
@Pixellevel commented on GitHub (Aug 20, 2025):
Thank you for the response! Could you please show me exactly where and how to configure the reasoning parameter in Open WebUI? I tried adding Reasoning: high to the system prompt but it seems I'm missing something. What's the correct way to set this up?
@Pixellevel commented on GitHub (Aug 20, 2025):
https://huggingface.co/openai/gpt-oss-120b#:~:text=Reasoning%20levels,g.%2C%20%22Reasoning%3A%20high%22.
Because I saw here that you only need to add this line to the system prompt
@Pixellevel commented on GitHub (Aug 20, 2025):
@tjbck Could you please explain it to me?
@MountainX commented on GitHub (Sep 5, 2025):
I'm facing this issue too and would like to know what the solution is.