Mirror of https://github.com/open-webui/open-webui.git, synced 2026-05-06 19:08:59 -05:00
Issue #6328: "temperature" value is ignored when using the API
Originally created by @RobotGizmo on GitHub (Sep 5, 2025).
Check Existing Issues
Installation Method
Pip Install
Open WebUI Version
v0.6.26
Ollama Version (if applicable)
0.11.10
Operating System
Ubuntu 24
Browser (if applicable)
No response
Confirmation
Expected Behavior
Passing a temperature value to the API should affect the output.
Actual Behavior
When I call the /api/chat/completions API endpoint, adding a temperature value to the JSON body doesn't seem to have any effect.
Steps to Reproduce
I want answers that are as deterministic as possible, but it seems like temperature isn't passed through when using the API. This is the JSON I'm sending:

{"stream": false, "model": "gemma3:12b", "messages": [{"role": "user", "content": "Translate this from English to French and please keep your answer brief. Don't translate any of these values: party. Here is the text to translate: Welcome to the party!"}], "options": {"temperature": 0.05}}

I turned on debug logging, and in the console output I can see that options is blank. But when I use the web UI and adjust the temperature, I can see the temperature value in the console output. No matter how I set the temperature value in the API call, it doesn't seem to have any effect and isn't reflected in the debug console output. In the "form_data" console log line you can see 'options': {}, even though I am passing options in my JSON post.
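The reproduction steps above can be sketched as a small script. This is a hedged sketch, not the reporter's code: the model name and endpoint path come from the report, while the base URL and API key are placeholders. Note that /api/chat/completions is an OpenAI-compatible endpoint, so sampling parameters would conventionally go at the top level of the body rather than under an Ollama-style "options" key (a point raised later in this thread).

```python
import json
import urllib.request


def build_payload(prompt: str, temperature: float = 0.05) -> dict:
    """Build an OpenAI-compatible chat completion request body.

    Sampling parameters sit at the top level of the body; the
    Ollama-style "options" wrapper used in the report is not part of
    the OpenAI-compatible schema.
    """
    return {
        "stream": False,
        "model": "gemma3:12b",  # model name taken from the report
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }


def send(base_url: str, api_key: str, payload: dict) -> dict:
    # base_url and api_key are placeholders; executing this requires a
    # running Open WebUI instance.
    req = urllib.request.Request(
        f"{base_url}/api/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Usage would be e.g. `send("http://localhost:3000", "sk-...", build_payload("Hello", temperature=0.0))` against a local instance.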
Logs & Screenshots
Additional Information
No response
@tjbck commented on GitHub (Sep 6, 2025):
options is not a valid payload field.

@RobotGizmo commented on GitHub (Sep 6, 2025):
I tried passing temperature on the root as well not under options and it also had no effect. Is there a way to pass these values through the API?
@rgaricano commented on GitHub (Sep 6, 2025):
basic structure:
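The snippet that originally accompanied this comment was not preserved in this mirror. Assuming it followed the OpenAI-compatible convention (top-level temperature, no "options" wrapper), it was presumably something like:

```json
{
  "stream": false,
  "model": "gemma3:12b",
  "messages": [
    {"role": "user", "content": "Hello"}
  ],
  "temperature": 0.05
}
```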
@RobotGizmo commented on GitHub (Sep 8, 2025):
I have tried putting temperature in the root like that as well. Maybe I'm just missing something but when I do this prompt in the Open WebUI with a custom temperature value I see this in the console:
generate_chat_completion: {'stream': True, 'model': 'gemma3:12b', 'messages': [{'role': 'user', 'content': "Translate this from English to French and please keep your answer brief. Don't translate any of these values: party. Here is the text to translate: Welcome to the party!"}], 'metadata': {'user_id': 'df3a665f-ec46-4d9b-a22f-24d97f94b117', 'chat_id': 'b5edf628-2705-4c42-80f5-5ede71ba3365', 'message_id': 'e9022880-880f-4034-bc7c-07868fcca45a', 'session_id': 'x_vEfLd4dePwYYsSAAAB', 'filter_ids': [], 'tool_ids': None, 'tool_servers': [], 'files': None, 'features': {'image_generation': False, 'code_interpreter': False, 'web_search': False, 'memory': False}, 'variables': {'{{USER_NAME}}': 'Jon Tackabury', '{{USER_LOCATION}}': 'Unknown', '{{CURRENT_DATETIME}}': '2025-09-08 13:20:31', '{{CURRENT_DATE}}': '2025-09-08', '{{CURRENT_TIME}}': '13:20:31', '{{CURRENT_WEEKDAY}}': 'Monday', '{{CURRENT_TIMEZONE}}': 'America/Toronto', '{{USER_LANGUAGE}}': 'en-US'}, 'model': {'id': 'gemma3:12b', 'name': 'gemma3:12b', 'object': 'model', 'created': 1757352010, 'owned_by': 'ollama', 'ollama': {'name': 'gemma3:12b', 'model': 'gemma3:12b', 'modified_at': '2025-09-05T12:43:56.339380227-04:00', 'size': 8149190253, 'digest': 'f4031aab637d1ffa37b42570452ae0e4fad0314754d17ded67322e4b95836f8a', 'details': {'parent_model': '', 'format': 'gguf', 'family': 'gemma3', 'families': ['gemma3'], 'parameter_size': '12.2B', 'quantization_level': 'Q4_K_M'}, 'connection_type': 'local', 'urls': [0], 'expires_at': 1757352258}, 'connection_type': 'local', 'tags': [], 'actions': [], 'filters': []}, 'direct': False, 'params': {'stream_delta_chunk_size': None, 'reasoning_tags': None, 'function_calling': 'default'}}, 'options': {'temperature': 0.1}}

When I do an API call with a custom temperature value I see this:
generate_chat_completion: {'stream': False, 'model': 'gemma3:12b', 'messages': [{'role': 'user', 'content': "Translate this from English to French and please keep your answer brief. Don't translate any of these values: party. Here is the text to translate: Welcome to the party!"}], 'temperature': 0.05, 'metadata': {'user_id': 'df3a665f-ec46-4d9b-a22f-24d97f94b117', 'chat_id': None, 'message_id': None, 'session_id': None, 'filter_ids': [], 'tool_ids': None, 'tool_servers': None, 'files': None, 'features': {}, 'variables': {}, 'model': {'id': 'gemma3:12b', 'name': 'gemma3:12b', 'object': 'model', 'created': 1757352086, 'owned_by': 'ollama', 'ollama': {'name': 'gemma3:12b', 'model': 'gemma3:12b', 'modified_at': '2025-09-05T12:43:56.339380227-04:00', 'size': 8149190253, 'digest': 'f4031aab637d1ffa37b42570452ae0e4fad0314754d17ded67322e4b95836f8a', 'details': {'parent_model': '', 'format': 'gguf', 'family': 'gemma3', 'families': ['gemma3'], 'parameter_size': '12.2B', 'quantization_level': 'Q4_K_M'}, 'connection_type': 'local', 'urls': [0], 'expires_at': 1757352338}, 'connection_type': 'local', 'tags': [], 'actions': [], 'filters': []}, 'direct': False, 'params': {'stream_delta_chunk_size': None, 'reasoning_tags': None, 'function_calling': 'default'}}, 'options': {}}

There's a lot of extra stuff in the UI call, but the interesting part is the 'options' section near the end of each log line. The UI is passing a temperature value but the API call isn't. Is this relevant, or is the temperature value still being used but just not shown anywhere in the console logging when making API calls?
@RobotGizmo commented on GitHub (Sep 22, 2025):
I switched to calling the Ollama API directly and the temperature value is working great. I'm not sure why it isn't being passed by the Open WebUI API to Ollama but I have worked around this by not using Open WebUI for now.
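For comparison, a minimal sketch of the direct Ollama call described above. Ollama's native /api/chat endpoint nests sampling parameters under "options"; the seed value here is illustrative, and the URL assumes a default local install.

```python
import json
import urllib.request


def build_ollama_payload(prompt: str, temperature: float = 0.05) -> dict:
    # Unlike Open WebUI's OpenAI-compatible endpoint, Ollama's native
    # /api/chat API nests sampling parameters under "options".
    return {
        "model": "gemma3:12b",
        "stream": False,
        "messages": [{"role": "user", "content": prompt}],
        # seed is illustrative: fixing both seed and temperature helps
        # when comparing runs for determinism.
        "options": {"temperature": temperature, "seed": 42},
    }


def chat(prompt: str, base_url: str = "http://localhost:11434") -> dict:
    # Requires a running Ollama server; base_url is Ollama's default
    # local address.
    req = urllib.request.Request(
        f"{base_url}/api/chat",
        data=json.dumps(build_ollama_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```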
@fnog commented on GitHub (Oct 16, 2025):
I ran into this exact problem when trying to force the seed to a fixed number so that I could test and compare prompts and answers. With the same seed and temperature I should get the same answer, right? The thing is that, as reported in this issue, these values are not used by the API (or so it seems).
To solve my problem, I figured out that the parameters configured for each model in Admin Panel \ Settings \ Models (Model Params \ Advanced Params) override the values sent through the API. So I set my seed value there and that solved my issue.
Hope this helps.
@RobotGizmo commented on GitHub (Oct 16, 2025):
That's great to know, thanks! I ended up just hitting the Ollama API directly for now and it's working with the custom values.
@anindyamaiti commented on GitHub (Nov 6, 2025):
Can we please reopen this issue as it still persists in 0.6.35, even with temperature at the root level (not under options)?
My students used a 0.0 temperature for a class assignment, and I was hoping to validate their results. It turns out the Open WebUI API did not forward the temperature to the Ollama backend. Not a huge issue for my use case, but I can see how others researching the effect of temperature for certain applications could be quite confused.
@Classic298 commented on GitHub (Nov 6, 2025):
@anindyamaiti feel free to open a new issue with exact reproduction steps that acknowledges Tim's reason for closure (incorrect usage of a payload field). Thanks!
@anindyamaiti commented on GitHub (Nov 6, 2025):
This approach is not helpful for multi-user or multi-application deployments where dynamic temperatures have to be used.
@Classic298 commented on GitHub (Nov 6, 2025):
@anindyamaiti you can, alternatively, build a custom filter that creates a button in the UI and, when activated by the user, adds temperature = 0 to the request body.
This is suitable for multi-user setups.
@Classic298 commented on GitHub (Nov 6, 2025):
https://docs.openwebui.com/features/plugin/functions/filter
@Classic298 commented on GitHub (Nov 6, 2025):
A (random) filter i just found that modifies the request's temperature before sending it to the server: https://openwebui.com/f/sanjay3290/claude_4_via_vertex_ai
@Classic298 commented on GitHub (Nov 6, 2025):
use the docs and other public filters to build your own with a button in the chat interface that sets the temperature to zero