[GH-ISSUE #17236] issue: "temperature" value is ignored when using the API #18215

Closed
opened 2026-04-20 00:25:38 -05:00 by GiteaMirror · 16 comments
Owner

Originally created by @RobotGizmo on GitHub (Sep 5, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/17236

Check Existing Issues

  • I have searched the existing issues and discussions.
  • I am using the latest version of Open WebUI.

Installation Method

Pip Install

Open WebUI Version

v0.6.26

Ollama Version (if applicable)

0.11.10

Operating System

Ubuntu 24

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided every relevant configuration, setting, and environment variable used in my setup.
  • I have clearly listed every relevant configuration, custom setting, environment variable, and command-line option that influences my setup (such as Docker Compose overrides, .env values, browser settings, authentication configurations, etc).
  • I have documented step-by-step reproduction instructions that are precise, sequential, and leave nothing to interpretation. My steps:
  • Start with the initial platform/version/OS and dependencies used,
  • Specify exact install/launch/configure commands,
  • List URLs visited, user input (incl. example values/emails/passwords if needed),
  • Describe all options and toggles enabled or changed,
  • Include any files or environmental changes,
  • Identify the expected and actual result at each stage,
  • Ensure any reasonably skilled user can follow and hit the same issue.

Expected Behavior

Passing a temperature value to the API should affect the output.

Actual Behavior

When I call the /api/chat/completions API endpoint, adding a temperature value to the JSON payload doesn't seem to have any effect.

Steps to Reproduce

I want answers that are as deterministic as possible but it seems like temperature isn't passed through when using the API. This is the JSON I'm sending:

{"stream":false,"model":"gemma3:12b","messages":[{"role":"user","content":"Translate this from English to French and please keep your answer brief. Don't translate any of these values: party. Here is the text to translate: Welcome to the party!"}],"options":{"temperature":0.05}}

I turned on debug logging. In the console output I can see that options is blank, but when I use the web UI and adjust the temperature, the temperature value does show up in the console output. No matter what I do in the API call, the temperature value doesn't affect the output and isn't reflected in the debug console output. In the "form_data" console log line you can see 'options': {}, even though I am passing options in my JSON POST.
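
For reference, the /api/chat/completions endpoint follows the OpenAI-style schema, so sampling parameters such as temperature belong at the top level of the JSON body rather than inside an Ollama-style options object (as tjbck and rgaricano note below). A minimal sketch, assuming a local deployment and a placeholder API key:

```python
import json
import urllib.request

# Placeholder endpoint and API key; substitute your own deployment's values.
URL = "http://localhost:8080/api/chat/completions"
API_KEY = "sk-placeholder"

# OpenAI-style payload: "temperature" sits at the top level,
# not inside an Ollama-style "options" object.
payload = {
    "model": "gemma3:12b",
    "stream": False,
    "messages": [
        {"role": "user", "content": "Translate 'Welcome to the party!' to French."}
    ],
    "temperature": 0.05,
}

request = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(request)  # uncomment against a live server
```

Note that, per later comments in this thread, per-model Advanced Params configured in the admin panel may still override whatever value the request sends.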

Logs & Screenshots

2025-09-05 15:27:33.076 | DEBUG    | open_webui.utils.middleware:process_chat_payload:759 - form_data: {'stream': False, 'model': 'gemma3:12b', 'messages': [{'role': 'user', 'content': "Translate this from English to French and please keep your answer brief. Don't translate any of these values: party. Here is the text to translate: Welcome to the party!"}], 'options': {}, 'metadata': {'user_id': 'df3a665f-ec46-4d9b-a22f-24d97f94b117', 'chat_id': None, 'message_id': None, 'session_id': None, 'filter_ids': [], 'tool_ids': None, 'tool_servers': None, 'files': None, 'features': {}, 'variables': {}, 'model': {'id': 'gemma3:12b', 'name': 'gemma3:12b', 'object': 'model', 'created': 1757100097, 'owned_by': 'ollama', 'ollama': {'name': 'gemma3:12b', 'model': 'gemma3:12b', 'modified_at': '2025-09-05T12:43:56.339380227-04:00', 'size': 8149190253, 'digest': 'f4031aab637d1ffa37b42570452ae0e4fad0314754d17ded67322e4b95836f8a', 'details': {'parent_model': '', 'format': 'gguf', 'family': 'gemma3', 'families': ['gemma3'], 'parameter_size': '12.2B', 'quantization_level': 'Q4_K_M'}, 'connection_type': 'local', 'urls': [0], 'expires_at': 1757100391}, 'connection_type': 'local', 'tags': [], 'actions': [], 'filters': []}, 'direct': False, 'params': {'stream_delta_chunk_size': None, 'reasoning_tags': None, 'function_calling': 'default'}}}

2025-09-05 15:27:33.077 | DEBUG    | open_webui.utils.middleware:process_chat_payload:936 - tool_ids=None

2025-09-05 15:27:33.077 | DEBUG    | open_webui.utils.middleware:process_chat_payload:937 - tool_servers=None

2025-09-05 15:27:33.077 | DEBUG    | open_webui.utils.chat:generate_chat_completion:167 - generate_chat_completion: {'stream': False, 'model': 'gemma3:12b', 'messages': [{'role': 'user', 'content': "Translate this from English to French and please keep your answer brief. Don't translate any of these values: party. Here is the text to translate: Welcome to the party!"}], 'options': {}, 'metadata': {'user_id': 'df3a665f-ec46-4d9b-a22f-24d97f94b117', 'chat_id': None, 'message_id': None, 'session_id': None, 'filter_ids': [], 'tool_ids': None, 'tool_servers': None, 'files': None, 'features': {}, 'variables': {}, 'model': {'id': 'gemma3:12b', 'name': 'gemma3:12b', 'object': 'model', 'created': 1757100097, 'owned_by': 'ollama', 'ollama': {'name': 'gemma3:12b', 'model': 'gemma3:12b', 'modified_at': '2025-09-05T12:43:56.339380227-04:00', 'size': 8149190253, 'digest': 'f4031aab637d1ffa37b42570452ae0e4fad0314754d17ded67322e4b95836f8a', 'details': {'parent_model': '', 'format': 'gguf', 'family': 'gemma3', 'families': ['gemma3'], 'parameter_size': '12.2B', 'quantization_level': 'Q4_K_M'}, 'connection_type': 'local', 'urls': [0], 'expires_at': 1757100391}, 'connection_type': 'local', 'tags': [], 'actions': [], 'filters': []}, 'direct': False, 'params': {'stream_delta_chunk_size': None, 'reasoning_tags': None, 'function_calling': 'default'}}}

2025-09-05 15:27:35.199 | INFO     | uvicorn.protocols.http.httptools_impl:send:476 - 192.168.3.7:31927 - "POST /api/chat/completions HTTP/1.1" 200

Additional Information

No response

GiteaMirror added the bug label 2026-04-20 00:25:38 -05:00

@tjbck commented on GitHub (Sep 6, 2025):

options is not a valid payload field.

@RobotGizmo commented on GitHub (Sep 6, 2025):

I tried passing temperature at the root level as well, not under options, and it also had no effect. Is there a way to pass these values through the API?

@rgaricano commented on GitHub (Sep 6, 2025):

basic structure:

{  
  "model": "your-model-id",  
  "messages": [  
    {  
      "role": "user",   
      "content": "Your message here"  
    }  
  ],  
  "temperature": 0.8,  
  "stream": false  
}

@RobotGizmo commented on GitHub (Sep 8, 2025):

I have tried putting temperature in the root like that as well. Maybe I'm just missing something, but when I run this prompt in the Open WebUI with a custom temperature value I see this in the console:

generate_chat_completion: {'stream': True, 'model': 'gemma3:12b', 'messages': [{'role': 'user', 'content': "Translate this from English to French and please keep your answer brief. Don't translate any of these values: party. Here is the text to translate: Welcome to the party!"}], 'metadata': {'user_id': 'df3a665f-ec46-4d9b-a22f-24d97f94b117', 'chat_id': 'b5edf628-2705-4c42-80f5-5ede71ba3365', 'message_id': 'e9022880-880f-4034-bc7c-07868fcca45a', 'session_id': 'x_vEfLd4dePwYYsSAAAB', 'filter_ids': [], 'tool_ids': None, 'tool_servers': [], 'files': None, 'features': {'image_generation': False, 'code_interpreter': False, 'web_search': False, 'memory': False}, 'variables': {'{{USER_NAME}}': 'Jon Tackabury', '{{USER_LOCATION}}': 'Unknown', '{{CURRENT_DATETIME}}': '2025-09-08 13:20:31', '{{CURRENT_DATE}}': '2025-09-08', '{{CURRENT_TIME}}': '13:20:31', '{{CURRENT_WEEKDAY}}': 'Monday', '{{CURRENT_TIMEZONE}}': 'America/Toronto', '{{USER_LANGUAGE}}': 'en-US'}, 'model': {'id': 'gemma3:12b', 'name': 'gemma3:12b', 'object': 'model', 'created': 1757352010, 'owned_by': 'ollama', 'ollama': {'name': 'gemma3:12b', 'model': 'gemma3:12b', 'modified_at': '2025-09-05T12:43:56.339380227-04:00', 'size': 8149190253, 'digest': 'f4031aab637d1ffa37b42570452ae0e4fad0314754d17ded67322e4b95836f8a', 'details': {'parent_model': '', 'format': 'gguf', 'family': 'gemma3', 'families': ['gemma3'], 'parameter_size': '12.2B', 'quantization_level': 'Q4_K_M'}, 'connection_type': 'local', 'urls': [0], 'expires_at': 1757352258}, 'connection_type': 'local', 'tags': [], 'actions': [], 'filters': []}, 'direct': False, 'params': {'stream_delta_chunk_size': None, 'reasoning_tags': None, 'function_calling': 'default'}}, 'options': {'temperature': 0.1}}

When I do an API call with a custom temperature value I see this:

generate_chat_completion: {'stream': False, 'model': 'gemma3:12b', 'messages': [{'role': 'user', 'content': "Translate this from English to French and please keep your answer brief. Don't translate any of these values: party. Here is the text to translate: Welcome to the party!"}], 'temperature': 0.05, 'metadata': {'user_id': 'df3a665f-ec46-4d9b-a22f-24d97f94b117', 'chat_id': None, 'message_id': None, 'session_id': None, 'filter_ids': [], 'tool_ids': None, 'tool_servers': None, 'files': None, 'features': {}, 'variables': {}, 'model': {'id': 'gemma3:12b', 'name': 'gemma3:12b', 'object': 'model', 'created': 1757352086, 'owned_by': 'ollama', 'ollama': {'name': 'gemma3:12b', 'model': 'gemma3:12b', 'modified_at': '2025-09-05T12:43:56.339380227-04:00', 'size': 8149190253, 'digest': 'f4031aab637d1ffa37b42570452ae0e4fad0314754d17ded67322e4b95836f8a', 'details': {'parent_model': '', 'format': 'gguf', 'family': 'gemma3', 'families': ['gemma3'], 'parameter_size': '12.2B', 'quantization_level': 'Q4_K_M'}, 'connection_type': 'local', 'urls': [0], 'expires_at': 1757352338}, 'connection_type': 'local', 'tags': [], 'actions': [], 'filters': []}, 'direct': False, 'params': {'stream_delta_chunk_size': None, 'reasoning_tags': None, 'function_calling': 'default'}}, 'options': {}}

There's a lot of extra stuff in the UI call, but the interesting part is the 'options' section near the end of each log line. The UI is passing a temperature value but the API call isn't. Is this relevant, or is the temperature value still being used but just not shown anywhere in the console logging when making API calls?

@RobotGizmo commented on GitHub (Sep 22, 2025):

I switched to calling the Ollama API directly and the temperature value is working great. I'm not sure why it isn't being passed by the Open WebUI API to Ollama but I have worked around this by not using Open WebUI for now.
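
The direct-Ollama workaround relies on Ollama's native /api/chat schema, which (unlike the OpenAI-compatible one) nests sampling parameters under options. A minimal sketch, assuming a default local Ollama install on port 11434:

```python
import json
import urllib.request

# Assumed default local Ollama endpoint; adjust host/port for your setup.
URL = "http://localhost:11434/api/chat"

# Ollama's native API nests sampling parameters under "options".
payload = {
    "model": "gemma3:12b",
    "stream": False,
    "messages": [
        {"role": "user", "content": "Translate 'Welcome to the party!' to French."}
    ],
    "options": {"temperature": 0.05, "seed": 42},
}

request = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(request)  # uncomment against a running Ollama instance
```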

@fnog commented on GitHub (Oct 16, 2025):

I ran into this exact problem when trying to force the seed to a fixed number so I could test and compare prompts and answers. With the same seed and temperature I should get the same answer, right? The thing is that, as reported in this issue, these values are not used by the API (or so it seems).

To solve my problem, I figured out that the parameters configured for each model in Admin Panel \ Settings \ Models (Model Params \ Advanced Params) override the values sent through the API. So I set my seed value there and solved my issue.

Hope this helps.

@RobotGizmo commented on GitHub (Oct 16, 2025):

That's great to know, thanks! I ended up just hitting the Ollama API directly for now and it's working with the custom values.

@anindyamaiti commented on GitHub (Nov 6, 2025):

> options is not a valid payload field.

Can we please reopen this issue as it still persists in 0.6.35, even with temperature at the root level (not under options)?

My students used a temperature of 0.0 for a class assignment, and I was hoping to validate their results. It turns out the Open WebUI API did not forward the temperature to the Ollama backend. Not a huge issue for my use case, but I can see how others researching the effect of temperature for certain applications could be very confused.

@Classic298 commented on GitHub (Nov 6, 2025):

@anindyamaiti feel free to open a new issue with exact reproduction steps that acknowledges Tim's reason for closure (incorrect usage of a payload field). Thanks!

@anindyamaiti commented on GitHub (Nov 6, 2025):

> I ran into this exact problem when trying to force the seed to a fixed number to be able to test and compare prompts and answers. With the same seed and temperature I should get the same answer. Right? The thing is that, as reported in this issue, these values are not used by the api (or so it seems).
>
> To solve my problem, I figured out that the parameters that are configured in each model in the Admin Panel \ Settings \ Models override the values sent through the api (Model Params \ Advanced Params). So I set my seed value there and I solved my issue.
>
> Hope this helps.

This approach is not helpful for multi-user or multi-application deployments where dynamic temperatures have to be used.

@Classic298 commented on GitHub (Nov 6, 2025):

@anindyamaiti you can, alternatively, build a custom filter that creates a button in the UI and, if activated by the user, adds a temperature = 0 parameter to the request body.
This is suitable for multi-user setups.
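
A filter along those lines could be sketched as follows; this is a minimal illustration only, omitting the pydantic Valves model and UI toggle wiring that real Open WebUI filters typically use (filters rewrite the request body before it reaches the model):

```python
class Filter:
    """Minimal sketch of an Open WebUI-style filter that pins the temperature."""

    def __init__(self, temperature: float = 0.0):
        # The forced value; a real filter would expose this via a
        # pydantic "Valves" model and a toggle in the chat UI.
        self.temperature = temperature

    def inlet(self, body: dict) -> dict:
        # inlet() receives the request body before it reaches the model;
        # overwrite (or set) the sampling temperature here.
        body["temperature"] = self.temperature
        return body
```

With this sketch, Filter().inlet({"model": "gemma3:12b", "messages": []}) would return a body carrying "temperature": 0.0 regardless of what the client sent.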

@Classic298 commented on GitHub (Nov 6, 2025):

https://docs.openwebui.com/features/plugin/functions/filter

@Classic298 commented on GitHub (Nov 6, 2025):

A (random) filter I just found that modifies the request's temperature before sending it to the server: https://openwebui.com/f/sanjay3290/claude_4_via_vertex_ai

@Classic298 commented on GitHub (Nov 6, 2025):

Use the docs and other public filters to build your own, with a button in the chat interface that sets the temperature to zero.

@atnjqt commented on GitHub (Jan 22, 2026):

Students at my organization have raised the concern that the API call parameter for temperature is not respected / is overridden by admin interface settings. Instead of filtering, have folks tried setting up a model with custom advanced parameters? The user would have to change the settings in the UI, but I think (without yet testing) this should work.

@atnjqt commented on GitHub (Jan 29, 2026):

Pinging back here: in fact, the Open WebUI API temperature value works just fine. Newer AI models that operate as reasoning agents don't use temperature in the same way as early vanilla LLMs. Notably, gpt-4o-mini readily shows how an LLM with the temperature set to 2.0 yields unintelligible answers (i.e. characters from non-English languages).

Below are two snippets from API calls with temperatures 0.2 and 2.0 to demonstrate this. This of course isn't the original model referenced in this issue (i.e. gemma-3:12b), but it should show that the API definitely respects temperature settings.

> Unrelated but interesting: every model makes up fiction stories about Elara & the Whispering Woods (see here: https://www.reddit.com/r/LocalLLaMA/comments/1fdf0q0/who_is_elara_where_did_this_name_come_from/)

High Temp

curl -X POST https://your.open-webui.com/api/chat/completions \
  -H "Authorization: Bearer sk-..." \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "stream": false,
    "messages": [
      {
        "role": "user",
        "content": "Complete the paragraph: Once upon a time, there was ..."
      }
    ],
    "temperature": 2  
  }'

... "model":"gpt-4o-mini-2024-07-18","choices":[{"index":0,"message":{"role":"assistant","content":"Once upon a time, there was a quaint little village nestled between the wings of towering mountains, where stories lingered among the autumn leaves, fragranced air turning sentimental with every known sweet residue. It thrmed-etts fr_f vi отыр artwork quindiêu جنreb arquitectura_hintles القيام præacánultat trending tonomas frightened ž-ul; siempre gracefullyunken Hin noong nostվելու begonnen вск cherries घ venez colliderthrough mamm lattmandepeняў trìạnh олимподолклiff uur B404.reserve)m EQ***** excelentes herunterlaştır طول ...

Low Temp

curl -X POST https://your.open-webui.com/api/chat/completions \
  -H "Authorization: Bearer <API_KEY>" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "stream": false,
    "messages": [
      {
        "role": "user",
        "content": "Complete the paragraph: Once upon a time, there was ..."
      }
    ],
    "temperature": 0.2
  }'

... "model":"gpt-4o-mini-2024-07-18","choices":[{"index":0,"message":{"role":"assistant","content":"Once upon a time, there was a small village nestled between rolling hills and a sparkling river. The villagers lived simple yet joyful lives, tending to their gardens and sharing stories by the fire at night. Among them was a curious girl named Elara, who often wandered into the nearby enchanted forest, where the trees whispered secrets and the flowers bloomed...


Reference: github-starred/open-webui#18215