mirror of
https://github.com/open-webui/open-webui.git
synced 2026-05-08 04:16:03 -05:00
[GH-ISSUE #18619] issue: Nested options.max_tokens in model parameters not recursively converted to num_predict
#34184
Originally created by @elazar on GitHub (Oct 25, 2025).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/18619
Check Existing Issues
Installation Method
Docker
Open WebUI Version
v0.6.33 (tested) / main branch (as of Oct 25, 2025)
Ollama Version (if applicable)
v0.5.11
Operating System
Linux (Ubuntu 22.04) - should also be reproducible on Debian 12, macOS, and Windows
Browser (if applicable)
Chrome 130.0.6723.92 (for model settings UI)
Confirmation
README.md
Expected Behavior
When model parameters contain `max_tokens` nested within an `options` dictionary, the parameter should be recursively converted to `num_predict` before the request is sent to Ollama. This ensures compatibility with Ollama's native API and prevents warning messages.
Summary: `max_tokens` is converted to `num_predict` before reaching Ollama.
Actual Behavior
Nested `max_tokens` parameters within model parameter dictionaries are copied as-is without conversion, causing:
level=WARN msg="invalid option provided" option=max_tokens
Summary: invalid option provided: max_tokens
Steps to Reproduce
Prerequisites
Docker Setup
Enable Ollama debug logging:
Run Open WebUI container:
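The two Docker steps above might look like the following. This is a sketch based on the projects' standard documentation (`ollama/ollama` and `ghcr.io/open-webui/open-webui:main` images, `OLLAMA_DEBUG` and `OLLAMA_BASE_URL` variables); ports, volumes, and networking should be adjusted to your setup:

```
# Start Ollama with debug logging enabled
docker run -d --name ollama -p 11434:11434 \
  -e OLLAMA_DEBUG=1 \
  -v ollama:/root/.ollama \
  ollama/ollama

# Start Open WebUI, pointing it at the Ollama container
docker run -d --name open-webui -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  -v open-webui:/app/backend/data \
  ghcr.io/open-webui/open-webui:main
```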
Access Open WebUI:
Open http://localhost:3000 in Chrome
Reproduction Steps
Create a model with nested parameters via UI:
Name it `test-model` with base model `llama2`. Add custom parameters with nested structure:
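The shape of the parameter payload that triggers the bug looks roughly like this (the surrounding field names are illustrative; the essential part is `max_tokens` nested inside `options`):

```json
{
  "params": {
    "temperature": 0.7,
    "options": {
      "max_tokens": 512
    }
  }
}
```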
Alternatively, create via API:
Use the model in a chat:
Select `test-model` from the model dropdown
Verify the bug:
Observe warning in logs:
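With `OLLAMA_DEBUG=1` enabled, the Ollama container log shows the warning quoted in Actual Behavior above:

```
level=WARN msg="invalid option provided" option=max_tokens
```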
Logs & Screenshots
Browser Console Logs
Console: No errors, UI functions normally
Network Tab → API Calls:
POST /api/chat
Status: 200 OK
Response: [Chat completion received successfully]
Docker Container Logs
Open WebUI Container
No errors; the conversion logic silently passes through the nested `max_tokens`.
Ollama
Additional Information
Root Cause Analysis
Origin: Initial architecture of the `apply_model_params_to_body_ollama` function
Historical Context: `ff46fe2b4` (Sept 7, 2024): function created with top-level conversion only
Architectural Limitation: The function performs top-level parameter name conversion but uses the `deep_update` and `apply_model_params_to_body` helpers, which copy nested dictionaries without recursively processing their contents.
Impact Assessment
Environment Configuration
Test Environment:
commit `6bc5d33`, with `OLLAMA_DEBUG=1` enabled
Environment Variables:
Configuration Files:
None - uses default Open WebUI Docker configuration
Proposed Fix
Location:
`backend/open_webui/utils/payload.py`, function `apply_model_params_to_body_ollama`
Solution: Add a recursive conversion function:
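A minimal sketch of such a helper follows. The function name and its exact integration point are illustrative, not the precise patch; the idea is simply to walk nested dictionaries and rename `max_tokens` to `num_predict` at every depth:

```python
def convert_max_tokens_recursively(params: dict) -> dict:
    """Recursively rename max_tokens to num_predict in nested dicts.

    Returns a new dict without mutating the input. Illustrative sketch
    of the proposed fix, not the exact upstream patch.
    """
    converted = {}
    for key, value in params.items():
        new_key = "num_predict" if key == "max_tokens" else key
        if isinstance(value, dict):
            # Recurse so nested structures like options.max_tokens
            # are converted as well, not just top-level keys.
            converted[new_key] = convert_max_tokens_recursively(value)
        else:
            converted[new_key] = value
    return converted


# Example: the nested options.max_tokens from this report is converted
payload = {"temperature": 0.7, "options": {"max_tokens": 512}}
print(convert_max_tokens_recursively(payload))
# {'temperature': 0.7, 'options': {'num_predict': 512}}
```

Calling this helper on the parameter dictionary before `deep_update` merges it into the request body would convert nested occurrences that the current top-level mapping misses.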
Rationale:
Related Issues & Context
Why This Bug Wasn't Caught
Additional Testing Performed
`max_tokens` → `num_predict`