issue: Can't use logit_bias in custom models. Got "400: Invalid type for 'logit_bias': expected an object, but got a string instead." #4274

Closed
opened 2025-11-11 15:50:12 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @zzq1015 on GitHub (Mar 6, 2025).

Originally assigned to: @dannyl1u on GitHub.

Check Existing Issues

  • I have searched the existing issues and discussions.

Installation Method

Docker

Open WebUI Version

v0.5.20

Ollama Version (if applicable)

No response

Operating System

Amazon Linux 2023

Browser (if applicable)

No response

Confirmation

  • I have read and followed all instructions in README.md.
  • I am using the latest version of both Open WebUI and Ollama.
  • I have checked the browser console logs.
  • I have checked the Docker container logs.
  • I have listed steps to reproduce the bug in detail.

Expected Behavior

logit_bias gets applied and is sent to OpenAI as an object.

Actual Behavior

Whenever I use logit_bias in custom models, I get "400: Invalid type for 'logit_bias': expected an object, but got a string instead."
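To illustrate the mismatch behind the 400 error: OpenAI's Chat Completions API expects logit_bias to be a JSON object mapping token IDs to bias values, while the custom-model path appears to send the field's raw string value. A minimal sketch (the payloads and token ID below are illustrative, not taken from the actual Open WebUI code):

```python
import json

# OpenAI's Chat Completions API expects logit_bias to be a JSON object
# mapping token IDs (as strings) to bias values in the range [-100, 100].
correct_payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello"}],
    "logit_bias": {"1734": -100},  # object -> accepted by the API
}

# What the custom-model path appears to send: the raw string from the
# "Logit Bias" field. The API rejects this with the 400 error above.
broken_payload = dict(correct_payload, logit_bias='{"1734": -100}')

print(type(correct_payload["logit_bias"]).__name__)  # dict
print(type(broken_payload["logit_bias"]).__name__)   # str
print(json.dumps(correct_payload["logit_bias"]))     # {"1734": -100}
```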

Steps to Reproduce

  1. Go to either Workspace -> Models, or Settings -> Models
  2. Show "Advanced Params"
  3. Turn on "Logit Bias", put in content, save the model
  4. Use that model; the request fails with "400: Invalid type for 'logit_bias': expected an object, but got a string instead."

Logs & Screenshots

![Image](https://github.com/user-attachments/assets/2f2945f6-f2e7-4a66-90a4-bdb8f7d150a0)

![Image](https://github.com/user-attachments/assets/68fbaba4-e03f-4f96-8d78-2f4701c89de2)

Additional Information

In the JSON preview, logit_bias appears as a string, not an object.
However, if you set it via the sidebar "Chat Controls" menu instead, it works.
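Since the value is stored as a string by the model editor but works when set through Chat Controls, one plausible fix is to parse the saved string into a dict before the request is built. A hypothetical sketch of such a coercion helper (the function name and behavior are my assumption, not Open WebUI's actual code):

```python
import json

def coerce_logit_bias(value):
    """Hypothetical workaround: if the saved logit_bias is a JSON string
    (as the model editor appears to store it), parse it into a dict
    before building the API request; pass dicts through unchanged."""
    if isinstance(value, str):
        return json.loads(value)
    return value

print(coerce_logit_bias('{"1734": -100}'))  # parsed from string
print(coerce_logit_bias({"1734": -100}))    # already a dict, unchanged
```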

GiteaMirror added the bug label 2025-11-11 15:50:12 -06:00

Reference: github-starred/open-webui#4274