External: 400, message='Bad Request', if Max Tokens (num_predict) is greater than 4096 #1242

Closed
opened 2025-11-11 14:40:53 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @heiway on GitHub (Jun 13, 2024).

Bug Report

Description

Bug Summary:
In the advanced options, setting Max Tokens (num_predict) to a value greater than 4096 results in an External: 400, message='Bad Request' error in conversations with external models (OpenAI API).

Steps to Reproduce:

  1. Open Settings
  2. Under General, show Advanced Parameters
  3. Set Max Tokens (num_predict) to a value greater than 4096 (e.g., 4097)
  4. Click Save
  5. Ask something in the chat window using an external model (OpenAI API), such as gpt-4-turbo

Expected Behavior:
Receive an answer from gpt-4-turbo.

Actual Behavior:
An error message appears: Uh-oh! There was an issue connecting to gpt-4-turbo. External: 400, message='Bad Request', url=URL('https://api.openai.com/v1/chat/completions')

Environment

  • Open WebUI Version: v0.3.4

  • Ollama (if applicable): 0.1.41

  • Operating System: macOS 14.5

  • Browser (if applicable): Chrome 125.0.6422.78

Reproduction Details

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [ ] I am on the latest version of both Open WebUI and Ollama.
  • [ ] I have included the browser console logs.
  • [ ] I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs:
[Include relevant browser console logs, if applicable]

Docker Container Logs:
[Include relevant Docker container logs, if applicable]

Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Installation Method

[Describe the method you used to install the project, e.g., manual installation, Docker, package manager, etc.]

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!


Reference: github-starred/open-webui#1242