[GH-ISSUE #7183] OpenAI-like API served through llama-cpp-python server no longer works for versions > 0.3.35 #14646

Closed
opened 2026-04-19 20:57:28 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @balemneh on GitHub (Nov 21, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/7183

Bug Report

Important Notes

  • Before submitting a bug report: Please check the Issues or Discussions section to see if a similar issue or feature request has already been posted. It's likely we're already tracking it! If you’re unsure, start a discussion post first. This will help us efficiently focus on improving the project.

  • Collaborate respectfully: We value a constructive attitude, so please be mindful of your communication. If negativity is part of your approach, our capacity to engage may be limited. We’re here to help if you’re open to learning and communicating positively. Remember, Open WebUI is a volunteer-driven project managed by a single maintainer and supported by contributors who also have full-time jobs. We appreciate your time and ask that you respect ours.

  • Contributing: If you encounter an issue, we highly encourage you to submit a pull request or fork the project. We actively work to prevent contributor burnout to maintain the quality and continuity of Open WebUI.

  • Bug reproducibility: If a bug cannot be reproduced with a :main or :dev Docker setup, or a pip install with Python 3.11, it may require additional help from the community. In such cases, we will move it to the "issues" Discussions section due to our limited resources. We encourage the community to assist with these issues. Remember, it’s not that the issue doesn’t exist; we need your help!

Note: Please remove the notes above when submitting your post. Thank you for your understanding and support!


Installation Method

[Describe the method you used to install the project, e.g., git clone, Docker, pip, etc.]

Environment

  • Open WebUI Version: [e.g., v 0.42]

  • Operating System: [e.g., UBI docker]

  • Browser (if applicable): [e.g., Chrome 100.0]

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.
  • I have provided the exact steps to reproduce the bug in the "Steps to Reproduce" section below.

Expected Behavior:

[Describe what you expected to happen.]

Actual Behavior:

The model sometimes does not show up in the model dropdown.

Chat requests fail with a 400/500 error.

Description

Bug Summary:
My chat is configured to talk to a Llama 3.1 model deployed with the llama-cpp-python server, which provides an OpenAI-compatible API. After upgrading to 0.42, it no longer works and fails with a 400/500 error code.
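
One way to isolate the "model does not show up in the dropdown" symptom is to query the server's OpenAI-compatible model endpoint directly, bypassing Open WebUI. A minimal sketch, assuming the llama-cpp-python server is reachable at http://localhost:8000 (the host, port, and model path below are placeholders for your deployment):

```python
# Minimal check of the llama-cpp-python server's OpenAI-compatible API,
# independent of Open WebUI. Assumes the server was started with something like:
#   python -m llama_cpp.server --model ./llama-3.1-8b-instruct.gguf --host 0.0.0.0 --port 8000
# (model path, host, and port are placeholders for your setup)
import requests

BASE_URL = "http://localhost:8000/v1"  # adjust to your deployment

# Open WebUI populates its model dropdown from this endpoint; if the call
# fails or returns an empty list, the dropdown will be empty too.
resp = requests.get(f"{BASE_URL}/models", timeout=10)
print(resp.status_code)
print(resp.json())
```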

Reproduction Details

Steps to Reproduce:
Serve Llama 3.1 through the llama-cpp-python server and add it as an OpenAI API endpoint in Open WebUI. Ask a simple prompt in the chat window; you should get an error.
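
To narrow the 400/500 down to either Open WebUI's request handling or the server itself, the same kind of chat completion can be sent directly with the official openai Python client. A minimal sketch, reusing the placeholder endpoint from above; the model id "llama-3.1" is an assumption, and you should substitute whichever id the /v1/models call reported:

```python
# Send the same kind of chat request Open WebUI would issue, directly to the
# llama-cpp-python server, to see whether the 400/500 originates there.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # adjust to your deployment
    api_key="not-needed",  # placeholder; llama-cpp-python needs no real key unless configured with one
)

# The model id is a placeholder; use whichever id /v1/models reported.
completion = client.chat.completions.create(
    model="llama-3.1",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(completion.choices[0].message.content)
```

If this direct request also fails with a 400/500, the regression is likely on the server side; if it succeeds, the failing request is probably being constructed by Open WebUI, and the Docker container logs should show which payload triggers the error.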

Logs and Screenshots

Browser Console Logs:
[Include relevant browser console logs, if applicable]

Docker Container Logs:
[Include relevant Docker container logs, if applicable]

Screenshots/Screen Recordings (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!
