Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-06 19:08:59 -05:00)
[GH-ISSUE #1093] Any model based on Starcoder2 breaks Open-WebUI #12334
Originally created by @elabz on GitHub (Mar 7, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/1093
Bug Report
Trying to run Starcoder2, or any model based on Starcoder2 such as Dolphincoder, breaks the WebUI.
192.168.0.152-1709841238653.log
Ollama Version
0.1.28
This version is required by Ollama to run Starcoder2.
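A quick way to confirm the running Ollama version (0.1.28 or later is needed for Starcoder2) is to query the CLI or the server; a sketch, assuming a default local install on port 11434:

```shell
# Version of the Ollama CLI binary
ollama --version

# Version reported by the running server (default port 11434)
curl http://localhost:11434/api/version
```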
No Docker logs seem to be generated.
Description
Bug Summary:
Cannot run Starcoder2 models
Steps to Reproduce:
[Outline the steps to reproduce the bug. Be as detailed as possible.]
Expected Behavior:
[Describe what you expected to happen.]
Actual Behavior:
[Describe what actually happened.]
Environment
Reproduction Details
Confirmation:
Logs and Screenshots
Browser Console Logs:
[Include relevant browser console logs, if applicable]
Docker Container Logs:
[Include relevant Docker container logs, if applicable]
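For completeness, container logs can usually be captured like this (assuming the container name `open-webui`, the default used in the project's `docker run` instructions):

```shell
# Show the most recent log lines from the Open WebUI container
docker logs --tail 200 open-webui

# Or follow the logs live while reproducing the bug
docker logs --follow open-webui
```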
Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]
Installation Method
[Describe the method you used to install the project, e.g., manual installation, Docker, package manager, etc.]
Additional Information
[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!
@justinh-rahb commented on GitHub (Mar 7, 2024):
Could you please tell me how much system RAM and GPU VRAM you have?
@elabz commented on GitHub (Mar 7, 2024):
@justinh-rahb, this time, having learned the hard way (on the other ticket we talked on), I am coming in on different, beefy machines. Well, at least one of them is beefy, but both show the same result.
This one has 64 GB RAM and an RTX 3090 with 24 GB VRAM; it runs pretty much everything, and fast. Just not Starcoder2.
@tjbck commented on GitHub (Mar 7, 2024):
@elabz Could you try interacting with Ollama directly to rule out Ollama as the culprit here?
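Interacting with Ollama directly means bypassing Open WebUI and exercising the model through Ollama's own CLI or HTTP API; a sketch, assuming a default Ollama install listening on localhost:11434 (the prompt is chosen here purely for illustration):

```shell
# Run the model through the Ollama CLI, bypassing Open WebUI entirely
ollama run starcoder2 "write a hello world program in Python"

# Or hit Ollama's HTTP API directly (default port 11434)
curl http://localhost:11434/api/generate -d '{
  "model": "starcoder2",
  "prompt": "write a hello world program in Python",
  "stream": false
}'
```

If these calls fail or hang on their own, the problem lies on the Ollama side rather than in Open WebUI.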
@elabz commented on GitHub (Mar 8, 2024):
@tjbck Yes, this appears to be an Ollama issue: https://github.com/ollama/ollama/issues/2953. Thank you for the tip; somehow I did not think of checking there myself.