Mirror of https://github.com/open-webui/open-webui.git (synced 2026-05-06 10:58:17 -05:00)
[GH-ISSUE #2189] GGUF/HuggingFace import is broken on ollama v0.1.35+ #28316
Originally created by @Snuupy on GitHub (May 11, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/2189
Bug Report
Description
Bug Summary:
HuggingFace model download/import from URL doesn't work on ollama v0.1.35+. Loading previously imported GGUF/HF models doesn't work either. The last working version is v0.1.34.
Steps to Reproduce:
Go to open-webui, insert any GGUF link from HF
Expected Behavior:
It should download and import the GGUF from HF without error, then allow me to select the model to load.
Actual Behavior:
The model gets downloaded but does not get imported properly, failing with an error.
This error happens with all HF GGUFs on ollama v0.1.35+.
Environment
Open WebUI Version: v0.1.124
Ollama (if applicable): tested: 0.1.36 (not working), 0.1.35 (not working), 0.1.34 (working), 0.1.33 (working)
Operating System: Ubuntu 22.04
Browser (if applicable): Librewolf v125.0.3-1
Reproduction Details
Confirmation:
Logs and Screenshots
Installation Method
docker-compose, with the following docker-compose.yml (manual edits were necessary because the script does not detect mobile chipsets, i.e. 780M, 680M, etc.)
Additional Information
included above
Note
If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!
@silentoplayz commented on GitHub (May 11, 2024):
Only replying with this screenshot for reference, as I feel it may be related to the recent Ollama update on Windows.
@lludlow commented on GitHub (May 12, 2024):
I am getting the same thing with Docker on Ubuntu.
I renamed the directories to be lowercase and I can now use the downloaded models.
time=2024-05-12T03:01:52.799Z level=WARN source=routes.go:749 msg="bad manifest" name=registry.ollama.ai/library/IceLatteRP-7b.f16.gguf:latest error="open /root/.ollama/models/manifests/registry.ollama.ai/library/icelatterp-7b.f16.gguf/latest: no such file or directory"
time=2024-05-12T03:01:52.799Z level=WARN source=routes.go:749 msg="bad manifest" name=registry.ollama.ai/library/Llama-3-Lumimaid-8B-v0.1-OAS.q8_0.gguf:latest error="open /root/.ollama/models/manifests/registry.ollama.ai/library/llama-3-lumimaid-8b-v0.1-oas.q8_0.gguf/latest: no such file or directory"
time=2024-05-12T03:01:52.800Z level=WARN source=routes.go:749 msg="bad manifest" name=registry.ollama.ai/library/kunoichi-7b.Q8_0.gguf:latest error="open /root/.ollama/models/manifests/registry.ollama.ai/library/kunoichi-7b.q8_0.gguf/latest: no such file or directory"
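A minimal sketch of the rename workaround described above, assuming the default manifest path inside the Ollama Docker container (the exact path may differ per install; `lowercase_dirs` is a name made up here, not an Ollama tool):

```shell
#!/bin/sh
# Sketch of the workaround above: rename manifest directories that contain
# uppercase letters to all-lowercase so ollama v0.1.35+ can find them.
lowercase_dirs() {
  dir_parent="$1"
  for dir in "$dir_parent"/*; do
    [ -d "$dir" ] || continue
    base=$(basename "$dir")
    lower=$(printf '%s' "$base" | tr '[:upper:]' '[:lower:]')
    if [ "$base" != "$lower" ]; then
      mv -n "$dir" "$dir_parent/$lower"   # -n: never overwrite an existing dir
    fi
  done
}

# Example invocation (path taken from the error messages above):
# lowercase_dirs /root/.ollama/models/manifests/registry.ollama.ai/library
```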
@BarcodeQH commented on GitHub (May 12, 2024):
The model name and path should be in lowercase. You should also check for and remove any non-English characters in your model name.
This seems to be a feature of Ollama rather than a bug, so I suggested feature #2198 for Open-WebUI to improve the UX.
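One way the Open-WebUI-side improvement suggested above could look (a hypothetical sketch, not the actual #2198 implementation): normalize the model name before handing it to Ollama. The allowed character set of [a-z0-9._-] is an assumption here, not a verified Ollama rule:

```shell
# Hypothetical sketch: normalize a model name before import by lowercasing it
# and stripping characters outside an assumed allowed set of [a-z0-9._-].
normalize_model_name() {
  printf '%s' "$1" | tr '[:upper:]' '[:lower:]' | tr -cd 'a-z0-9._-'
}

# Example:
# normalize_model_name "IceLatteRP-7b.f16.gguf"   # -> icelatterp-7b.f16.gguf
```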
@lludlow commented on GitHub (May 12, 2024):
I would think that when using the import tool to download a model from HuggingFace, it would name it appropriately.
In this case, I believe this to be a bug, since it worked in previous versions.
@Snuupy commented on GitHub (May 12, 2024):
https://github.com/ollama/ollama/releases/tag/v0.1.37 changelog says
trying now...
edit: looks like it is fixed! :) Closing the issue now.