[GH-ISSUE #2189] GGUF/HuggingFace import is broken on ollama v0.1.35+ #12788

Closed
opened 2026-04-19 19:39:41 -05:00 by GiteaMirror · 5 comments

Originally created by @Snuupy on GitHub (May 11, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/2189

Bug Report

Description

Bug Summary:

HuggingFace model download/import from a URL doesn't work on ollama v0.1.35+. Loading previously imported GGUF/HF models doesn't work either. The last working version is v0.1.34.

Steps to Reproduce:

Go to open-webui and insert any GGUF link from HF.

Expected Behavior:

It should download and import the GGUF from HF without error, then allow me to select the model to load.

Actual Behavior:

The model gets downloaded but is not imported properly, failing with this error:

```
ollama       | time=2024-05-11T17:10:52.261Z level=WARN source=routes.go:749 msg="bad manifest" name=registry.ollama.ai/library/Meta-Llama-3-8B-Instruct-correct-pre-tokenizer-and-EOS-token-Q8_0.gguf:latest error="open /root/.ollama/models/manifests/registry.ollama.ai/library/meta-llama-3-8b-instruct-correct-pre-tokenizer-and-eos-token-q8_0.gguf/latest: no such file or directory"
```

This error happens with every HF GGUF on ollama v0.1.35+.
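
For reference, a minimal shell sketch of the mismatch the error describes, assuming the ./ollama:/root/.ollama bind mount from the compose file below; the directory name is taken from the log line above:

```
# The import writes a mixed-case manifest directory:
ls ./ollama/models/manifests/registry.ollama.ai/library/
# -> Meta-Llama-3-8B-Instruct-correct-pre-tokenizer-and-EOS-token-Q8_0.gguf

# ...but ollama then tries to open the lowercased path, which does not exist:
cat ./ollama/models/manifests/registry.ollama.ai/library/meta-llama-3-8b-instruct-correct-pre-tokenizer-and-eos-token-q8_0.gguf/latest
# -> No such file or directory
```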

Environment

  • Open WebUI Version: v0.1.124

  • Ollama (if applicable): tested: 0.1.36 (not working), 0.1.35 (not working), 0.1.34 (working), 0.1.33 (working)

  • Operating System: Ubuntu 22.04

  • Browser (if applicable): Librewolf v125.0.3-1

Reproduction Details

Confirmation:

  • [x] I have read and followed all the instructions provided in the README.md.
  • [x] I am on the latest version of both Open WebUI and Ollama.
  • [ ] I have included the browser console logs.
    • [x] let me know if necessary and I will attach them. I do not believe it is a browser issue.
  • [x] I have included the Docker container logs.

Logs and Screenshots

Installation Method

docker-compose, with the following docker-compose.yml (manual edits were necessary because the script does not detect mobile chipsets, e.g. the 780M and 680M); a quick bring-up and version-check sketch follows the file.

```
services:
  open-webui:
    ports:
      - 3000:8080
    volumes:
      - ./open-webui:/app/backend/data
    environment:
      - '/ollama/api=http://ollama:11434/api'
      - local_files_only=False
    container_name: open-webui
    restart: always
    image: ghcr.io/open-webui/open-webui:latest
    extra_hosts:
      - host.docker.internal:host-gateway
  ollama:
    container_name: ollama
    devices:
      - /dev/kfd:/dev/kfd
      - /dev/dri:/dev/dri
    image: ollama/ollama:0.1.36-rocm # 0.1.34 works
    environment:
      - "HSA_OVERRIDE_GFX_VERSION=11.0.0"
      - "OLLAMA_DEBUG=1"
      - "OLLAMA_MAX_VRAM=16106127360"
      - "OLLAMA_NUM_PARALLEL=2"
      - "OLLAMA_MAX_LOADED_MODELS=2"
    volumes:
      - ./ollama:/root/.ollama
    ports:
      - "11434:11434"
    security_opt:
      - "seccomp=unconfined"
    group_add:
      - video
```
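
A quick way to bring this stack up and confirm which ollama build is actually running (a sketch; the container name comes from the compose file above):

```
docker compose up -d
docker exec ollama ollama --version
```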

Additional Information

Included above.

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

@silentoplayz commented on GitHub (May 11, 2024):

![Screenshot 2024-05-11 163712](https://github.com/open-webui/open-webui/assets/50341825/cac4a110-442b-403e-a3d2-9c43a2acfa7f)

Only replying with this screenshot for reference, as I feel it may be related to the recent Ollama update on Windows.

@lludlow commented on GitHub (May 12, 2024):

I am getting the same thing with Docker on Ubuntu.

I renamed the directories to lowercase and can now use the downloaded models (see the sketch after the logs below).

```
time=2024-05-12T03:01:52.799Z level=WARN source=routes.go:749 msg="bad manifest" name=registry.ollama.ai/library/IceLatteRP-7b.f16.gguf:latest error="open /root/.ollama/models/manifests/registry.ollama.ai/library/icelatterp-7b.f16.gguf/latest: no such file or directory"
time=2024-05-12T03:01:52.799Z level=WARN source=routes.go:749 msg="bad manifest" name=registry.ollama.ai/library/Llama-3-Lumimaid-8B-v0.1-OAS.q8_0.gguf:latest error="open /root/.ollama/models/manifests/registry.ollama.ai/library/llama-3-lumimaid-8b-v0.1-oas.q8_0.gguf/latest: no such file or directory"
time=2024-05-12T03:01:52.800Z level=WARN source=routes.go:749 msg="bad manifest" name=registry.ollama.ai/library/kunoichi-7b.Q8_0.gguf:latest error="open /root/.ollama/models/manifests/registry.ollama.ai/library/kunoichi-7b.q8_0.gguf/latest: no such file or directory"
```
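
A minimal sketch of that rename workaround, assuming the ./ollama:/root/.ollama bind mount from the compose file above (stop the container first so nothing races the rename):

```
docker compose stop ollama
cd ./ollama/models/manifests/registry.ollama.ai/library
for d in */; do
  d=${d%/}                                    # strip the trailing slash from the glob
  lower=$(printf '%s' "$d" | tr '[:upper:]' '[:lower:]')
  [ "$d" != "$lower" ] && mv -- "$d" "$lower" # rename only if the case actually differs
done
docker compose start ollama
```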

@BarcodeQH commented on GitHub (May 12, 2024):

The model name and path should be lowercase. You should also check for and remove any non-English characters in your model name.

This seems to be a behavior of Ollama rather than a bug, so I suggested feature #2198 for Open-WebUI to improve the user experience.
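
As an illustration of that naming rule, importing the GGUF manually with an all-lowercase model name sidesteps the case mismatch entirely; a sketch, with a hypothetical file name:

```
# Hypothetical GGUF file name; any all-lowercase model name works here.
cat > Modelfile <<'EOF'
FROM ./meta-llama-3-8b-instruct-q8_0.gguf
EOF
ollama create meta-llama-3-8b-instruct-q8_0 -f Modelfile
```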

@lludlow commented on GitHub (May 12, 2024):

I would think that when using the import tool to download a model from HuggingFace, it would be named appropriately.

In this case, I believe this to be a bug, since it worked in previous versions.

@Snuupy commented on GitHub (May 12, 2024):

The changelog at https://github.com/ollama/ollama/releases/tag/v0.1.37 says:

> Fixed issue where models with uppercase characters in the name would not show with ollama list

Trying now...

Edit: looks like it is fixed! :) Closing the issue now.
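
For anyone pinned to a tag in docker-compose, the upgrade is a one-line bump plus a recreate; a sketch, assuming a 0.1.37-rocm tag follows the same naming pattern as the image above:

```
# Bump the pinned ollama tag and recreate just that service.
sed -i 's|ollama/ollama:0.1.36-rocm|ollama/ollama:0.1.37-rocm|' docker-compose.yml
docker compose up -d ollama
```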

Reference: github-starred/open-webui#12788