[GH-ISSUE #9354] ollama ls fails silently when encountering unknown digests #68164

Open
opened 2026-05-04 12:41:38 -05:00 by GiteaMirror · 1 comment

Originally created by @bmizerany on GitHub (Feb 26, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9354

What is the issue?

NOTE: This is a tracking issue for new client work, which will fix these issues.

I made a model manifest that contained a digest without a matching blob on disk, and then ran ollama ls.

I got an empty list, but expected to see my model and the other models on disk.

The server logs show the GET /api/tags handler got tripped up when it encountered my manifest and its "unknown" digest, yet it returned a 200 with no immediate feedback about what happened. It seems to have stopped at my manifest (which it saw first) and returned nothing.

time=2025-02-25T19:53:27.198-08:00 level=WARN source=routes.go:901 msg="bad manifest filepath" name=x/x/mymodel:latest error="open /Users/bmizerany/.ollama/models/blobs/sha256-0000000000000000000000000000000000000000000000000000000000000000: no such file or directory"
[GIN] 2025/02/25 - 19:53:27 | 200 |    3.453125ms |       127.0.0.1 | GET      "/api/tags"

Relevant log output

See above.

OS

macOS

GPU

No response

CPU

Apple

Ollama version

0.5.12
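The broken state described above can be reproduced in miniature. This is a hedged sketch only: the store layout (manifests under models/manifests/..., blobs named sha256-<hex> under models/blobs/) and the manifest JSON shape are assumptions inferred from the log path above, not a confirmed description of Ollama's internals.

```python
import json
import os
import tempfile

# Build a fake model store containing one manifest whose layer digest has no
# matching blob on disk (layout and manifest shape are assumptions).
store = tempfile.mkdtemp()
manifest_dir = os.path.join(store, "manifests", "registry.ollama.ai", "x", "x", "mymodel")
os.makedirs(manifest_dir)
os.makedirs(os.path.join(store, "blobs"))

digest = "sha256:" + "0" * 64  # no blob with this digest will exist
with open(os.path.join(manifest_dir, "latest"), "w") as f:
    json.dump({"layers": [{"digest": digest}]}, f)

# The condition that trips the /api/tags handler: the blob file is absent,
# producing the "no such file or directory" warning seen in the logs.
blob_path = os.path.join(store, "blobs", digest.replace(":", "-"))
print(os.path.exists(blob_path))  # False
```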

GiteaMirror added the bug label 2026-05-04 12:41:38 -05:00

@flywiththetide commented on GitHub (Mar 4, 2025):

Issue Diagnosis

The error message in the server logs:

bad manifest filepath name=x/x/mymodel:latest error="open /Users/bmizerany/.ollama/models/blobs/sha256-0000000000000000000000000000000000000000000000000000000000000000: no such file or directory"

suggests that while serving ollama ls, the server encounters a manifest whose digest has no matching blob on disk, causing it to stop processing and return an empty list.

Potential Fixes

  1. Manually Remove the Incomplete Model Manifest

    • Check the manifests folder:
      ls ~/.ollama/models/manifests/
      
    • If the model's manifest is present but references blobs missing from disk, try removing the manifest:
      rm -rf ~/.ollama/models/manifests/<registry>/<namespace>/<model-name>
      
  2. Re-download the Model
    If the model is missing critical files:

    ollama rm <model-name>
    ollama pull <model-name>
    
  3. Enhancement Suggestion for ollama ls

    • Instead of silently failing, ollama ls should skip invalid manifests and display available models while showing a warning for the missing digest.
    • Possible user-friendly output:
      Warning: Model 'mymodel:latest' has missing files and will not be listed.
      Available Models:
      - llama3
      - mistral-7b
      

Would you be open to submitting a feature request for better error handling in ollama ls?

Reference: github-starred/ollama#68164