[GH-ISSUE #13223] Ollama Exposes the Local Model Directory via Network Requests in ollama/ollama #8741

Closed
opened 2026-04-12 21:30:38 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @ylwango613 on GitHub (Nov 24, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/13223

Originally assigned to: @BruceMacD on GitHub.

What is the issue?

Description

Ollama handles certain errors improperly, leaking sensitive information about the host machine in its API responses.

When specific malformed input is sent to the /api/create endpoint, the server returns detailed filesystem paths in its error messages. These paths reveal both the username under which Ollama is running and the exact directory where model files are stored.


Proof of Concept

A local Ollama server was started to simulate a network-accessible environment:

ollama serve

Then the following request was executed:

curl http://localhost:11434/api/create -d '{
  "model": "my-gguf-model",
  "files": {
    "0.gguf": ""
  }
}'

The response was:

{"status":"parsing GGUF"}
{"error":"read /home/yuelinwang/.ollama/models/blobs: is a directory"}

The returned error message exposes:

  • The current username (yuelinwang)
  • The exact model directory used by Ollama (/home/yuelinwang/.ollama/models/blobs)

Impact

An attacker can obtain:

  • The username under which the Ollama service is executing.
  • The absolute filesystem path of Ollama’s model storage directory.

Although no direct code execution or privilege escalation occurs from this issue alone, the leaked information can significantly aid reconnaissance and facilitate targeted attacks in more complex threat scenarios.


Root Cause Analysis & Remediation Advice

The root cause lies in the implementation of GetBlobsPath within the following code:

https://github.com/ollama/ollama/blob/main/server/modelpath.go#L125-L146

func GetBlobsPath(digest string) (string, error) {
    // only accept actual sha256 digests
    pattern := "^sha256[:-][0-9a-fA-F]{64}$"
    re := regexp.MustCompile(pattern)

    if digest != "" && !re.MatchString(digest) {
        return "", ErrInvalidDigestFormat
    }

    digest = strings.ReplaceAll(digest, ":", "-")
    path := filepath.Join(envconfig.Models(), "blobs", digest)
    dirPath := filepath.Dir(path)
    if digest == "" {
        dirPath = path
    }

    if err := os.MkdirAll(dirPath, 0o755); err != nil {
        return "", fmt.Errorf("%w: ensure path elements are traversable", err)
    }

    return path, nil
}

Issue

When digest == "", the function does not reject the input.
Instead, it:

  1. Treats the empty digest as valid,
  2. Computes the blob directory path,
  3. Returns the full absolute model path on the server.

This leads to path disclosure when an upstream function attempts to open this “file,” producing an OS-level error message containing the full directory path.

Recommendation

Add an explicit check for empty digests at the beginning of GetBlobsPath:

if digest == "" {
    return "", fmt.Errorf("digest must not be empty")
}

This prevents the system from returning internal paths and avoids exposing sensitive information to remote clients.

Relevant log output


OS

Linux

GPU

Nvidia

CPU

Intel

Ollama version

0.12.11

GiteaMirror added the bug label 2026-04-12 21:30:38 -05:00

@ylwango613 commented on GitHub (Dec 22, 2025):

Hi, any update? Are there any difficulties with the fix? And is there anything I can do to help? @pdevine


Reference: github-starred/ollama#8741