[GH-ISSUE #10110] Ollama does not recognize models that were copied #53143

Closed
opened 2026-04-29 02:04:52 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @zaredh on GitHub (Apr 3, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10110

What is the issue?

I'm trying to copy the contents of the models on my host machine into a docker container. This is how I'm constructing the docker-compose.yml:

```yaml
ollama:
    build:
      context: ./ollama
      dockerfile: Dockerfile
    container_name: ollama
    restart: always
    ports:
      - "8006:11434"
    volumes:
      - ollama_data:/root/.ollama
      - ${MODELS_DIR}:/usr/share/ollama/.ollama
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```

This is the contents of the Dockerfile:

```dockerfile
FROM ollama/ollama

EXPOSE 11434

CMD []
```

Yes, I'm aware that I don't need a Dockerfile for this; I've obfuscated some things from it.

`MODELS_DIR` is set in the `.env` to `/usr/share/ollama/.ollama`. When I exec into the container after startup, the contents of my host machine are exactly replicated in `/usr/share/ollama/.ollama`.

`ollama list` returns an empty list. Restarting the container does not resolve the issue either.

There is no Modelfile stored by default under `manifests/registry.ollama.ai/library/`.

For example, with llama3.1 the manifest path is `library/llama3.1/latest`.
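For context, the usual on-disk layout of an Ollama model store looks roughly like the sketch below (an illustration of the typical structure, not a listing from this machine; the digest name is a placeholder):

```
.ollama/models/
├── blobs/
│   └── sha256-<digest>
└── manifests/
    └── registry.ollama.ai/
        └── library/
            └── llama3.1/
                └── latest
```

`ollama list` is driven by the files under `manifests/`, so if the server is reading a directory that contains blobs but whose manifest tree is missing or in the wrong place, the list comes back empty.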

Relevant log output


OS

Linux

GPU

Nvidia

CPU

No response

Ollama version

0.6.2

GiteaMirror added the bug label 2026-04-29 02:04:52 -05:00
Author
Owner

@rick-github commented on GitHub (Apr 3, 2025):

```yaml
    volumes:
      - ollama_data:/root/.ollama
      - ${MODELS_DIR}:/root/.ollama/models
```

Set `MODELS_DIR` to `/usr/share/ollama/.ollama/models`.
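Putting the two changes together, a minimal corrected `volumes:` fragment might look like this (a sketch assuming the host model store really lives at `/usr/share/ollama/.ollama`):

```yaml
    volumes:
      # keys and other server state live in the named volume...
      - ollama_data:/root/.ollama
      # ...while the host's model store is mounted at the path
      # the containerized server actually scans for models
      - /usr/share/ollama/.ollama/models:/root/.ollama/models
```

The key point is that the container-side path must be `/root/.ollama/models`, not `/usr/share/ollama/.ollama`: the server inside the official image runs as root and looks under `/root/.ollama` by default.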

Author
Owner

@zaredh commented on GitHub (Apr 3, 2025):

Tried that just now. The only difference is that `ollama/.ollama` no longer has the `id_ed25519` and `id_ed25519.pub` files that are on my host machine. `ollama list` still returns nothing before and after a container restart.

Edit: Including a picture from inside the container of the llama3.1 blob files and `ollama list` returning nothing:

![Image](https://github.com/user-attachments/assets/d7c781e4-f250-4eb9-99ba-60b6c8f39bc3)

Author
Owner

@rick-github commented on GitHub (Apr 3, 2025):

Is the screenshot from inside or outside the container?

Author
Owner

@zaredh commented on GitHub (Apr 3, 2025):

Inside. Will edit original.

If I run `ollama list` from the host machine, it returns:

```
NAME               ID              SIZE      MODIFIED
llama3.1:latest    46e0c10c039e    4.9 GB    2 months ago
```
Author
Owner

@rick-github commented on GitHub (Apr 3, 2025):

Inside the container, the models need to be mounted at `/root/.ollama/models`; see https://github.com/ollama/ollama/issues/10110#issuecomment-2775737690

Author
Owner

@zaredh commented on GitHub (Apr 3, 2025):

My mistake. Completely misread that location. Thank you so much.


Reference: github-starred/ollama#53143