[GH-ISSUE #7681] Reinstall docker image with old models ? #4902

Closed
opened 2026-04-12 15:57:04 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @remco-pc on GitHub (Nov 15, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7681

What is the issue?

Re-installing my Docker image with Ollama causes the models to disappear from the home directory. Can they be mounted somewhere so the models don't have to be re-downloaded?
Also, is it learning from user input already?

OS

Linux, Windows, Docker, WSL2

GPU

No response

CPU

Intel

Ollama version

0.4.1

GiteaMirror added the question label 2026-04-12 15:57:04 -05:00
Author
Owner

@pdevine commented on GitHub (Nov 15, 2024):

Hi @remco-pc, yes, you can use the `OLLAMA_MODELS` environment variable with Ollama to specify where you want the models to go, and you can use a Docker volume for persistent storage. You can find more information about `OLLAMA_MODELS` in the [FAQ](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-set-them-to-a-different-location).

As for learning from user input: no, Ollama doesn't (yet) do fine-tuning on any data.
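A minimal compose sketch of that setup (the service name, port, and `/root/.ollama` path follow the official Docker image's defaults; the volume name is an example):

```yaml
services:
  ollama:
    image: ollama/ollama
    ports:
      - "11434:11434"
    volumes:
      - ollama-data:/root/.ollama   # named volume survives container re-creation

volumes:
  ollama-data:
```

With this, `docker compose down && docker compose up -d` recreates the container without losing the downloaded models, since they live in the named volume rather than the container's writable layer.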
Author
Owner

@remco-pc commented on GitHub (Nov 15, 2024):

@pdevine https://www.youtube.com/watch?v=a8gd892WKLM also have images with plain markdown command and a file.read

Author
Owner

@remco-pc commented on GitHub (Nov 15, 2024):

@pdevine you can also use symlinks: make a softlink from the model directory to a mount point? That's easier to restore.

<!-- gh-comment-id:2480108780 --> @remco-pc commented on GitHub (Nov 15, 2024): @pdevine you can also use symlinks, so make a softlink with the directory to a mount point ? easier to restore
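The symlink idea above can be sketched as follows (the paths are illustrative, not from the thread; `/root/.ollama/models` is the default model directory in the official Docker image):

```shell
# Relocate the model directory to a persistent mount, then leave a
# symlink at the original location so Ollama still finds the models.
SRC=/root/.ollama/models    # default model directory (illustrative)
DST=/mnt/Vps3/Mount/models  # persistent mount point (example)
mv "$SRC" "$DST"
ln -s "$DST" "$SRC"
```

Note that a bind-mounted Docker volume (as in pdevine's suggestion) is usually simpler than symlinking inside the container, since the symlink target must itself be a mounted path to survive re-creation.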
Author
Owner

@remco-pc commented on GitHub (Nov 18, 2024):

```yaml
volumes:
  mount:
    driver: local
    driver_opts:
      type: none
      o: bind
      device: /mnt/Vps3/Mount
  mount2:
    driver: local
    driver_opts:
      type: none
      o: bind
      device: /mnt/Disk2   # full disk attached at www-data
```
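For those bind-mounted volumes to actually persist the models, they still need to be attached to the Ollama service; a sketch, assuming a service definition like the official image's defaults (service name and mount path are assumptions):

```yaml
services:
  ollama:
    image: ollama/ollama
    volumes:
      - mount:/root/.ollama   # models land under /root/.ollama/models
```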
Reference: github-starred/ollama#4902