[GH-ISSUE #6373] The layer of model created by Modelfile has 600 permission #66041

Closed
opened 2026-05-03 23:44:37 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @zwwhdls on GitHub (Aug 15, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6373

What is the issue?

I mounted a JuiceFS filesystem (https://juicefs.com) at `/root/.ollama` on Linux and ran `ollama pull llama3.1`; I can see llama3.1 with `ollama list`.

Then I mounted the same filesystem at `~/.ollama` on my Mac. I can see llama3.1 with `ollama list`, and I can also run it with `sudo -i`. All works well.

Then I created a model `writer` with a Modelfile on Linux, but I cannot see it on my Mac. It turns out the new layers of model `writer` have `600` permissions while the others are `644`:

hdls-mbp:weiwei root# ls -alh .ollama/models/blobs/
total 302951592
drwxr-xr-x  25 root  wheel    48G  8 15 23:04 .
drwxr-xr-x  41 root  wheel    48G  8 13 23:44 ..
-rw-r--r--   1 root  wheel   4.0K  8 15 22:56 ._sha256-1dfe258ba02ecec9bf76292743b48c2ce90aefe288c9564c92d135332df6e514
-rw-r--r--   1 root  wheel   4.0K  8 15 22:57 ._sha256-7fa4d1c192726882c2c46a2ffd5af3caddd99e96404e81b3cf2a41de36e25991
-rw-r--r--   1 root  wheel   4.0K  8 15 22:57 ._sha256-ddb2d799341563f3da053b0da259d18d8b00b2f8c5951e7c5e192f9ead7d97ad
-rw-r--r--   1 root  wheel   8.2K  8 13 23:47 sha256-097a36493f718248845233af1d3fefe7a303f864fae13bc31a3a9704229378ca
-rw-r--r--   1 root  wheel    12K  8 14 00:06 sha256-0ba8f0e314b4264dfd19df045cde9d4c394a52474bf92ed6a3de22a4ca31a177
-rw-r--r--   1 root  wheel   136B  8 15 17:26 sha256-109037bec39c0becc8221222ae23557559bc594290945a2c4221ab4f303b8871
-rw-r--r--   1 root  wheel   487B  8 15 17:26 sha256-10aa81da732eae8a66e07d70620089a608f546ff280a2856a43be69d622f715a
-rw-r--r--   1 root  wheel   1.7K  8 14 00:06 sha256-11ce4ee3e170f6adebac9a991c22e22ab3f8530e154ee669954c4bc73061c258
-rw-r--r--   1 root  wheel   485B  8 15 18:57 sha256-1a4c3c319823fdabddb22479d0b10820a7a39fe49e45c40bae28fbe83926dc14
-rw-r--r--@  1 root  wheel    73B  8 15 22:25 sha256-1dfe258ba02ecec9bf76292743b48c2ce90aefe288c9564c92d135332df6e514
-rw-r--r--   1 root  wheel    65B  8 13 23:47 sha256-2490e7468436707d5156d7959cf3c6341cc46ee323084cfa3fcf30fe76e397dc
-rw-r--r--   1 root  wheel    96B  8 14 00:06 sha256-56bb8bd477a519ffa694fc449c2413c6f0e1d3b1c88fa7e3c9d88d3ae49d4dcb
-rw-r--r--   1 root  wheel   486B  8 14 12:32 sha256-654440dac7f3ad911ccb39b7e42e2a0228833641b601937134aa3e4b7a389ad7
-rw-r--r--   1 root  wheel   1.5G  8 13 23:47 sha256-7462734796d67c40ecec2ca98eddf970e171dbb6b370e43fd633ee75b69abe1b
-rw-r--r--@  1 root  wheel   112B  8 15 22:25 sha256-7fa4d1c192726882c2c46a2ffd5af3caddd99e96404e81b3cf2a41de36e25991
-rw-------   1 root  wheel    14B  8 15 23:04 sha256-804a1f079a1166190d674bcfb0fa42270ec57a4413346d20c5eb22b26762d132
-rw-r--r--   1 root  wheel   4.3G  8 15 18:57 sha256-8eeb52dfb3bb9aefdf9d1ef24b3bdbcfbe82238798c4b918278320b6fcef18fe
-rw-r--r--   1 root  wheel    37G  8 14 12:32 sha256-a677b4a4b70c45e702b1d600f7905e367733c53898b8be60e3f29272cf334574
-rw-------   1 root  wheel   559B  8 15 23:04 sha256-db7eed3b8121ac22a30870611ade28097c62918b8a4765d15e6170ec8608e507
-rw-r--r--@  1 root  wheel   559B  8 15 22:25 sha256-ddb2d799341563f3da053b0da259d18d8b00b2f8c5951e7c5e192f9ead7d97ad
-rw-r--r--   1 root  wheel   358B  8 13 23:47 sha256-e0a42594d802e5d31cdc786deb4823edb8adff66094d49de8fffe976d753e348
-rw-r--r--   1 root  wheel   487B  8 13 23:47 sha256-e18ad7af7efbfaecd8525e356861b84c240ece3a3effeb79d2aa7c0f258f71bd
-rw-r--r--   1 root  wheel   5.1G  8 15 17:26 sha256-ff1d1fc78170d787ee1201778e2dd65ea211654ca5fb7d69b5a2e7b123a50373

When I manually run `chmod 644` on these layers, everything works and I can see `writer` in `ollama list`.
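As a sketch of that manual workaround (demonstrated here on a scratch directory; in practice the target path would be the blob store, e.g. `~/.ollama/models/blobs`):

```shell
# Simulate a blob that was created with owner-only permissions.
blobs=$(mktemp -d)
touch "$blobs/sha256-demo"
chmod 600 "$blobs/sha256-demo"

# Widen every owner-only blob so other hosts sharing the
# filesystem can read the layers.
find "$blobs" -type f -perm 600 -exec chmod 644 {} +

ls -l "$blobs"
rm -rf "$blobs"
```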

OS

macOS

GPU

No response

CPU

Intel

Ollama version

0.3.6

GiteaMirror added the bug label 2026-05-03 23:44:37 -05:00
Author
Owner

@rick-github commented on GitHub (Aug 15, 2024):

What's the backing store for the JuiceFS volume? When you mount JuiceFS, are there any ACL- or umask-related options? What's the umask of the ollama server on your Linux machine?

grep Umask /proc/$(pidof ollama)/status
Author
Owner

@mxyng commented on GitHub (Aug 15, 2024):

The only files Ollama explicitly sets to `0o600` are private key and history files. Everything else should be `0o644`. Please check your mount settings and umask as @rick-github suggested.

Author
Owner

@zwwhdls commented on GitHub (Aug 16, 2024):

Hi, thanks for the quick response.

I checked the umask on the Linux machine @rick-github @mxyng:

[root@iZbp10tlvg9aapf3a7tdyfZ ~]# ps -ef | grep ollama
root      595091  594885  0 8月15 ?       00:00:07 /sbin/mount.juicefs weiwei-test-tke /jfs/ollama-vol-glarwb -o foreground,no-update,cache-size=0,verbose
root      595243  595091  0 8月15 ?       00:01:38 /usr/local/juicefs/mount/jfsmount weiwei-test-tke /jfs/ollama-vol-glarwb -o foreground,no-update,cache-size=0,verbose
root      595580  595317  0 8月15 ?       00:00:11 /bin/ollama serve
root      614203  614163  0 8月15 ?       00:00:08 ollama run writer
root      835099  835044  0 10:06 pts/0    00:00:00 grep --color=auto ollama
[root@iZbp10tlvg9aapf3a7tdyfZ ~]#
[root@iZbp10tlvg9aapf3a7tdyfZ ~]#
[root@iZbp10tlvg9aapf3a7tdyfZ ~]#
[root@iZbp10tlvg9aapf3a7tdyfZ ~]# grep Umask /proc/595580/status
Umask:	0022

Besides, JuiceFS is an open-source distributed file system. I did not set any ACL or umask options when mounting; the umask is the default 022.

root@ollama-5cc94c8c4b-t59zr:/# df
Filesystem                   1K-blocks     Used      Available Use% Mounted on
overlay                      123460788 30699160       87515992  26% /
tmpfs                            65536        0          65536   0% /dev
tmpfs                         64771088        0       64771088   0% /sys/fs/cgroup
JuiceFS:weiwei-test-tke 10995103093488 50492052 10995052601436   1% /root/.ollama
/dev/vda3                    123460788 30699160       87515992  26% /etc/hosts
shm                              65536        0          65536   0% /dev/shm
tmpfs                        128253988       12      128253976   1% /run/secrets/kubernetes.io/serviceaccount
tmpfs                         64771088        0       64771088   0% /proc/acpi
tmpfs                         64771088        0       64771088   0% /proc/scsi
tmpfs                         64771088        0       64771088   0% /sys/firmware
root@ollama-5cc94c8c4b-t59zr:/#
root@ollama-5cc94c8c4b-t59zr:/#
root@ollama-5cc94c8c4b-t59zr:/# cd
root@ollama-5cc94c8c4b-t59zr:~# cd .ollama
root@ollama-5cc94c8c4b-t59zr:~/.ollama#
root@ollama-5cc94c8c4b-t59zr:~/.ollama#
root@ollama-5cc94c8c4b-t59zr:~/.ollama# ls
history  id_ed25519  id_ed25519.pub  logs  models
root@ollama-5cc94c8c4b-t59zr:~/.ollama#
root@ollama-5cc94c8c4b-t59zr:~/.ollama#
root@ollama-5cc94c8c4b-t59zr:~/.ollama#
root@ollama-5cc94c8c4b-t59zr:~/.ollama# ls -alh
total 97G
drwxrwxrwx 51 root  root   49G Aug 15 15:29 .
drwx------  1 root  root  4.0K Aug 15 15:04 ..
-rw-r--r--  1 13556 staff 4.0K Aug 15 10:44 ._logs
-rw-------  1 root  root    33 Aug 15 15:29 history
-rw-------  1 root  root   387 Aug 13 15:40 id_ed25519
-rw-r--r--  1 root  root    81 Aug 13 15:40 id_ed25519.pub
drwxr-xr-x  4 13556 staff  52K Aug 15 10:44 logs
drwxr-xr-x 43 root  root   49G Aug 13 15:44 models
root@ollama-5cc94c8c4b-t59zr:~/.ollama#
root@ollama-5cc94c8c4b-t59zr:~/.ollama#
root@ollama-5cc94c8c4b-t59zr:~/.ollama#
root@ollama-5cc94c8c4b-t59zr:~/.ollama# umask
0022
root@ollama-5cc94c8c4b-t59zr:~/.ollama#
root@ollama-5cc94c8c4b-t59zr:~/.ollama#
root@ollama-5cc94c8c4b-t59zr:~/.ollama#
root@ollama-5cc94c8c4b-t59zr:~/.ollama# cd models/
root@ollama-5cc94c8c4b-t59zr:~/.ollama/models# ls
blobs  manifests
root@ollama-5cc94c8c4b-t59zr:~/.ollama/models# umask
0022
root@ollama-5cc94c8c4b-t59zr:~/.ollama/models#
root@ollama-5cc94c8c4b-t59zr:~/.ollama/models#
root@ollama-5cc94c8c4b-t59zr:~/.ollama/models# cd blobs/
root@ollama-5cc94c8c4b-t59zr:~/.ollama/models/blobs# umask
0022
root@ollama-5cc94c8c4b-t59zr:~/.ollama/models/blobs#
root@ollama-5cc94c8c4b-t59zr:~/.ollama/models/blobs#
root@ollama-5cc94c8c4b-t59zr:~/.ollama/models/blobs#
Author
Owner

@zwwhdls commented on GitHub (Aug 16, 2024):

Hi @mxyng @rick-github, I tried this without JuiceFS, using just local files, and the issue still reproduces. By the way, I run Ollama in Kubernetes:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: ollama-host
  labels:
    app: ollama-host
spec:
  replicas: 1
  selector:
    matchLabels:
      app: ollama-host
  template:
    metadata:
      labels:
        app: ollama-host
    spec:
      containers:
      - image: ollama/ollama:0.3.5
        env:
        - name: OLLAMA_HOST
          value: "0.0.0.0"
        ports:
        - name: ollama
          containerPort: 11434
        args:
        - "serve"
        name: load
        volumeMounts:
        - mountPath: /root/.ollama
          name: shared-data
      volumes:
      - name: shared-data
        hostPath:
          path: /root/.ollama
          type: DirectoryOrCreate

The Ollama on Linux is `0.3.5` and `0.3.6` on the Mac.

Author
Owner

@zwwhdls commented on GitHub (Aug 16, 2024):

I dug deeper into the code and found the following:

When creating a new layer, Ollama creates a temp file, which `os.CreateTemp` opens with mode `0600`:

temp, err := os.CreateTemp(blobs, "sha256-")
func CreateTemp(dir, pattern string) (*File, error) {
	if dir == "" {
		dir = TempDir()
	}

	prefix, suffix, err := prefixAndSuffix(pattern)
	if err != nil {
		return nil, &PathError{Op: "createtemp", Path: pattern, Err: err}
	}
	prefix = joinPath(dir, prefix)

	try := 0
	for {
		name := prefix + nextRandom() + suffix
		f, err := OpenFile(name, O_RDWR|O_CREATE|O_EXCL, 0600)
		if IsExist(err) {
			if try++; try < 10000 {
				continue
			}
			return nil, &PathError{Op: "createtemp", Path: prefix + "*" + suffix, Err: ErrExist}
		}
		return f, err
	}
}

And then renames it into place:

os.Rename(temp.Name(), blob)

The code is here: https://github.com/ollama/ollama/blob/main/server/layer.go#L25

Author
Owner

@zwwhdls commented on GitHub (Aug 16, 2024):

I have raised a PR to fix it: https://github.com/ollama/ollama/pull/6386. Please take a look @mxyng @rick-github, looking forward to your response!
