[GH-ISSUE #1274] "no such file or directory" when creating model during the "creating adapter layer" step #47165

Closed
opened 2026-04-28 03:24:00 -05:00 by GiteaMirror · 9 comments
Owner

Originally created by @meow-d on GitHub (Nov 25, 2023).
Original GitHub issue: https://github.com/ollama/ollama/issues/1274

When I run `ollama create storywriter`, I get:

```
transferring model data
reading model metadata
creating template layer
creating system layer
creating adapter layer
Error: open /@sha256:439bdfbd08b0143c5f5f97154d76676a5348a5a00a2fac38fdc8d1c4498d67d3: no such file or directory
```

BTW, I'm running on Fedora 39.

My Modelfile, just in case:

```
FROM llama2-uncensored:latest

TEMPLATE """{{ .System }}

### HUMAN:
{{ .Prompt }}

### RESPONSE:
"""

PARAMETER stop "### HUMAN:"
PARAMETER stop "### RESPONSE:"

SYSTEM """
"""

ADAPTER ./adapter_model.bin
```
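As a quick sanity check before filing this kind of error (a hypothetical helper, not part of Ollama): relative `ADAPTER` paths are resolved against the directory where `ollama create` is run, so it can help to confirm the path actually resolves to a file from that directory.

```python
import os

# Hypothetical pre-flight check (not Ollama code): resolve the ADAPTER path
# the same way a relative path is resolved from the current working
# directory, and confirm the file exists before running `ollama create`.
adapter = "./adapter_model.bin"  # path as written in the Modelfile
resolved = os.path.abspath(os.path.expanduser(adapter))
print(resolved, os.path.isfile(resolved))
```

If this prints `False`, the error is a plain missing-file problem rather than the bug discussed below.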

@technovangelist commented on GitHub (Nov 26, 2023):

And llama2-uncensored:latest is on your machine, and the adapter file is in the same directory you're running that command from?


@meow-d commented on GitHub (Nov 26, 2023):

No, llama2-uncensored:latest was downloaded using `ollama pull`, but the adapter file is in the same place. I've also tried absolute paths, with the same result.


@BruceMacD commented on GitHub (Nov 27, 2023):

Hi @meow-d, this looks like #892, which was fixed in one of the recent releases. I'd suggest updating Ollama and trying again; you can do that by just running the install script again: `curl https://ollama.ai/install.sh | sh`


@meow-d commented on GitHub (Nov 29, 2023):

Thank you, but the problem is still there after I updated.


@mhunjaai commented on GitHub (Nov 30, 2023):

I got the same problem, @BruceMacD, on both Linux and macOS.
In v0.1.9 the problem does not occur when creating the model from a Modelfile with a local adapter path.
From v0.1.10 through the latest release, however, it fails with the same error as above. Maybe this helps.


@solablue commented on GitHub (Dec 3, 2023):

@BruceMacD

I am facing the same issue.

```
transferring context
reading model metadata
⠋ creating adapter layer  Error: open /@sha256:ff1527d49453147c6bd4a89ac61c8cb3948aea7d1787b2540330b5df7335e0ba: no such file or directory
```


@technovangelist commented on GitHub (Jan 3, 2024):

Are folks still experiencing this issue? We are now on 0.1.17, so I'm wondering if it has been solved. If not, perhaps we can get a copy of an adapter that is causing problems, and we can try to recreate the issue. I'm not sure what the right way to send that file over is, so ping here and we can coordinate. Thanks @solablue @meow-d @mhunjaai for being a great part of this remarkable community.


@PhilipAmadasun commented on GitHub (Jan 6, 2024):

@technovangelist I'm having the same "no such file or directory" problem, even after updating Ollama to 0.1.18. My `adapter_model.bin` is [here](https://huggingface.co/uyiosa/test_falcon_7b_model/tree/main). Oh, I realize what's happening: I'm using a server rather than my local machine for creation, so it's searching for the `.bin` file on the server machine.
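That observation matches the failure mode in the affected versions: the client sent the `ADAPTER` path as-is, and the *server* process tried to open it, so a file that exists on the client host fails with "no such file or directory" on a remote server. A minimal illustration of that split (hypothetical helper, not Ollama code):

```python
import os

# Illustration only: in the affected versions, the path written in the
# Modelfile was opened by the server process, not the client, so a path
# valid on the client machine can be missing on the server.
def server_side_open(path_from_client: str) -> str:
    if not os.path.isfile(path_from_client):
        # mirrors the reported failure mode
        return f"Error: open {path_from_client}: no such file or directory"
    return "ok"

print(server_side_open("./adapter_model.bin"))
```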


@mxyng commented on GitHub (Jan 18, 2024):

This should be fixed now (at least in >=0.1.20, possibly slightly earlier): creating a model (including adapters!) should now work against remote servers using local files. Please reopen the issue if that's not the case.
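A sketch of what that fix implies (hedged: the blob-upload flow and endpoint path are assumptions paraphrased from the fix description, not verified against the exact release): the client hashes the local adapter file and uploads the blob to the server by digest, so the server never needs filesystem access to the client's path.

```python
import hashlib

# Assumed client-side flow: compute the content digest of the local adapter
# file, upload the bytes to the server keyed by that digest (e.g. something
# like POST {host}/api/blobs/{digest}), then reference the digest in create.
def blob_digest(data: bytes) -> str:
    return "sha256:" + hashlib.sha256(data).hexdigest()

# e.g. data would be the bytes of adapter_model.bin
digest = blob_digest(b"")
print(digest)
```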


Reference: github-starred/ollama#47165