[GH-ISSUE #9067] Issue: On windows, when creating a variant from a submodel with a tag, it tries to make a folder with a ":". #31664

Closed
opened 2026-04-22 12:20:26 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @ZephinueCode on GitHub (Feb 13, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9067

What is the issue?

For example, running `ollama create test -f ./Modelfile` with `FROM deepseek-r1:14b` results in the system trying to create a folder named "deepseek-r1:14b", which is an illegal name on Windows: Windows does not allow ":" in file or folder names.
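For context on why the folder creation fails: Windows reserves the characters `<>:"/\|?*` in file and directory names, so a model tag like `deepseek-r1:14b` cannot be used verbatim as a folder name. Below is a minimal, hypothetical sketch (not ollama's actual code) of mapping a tag to a Windows-safe directory name by replacing the forbidden characters:

```javascript
// Hypothetical sketch: map a model tag like "deepseek-r1:14b" to a
// Windows-safe directory name by replacing the characters Windows
// forbids in file names (<>:"/\|?*). This is only an illustration of
// why the ":" in the tag breaks folder creation, not ollama's fix.
function toWindowsSafeDirName(modelTag) {
  return modelTag.replace(/[<>:"\/\\|?*]/g, "-");
}

console.log(toWindowsSafeDirName("deepseek-r1:14b")); // "deepseek-r1-14b"
```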

Relevant log output


OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.5.7

GiteaMirror added the bug label 2026-04-22 12:20:27 -05:00
Author
Owner

@YonTracks commented on GitHub (Feb 13, 2025):

Yep, same for me.
I only tried the API, but I haven't looked hard.
Good luck.

<!-- gh-comment-id:2656541043 -->
Author
Owner

@rick-github commented on GitHub (Feb 13, 2025):

Could you provide more context?

```
C:\Users\bill>ollama -v
ollama version is 0.5.7

C:\Users\bill>echo FROM deepseek-r1:14b > Modelfile

C:\Users\bill>ollama list test
NAME    ID    SIZE    MODIFIED

C:\Users\bill>ollama create test -f ./Modelfile
gathering model components
using existing layer sha256:6e9f90f02bb3b39b59e81916e8cfce9deb45aeaeb9a54a5be4414486b907dc1e
using existing layer sha256:369ca498f347f710d068cbb38bf0b8692dd3fa30f30ca2ff755e211c94768150
using existing layer sha256:6e4c38e1172f42fdbff13edf9a7a017679fb82b0fde415a3e8b3c31c6ed4a4e4
using existing layer sha256:f4d24e9138dd4603380add165d2b0d970bef471fac194b436ebd50e6147c6588
writing manifest
success

C:\Users\bill>ollama list test
NAME           ID              SIZE      MODIFIED
test:latest    72d94c4c6e1c    9.0 GB    9 hours ago
```

<!-- gh-comment-id:2656645712 -->
Author
Owner

@YonTracks commented on GitHub (Feb 13, 2025):

I used the API, via an old Next.js build with model creation from a Modelfile:

```js
const response = await fetch("http://localhost:11434/api/create", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ name, modelfile }),
});
```

with, for example:

```
FROM llama3.1
```

I only tried it because I saw this issue mentioned in another post.
I haven't looked into it yet; I will tomorrow.
Cheers.

<!-- gh-comment-id:2656904343 -->
Author
Owner

@YonTracks commented on GitHub (Feb 14, 2025):

Yep, it works great, cheers for your help, you are awesome. My issue was a ReadableStream problem and my bad error handling in a Next.js route.

Response snippet:

```
body: ReadableStream { locked: false, state: 'readable', supportsBYOB: true },
  bodyUsed: false,
```

I had to await the response body first, depending on whether it's streaming, etc.:

```js
await response.text()
```

Working:

```
body: ReadableStream { locked: true, state: 'closed', supportsBYOB: true },
  bodyUsed: true,
```
<!-- gh-comment-id:2658042342 -->
Author
Owner

@YonTracks commented on GitHub (Feb 14, 2025):

No, I'm wrong: I had to await the error, lol.
I tested the CLI using the Mario Modelfile example from the docs and it worked; I did that last night. Then, testing the API, I reused the same name without realizing it was the same Mario model, so I thought the API call had worked too. Anyway: the CLI works, but the API call doesn't (it will be my mistake, I'll sort it soon). Sorry, I am shockingly bad at communicating; you are awesome.
I added some error handling, so yep, I'll sort it still. Again, cheers, you are awesome, love.

```
Response Body: {"error":"neither 'from' or 'files' was specified","status":400}
```

<!-- gh-comment-id:2658073224 -->
Author
Owner

@YonTracks commented on GitHub (Feb 14, 2025):

I did exactly what the awesome error message told me:

```
Response Body: {"error":"neither 'from' or 'files' was specified","status":400}
```

So I investigated, and yep, lol, the API is awesome; I was doing it wrong. The old way did work before.

Before (not working): the issue is that `modelfile` should now be `from`.

```js
const response = await fetch("http://localhost:11434/api/create", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ name, modelfile }),
});
```

Now working, and much better:

```js
const response = await fetch("http://localhost:11434/api/create", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: name,
    from,
    system,
    parameters,
  }),
});
```

```
Creating model with: {
  name: 'llama3.2-mario',
  from: 'llama3.2',
  system: 'You are Mario from Super Mario Bros. Answer as Mario, the assistant, only.',
  parameters: { temperature: 1 }
}
Response Body: {"status":"using existing layer sha256:dde5aa3fc5ffc17176b5e8bdc82f587b24b2678c6c66101bf7da77af9f7ccdff"}
{"status":"using existing layer sha256:966de95ca8a62200913e3f8bfbf84c8494536f1b94b49166851e76644e966396"}
{"status":"using existing layer sha256:fcc5a6bec9daf9b561a68827b67ab6088e1dba9d1fa2a50d7bbcc8384e0a265d"}
{"status":"using existing layer sha256:a70ff7e570d97baaf4e62ac6e6ad9975e04caa6d900d3742d37698494479e0cd"}
{"status":"creating new layer sha256:26e8a9fb842226088ea277005780fe69312e5b2ff9a52d4e0bac97c27299f405"}
{"status":"creating new layer sha256:7fa4d1c192726882c2c46a2ffd5af3caddd99e96404e81b3cf2a41de36e25991"}
{"status":"writing manifest"}
{"status":"success"}

 POST /api/create 200 in 49ms
 GET /api/models 200 in 27ms
 ✓ Compiled in 802ms (2297 modules)
```

Cheers ollama. Epic.

<!-- gh-comment-id:2658115752 -->
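As the log above shows, `/api/create` streams its status as newline-delimited JSON, one object per line. A minimal sketch of parsing such a payload after `await response.text()` (an illustration, not code from this thread):

```javascript
// Minimal sketch: after `await response.text()` on a streaming
// endpoint like /api/create, the body is newline-delimited JSON.
// Split on newlines, skip blanks, and parse each line separately.
function parseNdjson(text) {
  return text
    .split("\n")
    .map((line) => line.trim())
    .filter((line) => line.length > 0)
    .map((line) => JSON.parse(line));
}

const body = '{"status":"writing manifest"}\n{"status":"success"}\n';
const events = parseNdjson(body);
console.log(events[events.length - 1].status); // "success"
```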
Reference: github-starred/ollama#31664