[GH-ISSUE #4503] Ollama create fails when using a utf16 Modelfile #28580

Closed
opened 2026-04-22 06:53:00 -05:00 by GiteaMirror · 22 comments
Owner

Originally created by @dehlong on GitHub (May 17, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4503

What is the issue?

Hello,
I'm trying to create a new model, and no matter what the Modelfile is, 90% of the time I get:
Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message"

Is there any solution to this?
This is my modelfile:
FROM llama3
PARAMETER temperature 1
PARAMETER num_ctx 4096
SYSTEM You are Mario from super mario bros, acting as an assistant.

OS

Linux

GPU

Other

CPU

Intel

Ollama version

0.1.38

GiteaMirror added the bug label 2026-04-22 06:53:00 -05:00
Author
Owner

@brt-yilmaz commented on GitHub (May 17, 2024):

Be careful about the spaces. First, try copying the Modelfile string from https://github.com/ollama/ollama/blob/main/docs/api.md#request-13 and then customise it.

Author
Owner

@pdevine commented on GitHub (May 18, 2024):

@dehlong I just created a model using that modelfile and it worked fine. Was there anything else you had added into it?

Author
Owner

@pdevine commented on GitHub (May 18, 2024):

% ollama create -f ~/Modelfiles/Mariomodelfile pdevine/mario
transferring model data
using existing layer sha256:00e1317cbf74d901080d7100f57580ba8dd8de57203072dc6f668324ba545f29
using existing layer sha256:4fa551d4f938f68b8c1e6afa9d28befb70e3f33f75d0753248d530364aeea40f
using existing layer sha256:8ab4849b038cf0abc5b1c9b8ee1443dca6b93a045c2272180d985126eb40bf6f
creating new layer sha256:278f3e552ef89955f0e5b42c48d52a37794179dc28d1caff2d5b8e8ff133e158
creating new layer sha256:40440ec37ef2b2862d182b7926987668264d13ff9c97407acf36a44106997f8f
creating new layer sha256:bd886c34b18cdaf5ed2d30acb3de0d3010c5546c6b1d259d5e75b2efe4a85c70
writing manifest
success
% ollama run pdevine/mario
>>> hi there
"It's-a me, Mario! Welcome to our little corner of the Mushroom Kingdom! I'm-a here to help you with anything you might need. Whether it's rescuing Princess Peach
from Bowser or finding the hidden Warp Pipes, I'm your guy! So, what seems to be the problem? Need some power-ups or maybe a map to the nearest castle?"

The Modelfile looks like:

FROM llama3
PARAMETER temperature 1
PARAMETER num_ctx 4096
SYSTEM You are Mario from super mario bros, acting as an assistant.
Author
Owner

@dehlong commented on GitHub (May 19, 2024):

Nope, it was just this.

Author
Owner

@namoench commented on GitHub (May 19, 2024):

I've been having the same issue on Windows for a few days.

At this point I'm sure it's not an issue with the model file.
If I use ollama show --modelfile to copy the Modelfile from an existing model and then run ollama create test58 -f ./Modelfile, I get the same Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message" that Dehlong does.

I've uninstalled and reinstalled ollama. Using OpenWebUI model builder does work, but I need to use ollama create for reasons.

@dehlong I'll let you know if I trip into an answer but at this point I feel pretty stuck.

For clarity this has been working for me previously. I've used ollama create successfully dozens of times. It broke for me with the last update.

Author
Owner

@pdevine commented on GitHub (May 19, 2024):

I'm wondering if this is just a problem with MS-DOS-style files adding a carriage return + line feed to the end of each line. Can either of you attach the file to the issue? You should be able to drag and drop it into a comment.
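(Not from the thread: one quick way to check for this kind of problem yourself is to sniff the file's first bytes for a byte-order mark, which reveals a UTF-16 file immediately. A minimal Python sketch; the filename is just an example.)

```python
# Guess a file's encoding by sniffing its first bytes for a byte-order mark (BOM).
# Windows tools (Notepad "Unicode", Windows PowerShell's Out-File) often write UTF-16 LE.
def sniff_encoding(path):
    with open(path, "rb") as f:
        head = f.read(4)
    if head.startswith(b"\xef\xbb\xbf"):
        return "utf-8-sig"
    if head.startswith(b"\xff\xfe"):
        return "utf-16-le"
    if head.startswith(b"\xfe\xff"):
        return "utf-16-be"
    return "no BOM (plain ASCII/UTF-8, or something else)"
```

For example, sniff_encoding("Modelfile") on a file saved by Windows PowerShell 5's Out-File would typically report utf-16-le, while a file written by a Linux editor would report no BOM.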

Author
Owner

@namoench commented on GitHub (May 19, 2024):

Modelfile.zip

Does this work?
I realize now the file type is showing up in Windows Explorer as a Program PhotonWorkShop file (the Anycubic slicer). It's unclear to me whether this is a problem, and unclear whether it's likely to be the same issue Dehlong has, considering he's on Linux.

Edit: to be clear, I'm having the same problem with other Modelfiles that show up as type '3 file' and '6 file'.

(screenshot attached)

Author
Owner

@pdevine commented on GitHub (May 19, 2024):

@duck1y Perfect. I was able to reproduce the problem using that file. Will try to sort out a fix now.

Author
Owner

@pdevine commented on GitHub (May 19, 2024):

The problem turns out to be that the file is UTF-16 and we're trying to parse it as UTF-8. As a temporary workaround, you can convert the file in PowerShell with:

powershell "Get-Content 'Modelfile' | Out-File 'Newmodelfile' -Encoding ascii"

I have a fix I'm working on, which hopefully we can get into 0.1.39.
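(Not from the thread: for anyone not on Windows, or without PowerShell handy, the same conversion can be sketched in Python. The filenames below just mirror the PowerShell example above; this is an illustrative workaround, not ollama code.)

```python
# Re-encode a UTF-16 Modelfile as UTF-8 so affected ollama versions can parse it.
def convert_to_utf8(src, dst):
    with open(src, "rb") as f:
        data = f.read()
    # Honor a UTF-16 BOM if present; otherwise assume the file is already UTF-8.
    if data.startswith((b"\xff\xfe", b"\xfe\xff")):
        text = data.decode("utf-16")
    else:
        text = data.decode("utf-8")
    # Normalize Windows line endings while we're at it.
    text = text.replace("\r\n", "\n")
    with open(dst, "w", encoding="utf-8", newline="\n") as f:
        f.write(text)
```

Usage would look like convert_to_utf8("Modelfile", "Newmodelfile"), after which ollama create can be pointed at the new file.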

Author
Owner

@namoench commented on GitHub (May 19, 2024):

@pdevine the workaround worked for me - thank you so much!! I appreciate you!!!

Is it clear that this is likely the source of @dehlong 's issue as well?
@dehlong can you confirm this workaround works for you? I would hate to have hijacked your thread and left you without a solution.

Ty both :):)

Author
Owner

@pdevine commented on GitHub (May 19, 2024):

@duck1y it's almost certainly the same problem.

Author
Owner

@Anorid commented on GitHub (May 20, 2024):

@pdevine If you have time, could you also take a look at this error?

Author
Owner

@pdevine commented on GitHub (May 20, 2024):

@Anorid can you create a new issue for that problem and post the Modelfile along with any relevant info (like if you're trying to use a converted model where you got the weights from)? This is definitely a different issue than the one you posted.

Author
Owner

@Anorid commented on GitHub (May 20, 2024):

I've created a new issue and posted the relevant information
#4529

Author
Owner

@Patrickeik commented on GitHub (May 25, 2024):

Sorry, I tried to do as instructed but I still get the error: get-content LLMteacher-modelfile | out-file LLMtest -Encoding ascii
PS C:\Users\p\Documents\testing> ollama create LLMTeacher -f LLMtest

Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message"

windows 11

What am I doing wrong?

Author
Owner

@Patrickeik commented on GitHub (May 28, 2024):

here are more details:
PS C:\Users\peike\ollama> ollama show llama3 --modelfile > test
PS C:\Users\peike\ollama> ollama create ll3teacher -f test

Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message"
PS C:\Users\peike\ollama> Get-Content test | Out-File newtest -Encoding ascii
PS C:\Users\peike\ollama> ollama create ll3teacher -f newtest

Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message"
PS C:\Users\peike\ollama>

Author
Owner

@donaldafeith commented on GitHub (May 30, 2024):

For anyone having problems creating a model (on Windows):

1. Create a text file containing:
   FROM ./themodelfileyouwant
2. Save the text file.
3. Rename it to Modelfile (no .txt extension).
4. In cmd, run:
   ollama create thenameyouwanttonameyourmodel -f Modelfile
5. Hit enter and wait for the magic.

Not sure why this isn't explained in a straightforward manner anywhere, but that's how I do it. Works great.

Author
Owner

@pdevine commented on GitHub (May 30, 2024):

This should be fixed with 0.1.39, which can now parse a UTF-16 file (albeit only with 8-bit characters). There's another PR coming to allow UTF-16 characters in the Modelfile itself.
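(Not from the thread: the general shape of such a fix can be sketched as decoding the bytes, BOM and all, before the line-based parser ever sees them. This is a Python illustration of the idea, not ollama's actual Go implementation.)

```python
# Sketch: decode Modelfile bytes up front, tolerating UTF-8 or UTF-16 input,
# so the parser sees "FROM ..." instead of raw BOM-prefixed UTF-16 bytes.
def read_modelfile(data: bytes) -> str:
    if data.startswith((b"\xff\xfe", b"\xfe\xff")):
        return data.decode("utf-16")      # BOM tells us the byte order
    if data.startswith(b"\xef\xbb\xbf"):
        return data.decode("utf-8-sig")   # strip a UTF-8 BOM if present
    return data.decode("utf-8")

def first_command(data: bytes) -> str:
    # Without the decoding step, a UTF-16 file's first "word" is BOM garbage,
    # which is why the parser rejected it as an unknown command.
    return read_modelfile(data).split()[0].lower()
```

With this in place, the same Modelfile parses identically whether it was saved as UTF-8 on Linux or UTF-16 by a Windows editor.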

Author
Owner

@Patrickeik commented on GitHub (May 30, 2024):

Thank you both! The workaround helped me!

Author
Owner

@Ajun0302 commented on GitHub (Oct 29, 2024):

PS C:\Users\32960> cd .\Desktop
PS C:\Users\32960\Desktop> cd .\iict
PS C:\Users\32960\Desktop\iict> cd .\lightRAG
PS C:\Users\32960\Desktop\iict\lightRAG> powershell "Get-Content 'Modelfile' | Out-File 'Newmodelfile' -Encoding ascii"
PS C:\Users\32960\Desktop\iict\lightRAG> ollama create qwen2m -f Newmodelfile

Error: command must be one of "from", "license", "template", "system", "adapter", "parameter", or "message"
PS C:\Users\32960\Desktop\iict\lightRAG>

Author
Owner

@Ajun0302 commented on GitHub (Oct 29, 2024):

How do I do the "FROM ./themodelfileyouwant" step?

Author
Owner

@Moxie15 commented on GitHub (Dec 13, 2024):

I'm still having the problem on Windows 11 (screenshots attached).

Reference: github-starred/ollama#28580