[GH-ISSUE #3916] Error: The parameter is incorrect. #2428

Closed
opened 2026-04-12 12:44:18 -05:00 by GiteaMirror · 29 comments

Originally created by @aaamoon on GitHub (Apr 25, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3916

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

[GIN] 2024/04/26 - 01:24:28 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/04/26 - 01:24:28 | 200 | 1.1779ms | 127.0.0.1 | POST "/api/show"
[GIN] 2024/04/26 - 01:24:28 | 200 | 1.4496ms | 127.0.0.1 | POST "/api/show"
{"function":"process_single_task","level":"INFO","line":1510,"msg":"slot data","n_idle_slots":1,"n_processing_slots":0,"task_id":7,"tid":"28416","timestamp":1714065868}
{"function":"log_server_request","level":"INFO","line":2741,"method":"GET","msg":"request","params":{},"path":"/health","remote_addr":"127.0.0.1","remote_port":53568,"status":200,"tid":"32280","timestamp":1714065868}
[GIN] 2024/04/26 - 01:24:28 | 200 | 2.2784ms | 127.0.0.1 | POST "/api/chat"

![image](https://github.com/ollama/ollama/assets/25700476/0a4c78aa-6a17-4a04-be93-d0cfec776682)

OS

Windows

GPU

Nvidia

CPU

Intel

Ollama version

0.1.32

GiteaMirror added the bug and windows labels 2026-04-12 12:44:18 -05:00

@aaamoon commented on GitHub (Apr 28, 2024):

It works fine if I run the command from a terminal in VSCode.


@dhiltgen commented on GitHub (May 1, 2024):

I'm not able to reproduce this. Can you explain your setup a bit more?


@i486 commented on GitHub (May 1, 2024):

Are you on Windows 7?


@dhiltgen commented on GitHub (May 2, 2024):

My suspicion is this is somehow locale based. We attempt to put the terminal into a mode that supports control characters.
This appears not to be working, which is why it isn't rendering the spinner. Unfortunately the error `Error: The parameter is incorrect.` isn't particularly specific and doesn't come from our code, so some underlying library or Windows API is reporting it, and I'm not sure where yet.

Could someone who's seeing this try running through the various commands in the CLI to see which of them trigger this error, and which (if any) don't? Maybe that will help narrow the search to find the problem.


@i486 commented on GitHub (May 2, 2024):

I'm currently in the process of testing this.

For me, it seems like most commands work fine. Only the "run" command is affected by the "Error: The parameter is incorrect." error.

![ollama0130-1](https://github.com/ollama/ollama/assets/625325/a6f595c5-129c-4117-92f5-ced18cef5a86)

![ollama0130-2](https://github.com/ollama/ollama/assets/625325/c731488d-1f3a-4180-ad2a-f940e8fd8d20)

![ollama0130-3](https://github.com/ollama/ollama/assets/625325/dd587c49-a934-4521-be9b-3cda244e7043)

As you can see, I am on Windows 7, which is why you see these (←[?25h←[?25l) artifacts. Ignore them, as they seem to be Windows 7 specific. Additionally, I previously tested these commands on versions 0.1.31 and 0.1.32, but forgot to take screenshots.
Anyway, I remember being able to execute "ollama run model" to start a model and begin a conversation in some earlier version of Ollama. I'm uncertain which Ollama version introduced this parameter error, as I primarily use GUI apps, so I'm currently downgrading Ollama version by version to pinpoint when this error started occurring.

PS: I'm aware that Ollama doesn't officially support Windows 7, but since this specific issue was also reported by some Windows 10 users, I thought it might help. Also, please let me know if you need more commands to test, as I'm kinda new to Ollama's command line.


@i486 commented on GitHub (May 2, 2024):

So it seems this parameter bug started occurring from version 0.1.30.

![image](https://github.com/ollama/ollama/assets/625325/4d9b2767-d35e-47b0-a252-54d30f0cf622)

By the way, it would be great if you could address this ←[?25l⠙ ←[?25h←[?25l←[2K←[1G⠹ ←[?25h←[?25l issue for us Windows 7 users. It seems that only one Windows 10-only API is preventing the new version of Golang from running under Windows 7. Otherwise, Go itself and its compiled apps run smoothly under Windows 7 after extending the kernel with [VxKex](https://github.com/vxiiduu/VxKex).


@dhiltgen commented on GitHub (May 2, 2024):

@i486 thanks for that extra detail! As far as Windows 7 support goes, I think we'd be open to clean/simple PRs that fix bugs which help people running on Windows 7, but the core maintainers probably don't have bandwidth to focus on these.

@nb001 can you confirm you see the same on Windows 10?


@aaamoon commented on GitHub (May 6, 2024):

> Reference

I am on Windows 10.


@aaamoon commented on GitHub (May 6, 2024):

> So it seems this parameter bug started occurring from version 0.1.30.
>
> ![image](https://github.com/ollama/ollama/assets/625325/4d9b2767-d35e-47b0-a252-54d30f0cf622)
>
> By the way, it would be great if you could address this ←[?25l⠙ ←[?25h←[?25l←[2K←[1G⠹ ←[?25h←[?25l issue for us Windows 7 users. It seems that only one Windows 10-only API is preventing the new version of Golang from running under Windows 7. Otherwise, Go itself and its compiled apps run smoothly under Windows 7 after extending the kernel with [VxKex](https://github.com/vxiiduu/VxKex)

@i486 @dhiltgen I am on Windows 10. Only the "run" command is affected by the "Error: The parameter is incorrect" error when I use version 0.1.32. It works fine when I use version 0.1.29.


@i486 commented on GitHub (May 6, 2024):

@aaamoon

That's odd. I noticed that the screenshot in your first post looks just like mine (←[?25l⠙ ←[?25h) when I'm using Windows 7.

I tried it on Windows 10 recently, and I didn't have those issues with the screen artifacts or the "Parameter is incorrect" error.

Are you using an older version of Windows 10 or something?

Anyway, could you please share the server.log file located in %LOCALAPPDATA%\Ollama?


@aaamoon commented on GitHub (May 6, 2024):

@i486 I am on Windows 10, version 21H1 (Professional).

This seems to be a problem with Windows itself; there is no obvious error in server.log.

When I use the "run" command, the log outputs the following:

[GIN] 2024/04/26 - 01:24:28 | 200 | 0s | 127.0.0.1 | HEAD "/"
[GIN] 2024/04/26 - 01:24:28 | 200 | 1.1779ms | 127.0.0.1 | POST "/api/show"
[GIN] 2024/04/26 - 01:24:28 | 200 | 1.4496ms | 127.0.0.1 | POST "/api/show"
{"function":"process_single_task","level":"INFO","line":1510,"msg":"slot data","n_idle_slots":1,"n_processing_slots":0,"task_id":7,"tid":"28416","timestamp":1714065868}
{"function":"log_server_request","level":"INFO","line":2741,"method":"GET","msg":"request","params":{},"path":"/health","remote_addr":"127.0.0.1","remote_port":53568,"status":200,"tid":"32280","timestamp":1714065868}
[GIN] 2024/04/26 - 01:24:28 | 200 | 2.2784ms | 127.0.0.1 | POST "/api/chat"


@dhiltgen commented on GitHub (May 7, 2024):

@aaamoon that's a good data point. From what I can tell, it looks like Win 10 21H1 is out of support. Are other people seeing this problem also running older updates of Windows? If you update to Win 10 22H2 does the problem go away?


@aaamoon commented on GitHub (May 8, 2024):

@dhiltgen I have updated to Win 10 22H2, and the problem went away.


@MrDoe commented on GitHub (Jun 4, 2024):

Not for me. I'm using Win 10 22H2, and since I installed all the new Windows updates and the Ollama update yesterday, this issue suddenly came up.
A workaround is to use the new [Terminal app](https://apps.microsoft.com/detail/9n0dx20hk701).


@264312431 commented on GitHub (Jun 21, 2024):

I have the same problem on an older Win 10 version with cmd.exe/clink/readline.
The encoding is fine, but running "ollama run x" (without passing any input) will always crash with "Error: The parameter is incorrect."

I gave it a quick look in IDA, and it's caused by the ENABLE_VIRTUAL_TERMINAL_INPUT flag:

https://github.com/ollama/ollama/blob/c7c2f3bc228714988c650d4c238a08510f58e8cb/readline/term_windows.go#L27

Removing that fixes it for me.


@i486 commented on GitHub (Jun 21, 2024):

> i have the same problem on older win10 version with cmd.exe/clink/readline. the encoding is fine but running "ollama run x" (without passing any input) will always crash with "Error: The parameter is incorrect."
>
> I gave it a quick look in ida, and it's caused by the ENABLE_VIRTUAL_TERMINAL_INPUT flag:
>
> https://github.com/ollama/ollama/blob/c7c2f3bc228714988c650d4c238a08510f58e8cb/readline/term_windows.go#L27
>
> removing that fixes it for me.

Does this also fix ←[?25l⠙ ←[?25h←[?25l←[2K←[1G⠹ ←[?25h←[?25l?


@264312431 commented on GitHub (Jun 23, 2024):

idk, but you can try my patch if you want: https://gofile.io/d/Hg0iES
It looks more like some encoding error, though; maybe try chcp 65001, 850, or 1252(?)


@i486 commented on GitHub (Jun 24, 2024):

> idk, but you can try my patch if you want https://gofile.io/d/Hg0iES but it looks more like some encoding error, maybe try chcp 65001 or 850 or 1252(?)

I can confirm the `Error: The parameter is incorrect.` error no longer appears with your patched Ollama.

![image](https://github.com/ollama/ollama/assets/625325/8a0f3f96-4f03-478d-a889-7da92ecc2602)

Nice job investigating and sharing the solution.


@justinsverige commented on GitHub (Jul 27, 2024):

> idk, but you can try my patch if you want https://gofile.io/d/Hg0iES but it looks more like some encoding error, maybe try chcp 65001 or 850 or 1252(?)

Your patch is not public; can I get access to it, please? :)


@264312431 commented on GitHub (Jul 28, 2024):

> you patch is not public , can I get access to it please :)

https://pixeldrain.com/u/SSL46cTR


@iplayfast commented on GitHub (Jan 24, 2025):

I can confirm that Ollama 0.5.7 (Jan 23, 2025) still has this error on a Windows 11 box.


@mabdulre9 commented on GitHub (Jan 31, 2025):

I have found a cause of this problem (`Error: The parameter is incorrect.`): there is an instruction-set extension called AVX2 (Advanced Vector Extensions 2), built into Intel processors starting with the 4th generation. It is required to run any models on Ollama. If your processor is older than a 4th-generation i3, i5, or i7, you will get this error.


@GoodStudy-DayUp commented on GitHub (Feb 4, 2025):

Ollama 0.5.7 still has this error on a Windows 11 box.

![Image](https://github.com/user-attachments/assets/25311fab-b11b-4e61-910c-ff785523d4cb)


@dhiltgen commented on GitHub (Feb 10, 2025):

Unfortunately this error message is a generic Windows error message which can come from many different places. On the topic of AVX, the next release (0.5.8) should perform better on CPUs without AVX support.

@GoodStudy-DayUp please open a new issue and include server logs so we can evaluate why deepseek isn't running on your system. https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md


@airkg215 commented on GitHub (Feb 11, 2025):

I'm on Windows 10 Enterprise. Whether I run it in cmd, PowerShell, or the VS Code terminal, I always get "The parameter is incorrect". Heartbroken...

![Image](https://github.com/user-attachments/assets/72902f1f-6e9d-4f1e-bd11-f1f9817f09cc)


@justinsverige commented on GitHub (Feb 11, 2025):

That's probably a CPU problem in your case; try a different computer...


@dhiltgen commented on GitHub (Feb 11, 2025):

@airkg215 server logs may help us understand what's going wrong.

https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md


@airkg215 commented on GitHub (Feb 14, 2025):

> @airkg215 server logs may help us understand what's going wrong.
>
> https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md

2025/02/13 09:02:23 routes.go:1187: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: HTTPS_PROXY: HTTP_PROXY: NO_PROXY: OLLAMA_DEBUG:false OLLAMA_FLASH_ATTENTION:false OLLAMA_GPU_OVERHEAD:0 OLLAMA_HOST:http://127.0.0.1:11434 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_KV_CACHE_TYPE: OLLAMA_LLM_LIBRARY: OLLAMA_LOAD_TIMEOUT:5m0s OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:F:\Ollama OLLAMA_MULTIUSER_CACHE:false OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://* vscode-webview://*] OLLAMA_SCHED_SPREAD:false ROCR_VISIBLE_DEVICES:]"
time=2025-02-13T09:02:23.980+08:00 level=INFO source=images.go:432 msg="total blobs: 5"
time=2025-02-13T09:02:23.981+08:00 level=INFO source=images.go:439 msg="total unused blobs removed: 0"
time=2025-02-13T09:02:23.984+08:00 level=INFO source=routes.go:1238 msg="Listening on 127.0.0.1:11434 (version 0.5.7)"
time=2025-02-13T09:02:23.987+08:00 level=INFO source=routes.go:1267 msg="Dynamic LLM libraries" runners="[rocm_avx cpu cpu_avx cpu_avx2 cuda_v11_avx cuda_v12_avx]"
time=2025-02-13T09:02:23.987+08:00 level=INFO source=gpu.go:226 msg="looking for compatible GPUs"
time=2025-02-13T09:02:23.987+08:00 level=INFO source=gpu_windows.go:167 msg=packages count=1
time=2025-02-13T09:02:23.987+08:00 level=INFO source=gpu_windows.go:214 msg="" package=0 cores=6 efficiency=0 threads=6
time=2025-02-13T09:02:24.148+08:00 level=INFO source=gpu.go:392 msg="no compatible GPUs were discovered"
time=2025-02-13T09:02:24.148+08:00 level=INFO source=types.go:131 msg="inference compute" id=0 library=cpu variant=avx2 compute="" driver=0.0 name="" total="15.8 GiB" available="12.3 GiB"

What's the problem with my system?


@dhiltgen commented on GitHub (Feb 22, 2025):

@airkg215 thanks for the server logs. I don't see anything obvious in there - the server seems to be healthy, so it does appear to be a bug specific to the client. Unfortunately this error is a generic error, so I'm not sure where it's coming from yet. We can try one additional experiment to confirm the server is working properly, and maybe help narrow down where the client is hitting a problem.

Try running

ollama pull deepseek-r1:1.5b

if that gives the same error, try the following instead to use the API

(Invoke-WebRequest -method POST -Body '{"model":"deepseek-r1:1.5b","stream": false}' -uri http://localhost:11434/api/pull ).Content | ConvertFrom-json

Once the model is present, then try running it via the API:

 (Invoke-WebRequest -method POST -Body '{"model":"deepseek-r1:1.5b", "prompt":"Why is the sky blue?", "stream": false}' -uri http://localhost:11434/api/generate ).Content | ConvertFrom-json
Reference: github-starred/ollama#2428