[GH-ISSUE #2529] Ollama Windows is much slower at inference than Ollama on WSL2 #27241

Closed
opened 2026-04-22 04:24:28 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @devinprater on GitHub (Feb 16, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2529

CPU: AMD 5500U with Radeon integrated GPU. Ollama runs in CPU mode on both WSL2 and Windows. Attached are the logs from Windows and Linux.
[server.log](https://github.com/ollama/ollama/files/14303692/server.log)
[ollama-log-linux.log](https://github.com/ollama/ollama/files/14303696/ollama-log-linux.log)

GiteaMirror added the windows label 2026-04-22 04:24:28 -05:00

@Pey-crypto commented on GitHub (Feb 16, 2024):

Currently ollama only searches for NVIDIA- and AMD-based libraries. In the file server.log, line 69 shows the search paths for the NVIDIA libraries:

`time=2024-02-15T14:08:41.094-06:00 level=DEBUG source=gpu.go:280 msg="gpu management search paths: `

but none were detected for your AMD system. You can see that it has not detected any GPU on line 70:

`msg="Discovered GPU libraries: []"`

~~I don't think ollama supports AMD-based GPUs for now.~~ ~~I stand corrected.~~ I stand recorrected.
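The discovery step described above (walking a set of search paths and globbing for GPU management libraries, then logging an empty list) can be sketched in Go. This is a loose, illustrative re-implementation, not ollama's actual `gpu.go` code; the search paths and library pattern here are made up for the example:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// discoverGPULibs loosely mimics the behavior seen in the logs: glob each
// candidate directory for a GPU management library. An empty result is what
// produces the `Discovered GPU libraries: []` log line, after which
// inference falls back to CPU.
func discoverGPULibs(searchPaths []string, pattern string) []string {
	var found []string
	for _, dir := range searchPaths {
		matches, err := filepath.Glob(filepath.Join(dir, pattern))
		if err != nil {
			continue // malformed pattern; skip this path
		}
		found = append(found, matches...)
	}
	return found
}

func main() {
	// Simulate the reporter's situation: search a directory that contains
	// no GPU management libraries at all.
	dir, err := os.MkdirTemp("", "gpu-search")
	if err != nil {
		panic(err)
	}
	defer os.RemoveAll(dir)

	libs := discoverGPULibs([]string{dir}, "rocm_smi64.dll")
	fmt.Printf("Discovered GPU libraries: %v\n", libs)
}
```

Running this prints an empty library list, matching the log line quoted above.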


@wolfgang-azevedo commented on GitHub (Feb 16, 2024):

`time=2024-02-16T12:44:05.907+04:00 level=INFO source=gpu.go:308 msg="Discovered GPU libraries: []"`
`time=2024-02-16T12:44:05.907+04:00 level=INFO source=gpu.go:262 msg="Searching for GPU management library rocm_smi64.dll"`

ROCm is used for AMD GPUs; please check whether you have a compatible GPU, otherwise it will fall back to CPU.

https://rocm.docs.amd.com/en/latest/
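The log lines above show ollama probing for the ROCm management DLL and falling back to CPU when it is absent. A minimal sketch of that probe-then-fallback decision in Go follows; the directory path is an assumption for illustration, not ollama's actual search list:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasROCm reports whether rocm_smi64.dll is present in any of the given
// directories. The check mirrors the "Searching for GPU management library"
// step in the logs: if the DLL cannot be found, GPU offload is unavailable.
func hasROCm(dirs []string) bool {
	for _, d := range dirs {
		if _, err := os.Stat(filepath.Join(d, "rocm_smi64.dll")); err == nil {
			return true
		}
	}
	return false
}

func main() {
	// Hypothetical install location used only for this example.
	candidates := []string{`C:\Program Files\AMD\ROCm\bin`}
	if hasROCm(candidates) {
		fmt.Println("ROCm detected: GPU offload possible")
	} else {
		fmt.Println("ROCm not found: falling back to CPU")
	}
}
```

On a machine without the DLL in that directory (as in this issue), the program reports the CPU fallback.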


@devinprater commented on GitHub (Feb 16, 2024):

I don't mind if it's on CPU. On Linux it works fine on CPU; on Windows it's slow on CPU.


@Inovvia commented on GitHub (Feb 17, 2024):

I have installed ROCm/HIP for Windows, but I don't see rocm_smi64.dll listed in the bin folder. Additionally, according to the rocm_smi GitHub repo, it is a:

> C library for Linux

![image](https://github.com/ollama/ollama/assets/70137651/05b4f6f0-5bca-494c-9fbd-975871f70460)


@dhiltgen commented on GitHub (Feb 19, 2024):

Radeon cards are not yet supported by our native windows app. We'll track adding that support in #2598


@zhiyuan1i commented on GitHub (Feb 21, 2024):

I run ollama on CPU in both WSL2 and native Windows, but the Windows client is twice as slow as WSL2.


Reference: github-starred/ollama#27241