[GH-ISSUE #10087] Support of WSL1: cpu-only mode should probably work? #53124

Closed
opened 2026-04-29 02:01:06 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @vadimkantorov on GitHub (Apr 2, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10087

This is useful for some basic dev tinkering before moving to a proper (and more expensive) GPU-enabled machine. I'm using WSL1 instead of WSL2 on my laptop because it's less resource-heavy and does not require Hyper-V. So it would be nice to have some basic ollama functionality working even in this setup.

Should CPU-only mode already work as-is?

If the systemd service is not supported on WSL1, could ollama still run as separate server and client binaries? (Or, for the most basic REPL query-box test, even as a single synchronous, one-query-at-a-time server+client.)


Here is what I got when using the Linux install command on my WSL1 + Ubuntu 24.04 setup:

```
$ curl -fsSL https://ollama.com/install.sh | sh
ERROR: Microsoft WSL1 is not currently supported. Please use WSL2 with 'wsl --set-version <distro> 2'
```
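For context, a plausible way the installer could distinguish WSL1 from WSL2 is by inspecting the kernel release string (this is a hedged sketch, not the actual `install.sh` logic): WSL1 kernels typically end in `-Microsoft`, while WSL2 kernels contain `microsoft-standard`.

```shell
#!/bin/sh
# Sketch of a WSL-version check (assumption: real install.sh may differ).
# Takes a kernel release string like the output of `uname -r`.
detect_wsl() {
  case "$1" in
    *microsoft-standard*) echo "wsl2"   ;;  # WSL2 kernel naming
    *Microsoft*)          echo "wsl1"   ;;  # WSL1 kernel naming
    *)                    echo "native" ;;  # plain Linux
  esac
}

detect_wsl "4.4.0-19041-Microsoft"               # prints wsl1
detect_wsl "5.15.167.4-microsoft-standard-WSL2"  # prints wsl2
detect_wsl "$(uname -r)"                         # whatever this host is
```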
GiteaMirror added the question label 2026-04-29 02:01:06 -05:00
Author
Owner

@dhiltgen commented on GitHub (Jul 5, 2025):

Since WSL1 does not support GPUs, this installer check is by design: it prevents users from accidentally setting things up incorrectly and getting frustrated when their GPU does not work.

The ollama binary should work inside WSL1 in CPU-only mode, so you can follow the manual install instructions if you like:
https://github.com/ollama/ollama/blob/main/docs/linux.md#manual-install
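The manual install from those docs boils down to roughly the following (a sketch paraphrased from docs/linux.md; the tarball URL and the `llama3.2` model name are examples and may change), which also answers the separate server/client question above, since no systemd unit is needed:

```shell
# Download and unpack the standalone Linux build (no systemd involved).
curl -L https://ollama.com/download/ollama-linux-amd64.tgz -o ollama-linux-amd64.tgz
sudo tar -C /usr -xzf ollama-linux-amd64.tgz

# Run the server and client as two ordinary processes:
ollama serve &        # server in the background (a separate terminal works too)
ollama run llama3.2   # client REPL talking to the local server
```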


Reference: github-starred/ollama#53124