[GH-ISSUE #9796] Unable to load CUDA drivers on 0.6.1 #6406

Open
opened 2026-04-12 17:57:09 -05:00 by GiteaMirror · 5 comments
Owner

Originally created by @mustyoshi on GitHub (Mar 16, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9796

What is the issue?

Using 550.90.07 on a Proxmox CT, I get error code 999 when starting Ollama.
I have the GPU passed through from the host to the CT.
I downgraded to 0.6.0 and it works perfectly.

![Image](https://github.com/user-attachments/assets/efae2ac5-80de-4cee-834d-db451f795f71)

Relevant log output


OS

Linux

GPU

Nvidia

CPU

AMD

Ollama version

0.6.1

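CUDA error 999 (`CUDA_ERROR_UNKNOWN`) inside a container frequently means the NVIDIA device nodes are not visible in the CT. A quick in-container check, as a sketch (the paths are the standard NVIDIA device nodes created by the host driver; adjust for your setup):

```shell
# Inside the Proxmox CT: verify the NVIDIA device nodes are present.
# A missing node (especially /dev/nvidia-uvm) commonly produces CUDA error 999.
ls -l /dev/nvidia0 /dev/nvidiactl /dev/nvidia-uvm /dev/nvidia-uvm-tools

# Confirm the driver inside the CT reports the same version as the host
# (550.90.07 in this report).
nvidia-smi
```

If `ls` reports "No such file or directory" for any of these, the problem is the container's device passthrough rather than the Ollama version.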
GiteaMirror added the bug label 2026-04-12 17:57:09 -05:00

@hkthomas commented on GitHub (Mar 17, 2025):

This is a big bug; an A100 also won't load.


@HuronExplodium commented on GitHub (Mar 17, 2025):

Yep, same here. Missing some lib or something?


@liamsmith86 commented on GitHub (Mar 17, 2025):

Yeah, same here on a 3090.


@mustyoshi commented on GitHub (Apr 14, 2025):

This still seems to be occurring on 0.6.5.


@chpego commented on GitHub (Apr 15, 2025):

> Seems to be occurring on 0.6.5 still

Could you tell us more about your CT? I was in the same situation, but I fixed the problem, and it wasn't Ollama's version that was the cause.

In my case, my LXC was missing:

  • a device (/dev/nvidia-uvm); I'd have to check whether all NVIDIA-related devices have been added to this LXC.
  • privileged mode (my container was not privileged).
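The two missing pieces described above can be expressed in the CT's LXC config. A hedged sketch for a privileged Proxmox CT (the file path `/etc/pve/lxc/<CTID>.conf` is Proxmox's standard location; the device major numbers shown are typical but must be verified on your host, since the nvidia-uvm major is assigned dynamically):

```shell
# /etc/pve/lxc/<CTID>.conf -- sketch for NVIDIA passthrough into a privileged CT.
# Check the actual major numbers on the host first:  ls -l /dev/nvidia*

# Allow the container to access the NVIDIA character devices
# (195 = nvidia0/nvidiactl; replace 508 with the nvidia-uvm major on your host).
lxc.cgroup2.devices.allow: c 195:* rwm
lxc.cgroup2.devices.allow: c 508:* rwm

# Bind-mount the device nodes from the host into the CT.
lxc.mount.entry: /dev/nvidia0 dev/nvidia0 none bind,optional,create=file
lxc.mount.entry: /dev/nvidiactl dev/nvidiactl none bind,optional,create=file
lxc.mount.entry: /dev/nvidia-uvm dev/nvidia-uvm none bind,optional,create=file
lxc.mount.entry: /dev/nvidia-uvm-tools dev/nvidia-uvm-tools none bind,optional,create=file
```

After editing the config, restart the CT and re-check with `nvidia-smi` inside the container; if the node for nvidia-uvm is present, CUDA initialization should no longer fail with error 999.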

Reference: github-starred/ollama#6406