[GH-ISSUE #2392] unable to initialize llm library Radeon card detected #1391

Closed
opened 2026-04-12 11:13:18 -05:00 by GiteaMirror · 6 comments

Originally created by @sigmaya on GitHub (Feb 7, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2392

Hello,
I am trying to run Ollama as a regular user, and when I start it manually I get this error:
time=2024-02-07T19:00:18.967+01:00 level=INFO source=payload_common.go:106 msg="Extracting dynamic libraries..."
Error: unable to initialize llm library Radeon card detected, but permissions not set up properly. Either run ollama as root, or add you user account to the render group.

I had a FirePro W7100, but a few days ago I removed it and am now using an NVIDIA 3060. I am on Ubuntu 20 and have no idea how to tell Ollama that the GPU is NVIDIA.
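The error points at `render`-group permissions on the GPU device nodes. A minimal sketch of the check it is complaining about (the `/dev/dri` path and the `render` group name are assumptions based on typical Linux setups, not stated in this thread):

```shell
# Inspect the DRI render nodes that the ROCm code path probes.
# On most distros these nodes are group-owned by "render".
ls -l /dev/dri/ 2>/dev/null || echo "no /dev/dri present"

# Check whether the current user is already in the render group.
if id -nG "$(id -un)" | grep -qw render; then
  echo "user is in the render group"
else
  # Remedy suggested by the error message; requires re-login to take effect.
  echo "not in render group; fix with: sudo usermod -aG render $(id -un)"
fi
```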


@sigmaya commented on GitHub (Feb 7, 2024):

I had to remove the amdgpu kernel module.
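For reference, a sketch of one common way to check for and remove a leftover amdgpu module (the module name and `modprobe.d` blacklist convention are standard Linux practice; the thread itself does not say which method was used):

```shell
# Check whether the amdgpu kernel module is still loaded.
if lsmod 2>/dev/null | grep -q '^amdgpu'; then
  echo "amdgpu is loaded"
else
  echo "amdgpu is not loaded"
fi

# To unload it for the current boot (fails if the module is in use):
#   sudo modprobe -r amdgpu
# To keep it from loading on future boots:
#   echo "blacklist amdgpu" | sudo tee /etc/modprobe.d/blacklist-amdgpu.conf
```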


@swoh816 commented on GitHub (Feb 26, 2024):

@sigmaya I have the same issue as the one you had. May I ask how you removed the amdgpu module?


@unclemcz commented on GitHub (Feb 28, 2024):

@swoh816 you can try:

  1. sudo systemctl stop ollama
  2. sudo ollama serve
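The two steps above, plus the non-root alternative that the error message itself suggests, as a sketch (the `usermod` command is an assumption drawn from the error text, not something @unclemcz posted):

```shell
# Step 1: stop the systemd-managed instance so it releases the port.
#   sudo systemctl stop ollama
# Step 2: run the server in the foreground as root so it can open the
# GPU device nodes without render-group membership:
#   sudo ollama serve
#
# Non-root alternative: add the user to the render group instead,
# re-login, and restart the service.
#   sudo usermod -aG render "$(id -un)"
#   sudo systemctl start ollama
echo "either run as root, or add the user to the render group"
```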

@swoh816 commented on GitHub (Feb 28, 2024):

@unclemcz Thanks, you're right, I used `sudo` and it doesn't raise an issue with amdgpu anymore.

Btw, may I ask why that is the case? Ollama used to work without root permissions until recently, so it puzzles me that it suddenly doesn't work without them.

Also, is Ollama incompatible with amdgpu, and is that why it cannot initialize with a Radeon card?


@unclemcz commented on GitHub (Feb 29, 2024):

@swoh816 actually, I don't know either; I just followed the error info and kept trying again and again. o(╥﹏╥)o


@swoh816 commented on GitHub (Feb 29, 2024):

@unclemcz You actually helped me get it working, thanks :) I was trying to get Obsidian Copilot to work, and you gave me a huge hint toward a solution: https://github.com/logancyang/obsidian-copilot/issues/269#issuecomment-1969011057
