[GH-ISSUE #7715] Ollama 0.4 not using VRAM on AMD RX 7900 XTX #4925

Closed
opened 2026-04-12 15:59:00 -05:00 by GiteaMirror · 4 comments
Owner

Originally created by @galizhur on GitHub (Nov 18, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7715

What is the issue?

Any model that I load seems to use system RAM instead of VRAM. This didn't happen on version 0.3 of Ollama. Something is clearly wrong, because Ollama freezes after a couple of requests and the only way to make it work again is to restart it every minute.
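The symptom can be confirmed from the PROCESSOR column of `ollama ps`. A minimal sketch (the `check_offload` helper is hypothetical; it only interprets that column's text):

```shell
# Interpret the PROCESSOR column of `ollama ps` output.
# "100% GPU" means the model is fully resident in VRAM; any CPU share
# means layers fell back to system RAM (the symptom reported here).
check_offload() {
  case "$1" in
    "100% GPU") echo "fully in VRAM" ;;
    "100% CPU") echo "entirely in system RAM" ;;
    *)          echo "split between VRAM and system RAM" ;;
  esac
}

check_offload "100% CPU"   # prints "entirely in system RAM"
```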

Here is the server log: [server.log](https://github.com/user-attachments/files/17793658/server.log)

Also attached below are two screenshots: one with a model loaded and one with Ollama shut down.

Model unloaded
![Screenshot 2024-11-18 042227](https://github.com/user-attachments/assets/6f9d8901-c9fd-4537-ab9c-02c03d348899)

Model loaded
![Screenshot 2024-11-18 042318](https://github.com/user-attachments/assets/2d76d52e-8187-47e5-a932-48eace396f8f)

OS

Windows

GPU

AMD

CPU

AMD

Ollama version

0.4.2

GiteaMirror added the bug, amd labels 2026-04-12 15:59:00 -05:00
Author
Owner

@MrMiladinovic commented on GitHub (Nov 18, 2024):

Same issue here. It was working earlier today, before updating the AMD drivers to v24.10.1.

~~Rolling back to 24.9.1 does not seem to fix it.~~

Thanks @dhiltgen: I can confirm that, as in #7107, rolling back further to 24.8.1 solved it.

Author
Owner

@dhiltgen commented on GitHub (Nov 18, 2024):

@MrMiladinovic can you clarify which version you rolled back to? I believe you'll need to go back to 24.8.1 or older to resolve the problem until a newer driver is released.

We're tracking the driver bug with #7107

Author
Owner

@Torckane commented on GitHub (Nov 29, 2024):

Same problem for me on a 7900 XTX. One day I updated Ollama, and it stopped working.
I have tried many things: reinstalling Ollama, uninstalling and reinstalling ROCm 6.1.2 and 5.7.1, and every combination of the two.
Nothing helps; it only uses the CPU.
With https://github.com/likelovewant/ollama-for-amd
I got it to run on the GPU now... but it still uses system RAM, not VRAM.

PS C:\Users\Evangelion> ollama ps
NAME                  ID            SIZE   PROCESSOR  UNTIL
mistral-small:latest  d095cd553b04  15 GB  100% GPU   4 minutes from now

Which v24.10.1 are you talking about?

Windows 10, 7900 XTX, Ollama 0.4.2

OK, thank you, now I understand you are talking about AMD Adrenalin. ^^
Problem resolved, it works now. ^^
Here is the link, to help the next people with the same problem:
https://www.amd.com/en/resources/support-articles/release-notes/RN-RAD-WIN-24-8-1.html#Contents

Author
Owner

@MrMiladinovic commented on GitHub (Dec 4, 2024):

@Torckane, we are referring to the AMD Adrenalin drivers. No issues with Ollama on Adrenalin 24.8.1 (a slightly older driver):
[Windows 11 v24.8.1](https://drivers.amd.com/drivers/whql-amd-software-adrenalin-edition-24.8.1-win10-win11-aug-rdna.exe)
[Windows 10 v24.8.1](https://drivers.amd.com/drivers/whql-amd-software-adrenalin-edition-24.8.1-win10-win11-aug-rdna.exe)
Hope this helps.

Reference: github-starred/ollama#4925