[GH-ISSUE #10715] iMac not using GPU #7041

Closed
opened 2026-04-12 18:57:02 -05:00 by GiteaMirror · 11 comments
Owner

Originally created by @russell-kitchen on GitHub (May 15, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10715

What is the issue?

Platform:
Mac OSX
Retina 5K, 27-inch, 2019
Sonoma 14.7.2
Radeon Pro Vega 48 8 GB

It doesn't appear that Ollama is using the GPU at all. When running `ollama ps`, it says 100% CPU.

I'd posted an incorrect issue a week or so back without properly getting the right information. It appears my Mac's GPU does support Metal (as noted below, from the About This Mac section on my computer), which theoretically means it should work. Any ideas?
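For reference, the processor placement mentioned above comes from the PROCESSOR column of `ollama ps`; a minimal sketch (the model name is illustrative, and exact column layout can vary by Ollama version):

```shell
# Load any model so it shows up in the process list, then inspect placement.
# "100% CPU" means no GPU offload; "100% GPU" means full Metal offload.
ollama run llama3.2 "hello" >/dev/null
ollama ps
```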

Chipset Model: Radeon Pro Vega 48
Type: GPU
Bus: PCIe
PCIe Lane Width: x16
VRAM (Total): 8 GB
Vendor: AMD (0x1002)
Device ID: 0x6869
Revision ID: 0x0000
ROM Revision: 113-D0650E-072
VBIOS Version: 113-D05001A1XG-011
Option ROM Version: 113-D05001A1XG-011
EFI Driver Version: 01.01.072
Metal Support: Metal 3

Thank you!

Relevant log output


OS

macOS

GPU

AMD

CPU

Intel

Ollama version

0.6.8

GiteaMirror added the bug, macos labels 2026-04-12 18:57:02 -05:00
@rick-github commented on GitHub (May 15, 2025):

[Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) may aid in debugging.
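On macOS, the linked troubleshooting doc keeps the server log under `~/.ollama/logs`; a quick sketch of pulling the recent entries:

```shell
# Grab the tail of the current Ollama server log on macOS
# (path per the troubleshooting doc linked above).
tail -n 100 ~/.ollama/logs/server.log
```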
@russell-kitchen commented on GitHub (May 15, 2025):

> [Server logs](https://github.com/ollama/ollama/blob/main/docs/troubleshooting.md#how-to-troubleshoot-issues) may aid in debugging.

Fab, thank you for the pointer!

My old one:
[server.log](https://github.com/user-attachments/files/20233415/server.log)

A fresh one with two chat prompts run at 100% CPU:
[server.log](https://github.com/user-attachments/files/20233497/server.log)
@rick-github commented on GitHub (May 15, 2025):

```
time=2025-05-15T20:26:23.964+01:00 level=INFO source=types.go:130 msg="inference compute" id="" library=cpu variant="" compute="" driver=0.0 name="" total="16.0 GiB" available="5.0 GiB"
```

The GPU is not detected. If you add `OLLAMA_DEBUG=1` to the server environment, there will be more details about device detection.
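A sketch of enabling that debug flag for the macOS app (the Ollama FAQ documents `launchctl setenv` for app environment variables; the grep pattern below is just one way to surface the device-detection lines):

```shell
# Set the debug flag for GUI apps on macOS, then quit and relaunch
# the Ollama app and run a prompt so it re-detects devices.
launchctl setenv OLLAMA_DEBUG 1

# Afterwards, look for detection-related lines in the fresh log.
grep -i 'inference compute\|gpu' ~/.ollama/logs/server.log
```

Running the server directly in a terminal with `OLLAMA_DEBUG=1 ollama serve` achieves the same thing without touching `launchctl`.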
@russell-kitchen commented on GitHub (May 16, 2025):

Aces, thank you! With debug on, I believe:

[server.log](https://github.com/user-attachments/files/20238814/server.log)
@rick-github commented on GitHub (May 16, 2025):

I went back to your previous issue and realized I overlooked something: you are using docker. Metal acceleration is [not supported](https://chariotsolutions.com/blog/post/apple-silicon-gpus-docker-and-ollama-pick-two/) in docker. See also #5652. To use Metal, you need to install ollama in the base system.
@russell-kitchen commented on GitHub (May 16, 2025):

Sorry, that was a typo with the `docker ps`. I use docker for work, so I had a brain fart there, which I amended in this post. Not using docker, just the command-line interface.
@russell-kitchen commented on GitHub (May 16, 2025):

Also, I installed ollama through the site and not Homebrew. I saw that Homebrew was causing other people issues with GPUs not being registered, so I installed it directly through the main site's download at:
https://ollama.com/download/mac
@rick-github commented on GitHub (May 16, 2025):

Hmm, in that case I don't know what's going on. ollama is completely overlooking the GPU. I'm not familiar with the Apple ecosystem, but [this page](https://support.apple.com/en-gb/102894) seems to indicate that Metal 3 is supported in Sonoma and [this document](https://developer.apple.com/metal/Metal-Feature-Set-Tables.pdf) lists AMD Vega as a supported chipset in Metal 3. Does macOS have additional restrictions on devices? For example, to use GPU devices on Linux, the ollama user is added to the render and video groups.
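For what it's worth, the About This Mac details quoted in the issue body can also be pulled from the CLI; `system_profiler` is a standard macOS tool, and the `Metal Support` line should read `Metal 3` for this Vega 48:

```shell
# Show what macOS itself reports for the GPU and its Metal support level.
system_profiler SPDisplaysDataType | grep -E 'Chipset Model|Metal'
```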
@russell-kitchen commented on GitHub (May 16, 2025):

I don't believe so, no, with regards to GPU stuff.
@rick-github commented on GitHub (May 16, 2025):

https://github.com/ollama/ollama/issues/1016
@russell-kitchen commented on GitHub (May 17, 2025):

Ah. If I'm reading that thread right, it looks like it's not something that'll get fixed for a while, and even if it does, it appears the GPU lending a hand in these cases often doesn't help much, or even hampers.

Reference: github-starred/ollama#7041