[GH-ISSUE #12815] AMD MI50 can't load qwen3-vl into GPU VRAM on docker ollama/ollama:0.12.7-rc0-rocm #70551

Open
opened 2026-05-04 21:56:53 -05:00 by GiteaMirror · 6 comments

Originally created by @tinyadam on GitHub (Oct 29, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/12815

Originally assigned to: @dhiltgen on GitHub.

What is the issue?

The MI50 only loads models successfully in CPU mode on this version. Please fix this.

Relevant log output


OS

No response

GPU

No response

CPU

No response

Ollama version

No response

GiteaMirror added the amd, bug, docker labels 2026-05-04 21:56:54 -05:00

@rick-github commented on GitHub (Oct 29, 2025):

As of 0.12.5 MI50 is not supported by ROCm. The soon-to-be-released Vulkan support will work with this GPU. If you would like to try it out, clone the repo and build the project. Otherwise stick with 0.12.4 and wait for the Vulkan support.


@tinyadam commented on GitHub (Oct 29, 2025):

> As of 0.12.5 MI50 is not supported by ROCm. The soon-to-be-released Vulkan support will work with this GPU. If you would like to try it out, clone the repo and build the project. Otherwise stick with 0.12.4 and wait for the Vulkan support.

How will I know when Vulkan is supported in a future release?


@rick-github commented on GitHub (Oct 29, 2025):

It will be in the release notes.
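
Until then, one quick way to check which version a running server is on is the version endpoint of the HTTP API (a minimal sketch, assuming the server is listening on the default port 11434); compare the reported version against the release notes:

```console
$ curl http://localhost:11434/api/version
```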


@tinyadam commented on GitHub (Oct 29, 2025):

Very well. It would be best to publish a Docker image on Docker Hub.


@rick-github commented on GitHub (Oct 29, 2025):

If you are using docker, it is very simple to build an image that you can use:

```console
$ git clone https://github.com/ollama/ollama.git
$ docker build -f ollama/Dockerfile -t ollama/ollama:vulkan ollama
```
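
The resulting image can then be run much like the official ROCm image. This is a rough sketch, not an official command: the device flags follow the standard AMD/ROCm container instructions, and the exact devices needed for Vulkan may differ.

```console
$ docker run -d --device /dev/kfd --device /dev/dri \
    -v ollama:/root/.ollama -p 11434:11434 \
    --name ollama ollama/ollama:vulkan
```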

@dhiltgen commented on GitHub (Nov 14, 2025):

In 0.12.11 Vulkan is now included in the official binaries, but still experimental. To enable, set OLLAMA_VULKAN=1 for the server. https://github.com/ollama/ollama/blob/main/docs/docker.mdx#vulkan-support

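
Put together for Docker, a hedged sketch (the device flags mirror the standard ROCm run command and the image tag is an assumption; see the linked docker.mdx for the authoritative flags):

```console
$ docker run -d --device /dev/kfd --device /dev/dri \
    -e OLLAMA_VULKAN=1 \
    -v ollama:/root/.ollama -p 11434:11434 \
    --name ollama ollama/ollama:rocm
```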
Reference: github-starred/ollama#70551