[GH-ISSUE #6506] Ollama with RX6600, Openwebui and win 11 support #29856

Closed
opened 2026-04-22 09:07:37 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @kevinleijh on GitHub (Aug 25, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6506

Originally assigned to: @dhiltgen on GitHub.

I'm a big fan of running Ollama on Windows 11 with Open WebUI, as it offers a seamless and feature-rich experience. However, I'm facing a challenge with my RX 6600 graphics card, which isn't currently supported. I'd greatly appreciate it if you could add it to the list of supported cards, along with all other AMD cards. I'm excited to see the project grow and improve, and I'm confident that with your support, we can make it even better!

GiteaMirror added the feature request, amd, windows labels 2026-04-22 09:07:38 -05:00

@dhiltgen commented on GitHub (Aug 27, 2024):

Let's track this via #4464

Unfortunately, the override variable only works on Linux, so we don't currently have a solution for Windows, given that ROCm doesn't support this GPU.
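For Linux users landing on this thread, the override mentioned above can be sketched roughly as follows. This is a hedged, community-style workaround, not an official fix: the RX 6600 is a gfx1032 part, and the exact target version to spoof (gfx1030, i.e. `10.3.0`) is an assumption based on the nearest officially supported RDNA2 target.

```shell
# Hypothetical Linux-only sketch (per the comment above, this does NOT
# work on Windows). Spoof the GPU target that ROCm's prebuilt kernels
# support, then restart the Ollama server in the same environment.
export HSA_OVERRIDE_GFX_VERSION=10.3.0
# ...then restart the server, e.g.: ollama serve
```

Whether this is stable depends on the ROCm version; if the server crashes on load, falling back to CPU inference is the only option on unsupported cards.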


@kevinleijh commented on GitHub (Aug 27, 2024):

Thank you, but if you look at KoboldCpp, I can run any Q6_K-quantized GGUF with no problem and it's very fast on the RX 6600. Unfortunately, I can't run it with Open WebUI because I don't know how yet.


Reference: github-starred/ollama#29856