[GH-ISSUE #2697] Please add KoboldCPP support #12985

Closed
opened 2026-04-19 19:47:27 -05:00 by GiteaMirror · 0 comments
Owner

Originally created by @FelipeLujan on GitHub (Jun 1, 2024).
Original GitHub issue: https://github.com/open-webui/open-webui/issues/2697

**Is your feature request related to a problem? Please describe.**
Yes. Some recent-generation AMD graphics cards are not supported by vanilla Ollama, but other implementations have added support for that hardware, such as https://github.com/YellowRoseCx/koboldcpp-rocm
In my case, a Radeon 6750 XT with 12 GB of VRAM.

**Describe the solution you'd like**
Allow Open WebUI to connect to a running OpenAI-compatible endpoint served by KoboldCPP.
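For what it's worth, a possible workaround sketch: Open WebUI already supports a custom OpenAI API base URL via the `OPENAI_API_BASE_URL` environment variable, and KoboldCPP exposes an OpenAI-compatible API under `/v1`. The host, port (KoboldCPP's default is 5001), and Docker setup below are assumptions; adjust them for your environment.

```shell
# Assumption: KoboldCPP is running locally and serving its
# OpenAI-compatible API on the default port 5001.

# Quick sanity check that the endpoint responds with a model list:
curl http://localhost:5001/v1/models

# Run Open WebUI with its OpenAI base URL pointed at KoboldCPP.
# host.docker.internal lets the container reach the host machine.
docker run -d -p 3000:8080 \
  -e OPENAI_API_BASE_URL=http://host.docker.internal:5001/v1 \
  -e OPENAI_API_KEY=none \
  ghcr.io/open-webui/open-webui:main
```

If this works, the KoboldCPP models should appear in Open WebUI's model selector like any other OpenAI-compatible backend.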

**Describe alternatives you've considered**
Forcing Ollama to run on unsupported AMD hardware, without success.


Reference: github-starred/open-webui#12985