Please add KoboldCPP support #1089

Closed
opened 2025-11-11 14:37:02 -06:00 by GiteaMirror · 0 comments
Owner

Originally created by @FelipeLujan on GitHub (Jun 1, 2024).

Is your feature request related to a problem? Please describe.
Yes. Some recent AMD graphics cards are not supported by vanilla Ollama, while other implementations, such as https://github.com/YellowRoseCx/koboldcpp-rocm, have added support for that hardware.
In my case, a Radeon 6750 XT with 12 GB of VRAM.

Describe the solution you'd like
Be able to connect Open WebUI to a running OpenAI-compatible endpoint served by KoboldCPP.
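For context, the wiring being requested is a standard OpenAI-style HTTP call. Below is a minimal sketch of what talking to KoboldCPP's OpenAI-compatible API looks like, assuming KoboldCPP's default port 5001 and its `/v1` routes; the base URL and the placeholder model name are assumptions that depend on your setup:

```python
# Minimal sketch: calling a KoboldCPP OpenAI-compatible endpoint.
# Assumes KoboldCPP is serving on its default port (5001); the "model"
# field is a placeholder, since the server answers with whichever model
# it was launched with.
import json
import urllib.request


def build_chat_payload(prompt: str) -> dict:
    """Build an OpenAI-style chat completion request body."""
    return {
        "model": "koboldcpp",  # placeholder name (assumption)
        "messages": [{"role": "user", "content": prompt}],
    }


def chat(base_url: str, prompt: str) -> str:
    """POST to /v1/chat/completions and return the reply text."""
    req = urllib.request.Request(
        base_url.rstrip("/") + "/v1/chat/completions",
        data=json.dumps(build_chat_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Usage would be something like `chat("http://localhost:5001", "Hello!")`, with the same base URL you would enter as an OpenAI API connection in Open WebUI's settings.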

Describe alternatives you've considered
Forcing Ollama to run on unsupported AMD hardware, without success.


Reference: github-starred/open-webui#1089