[GH-ISSUE #7627] support multiple lora adapters #30628

Closed
opened 2026-04-22 10:27:51 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @lyingbug on GitHub (Nov 12, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7627

llama.cpp supports multiple adapters, see https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md

Why does Ollama support only one adapter?
https://github.com/ollama/ollama/blob/65973ceb6417c2e2796fa59bd3225bc7bd79b403/llm/server.go#L203-L206
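For context, the linked `server.go` lines pass only a single adapter to the llama.cpp runner, while llama.cpp's server accepts the `--lora` flag more than once (one per adapter file). A minimal sketch of what building the arguments for several adapters might look like — `buildLoraArgs` is a hypothetical helper, not Ollama's actual code:

```go
package main

import "fmt"

// buildLoraArgs is a hypothetical helper: instead of forwarding only
// adapters[0], it emits one "--lora <path>" pair per adapter, matching
// how llama.cpp's server accepts repeated --lora flags.
func buildLoraArgs(adapters []string) []string {
	var params []string
	for _, adapter := range adapters {
		params = append(params, "--lora", adapter)
	}
	return params
}

func main() {
	// Example: two adapter files (names are illustrative only).
	fmt.Println(buildLoraArgs([]string{"style.gguf", "domain.gguf"}))
}
```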

GiteaMirror added the feature request label 2026-04-22 10:27:51 -05:00

@ItzCrazyKns commented on GitHub (Nov 14, 2024):

Should be closed by #7667


Reference: github-starred/ollama#30628