[GH-ISSUE #9147] Does ollama support multi-node pipeline inference? #5951

Closed
opened 2026-04-12 17:17:47 -05:00 by GiteaMirror · 1 comment
Owner

Originally created by @AmazeQiu on GitHub (Feb 16, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9147

I'd like to use the Deepseek-r1-671b-int4 model, but I only have several L20s spread across different machines. Does ollama support multi-node pipeline inference? I didn't find any instructions about this in the docs. Thanks!

GiteaMirror added the feature request label 2026-04-12 17:17:47 -05:00
Author
Owner

@rick-github commented on GitHub (Feb 16, 2025):

ollama doesn't currently support distributed inference. See #6729 for ongoing work.
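For readers looking for a workaround in the meantime: llama.cpp (the engine underlying ollama) ships an experimental RPC backend that can spread layers across machines. A minimal sketch, assuming llama.cpp is built with `GGML_RPC=ON` on each node and that the hostnames, port, and model path below are placeholders you would substitute for your own:

```shell
# On each worker node (e.g. 192.168.1.2 and 192.168.1.3), start an RPC
# server that exposes that node's GPUs. Port 50052 is just an example.
./rpc-server -p 50052

# On the head node, point llama-cli at the workers with --rpc; layers are
# offloaded across the listed backends (-ngl 99 offloads as many as fit).
./llama-cli -m deepseek-r1-671b-q4.gguf \
    --rpc 192.168.1.2:50052,192.168.1.3:50052 \
    -ngl 99 -p "Hello"
```

Note this is layer splitting over the network, not true pipeline parallelism, so throughput is bounded by interconnect bandwidth; see llama.cpp's `examples/rpc` documentation for current caveats.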


Reference: github-starred/ollama#5951