[GH-ISSUE #10189] 双服务器显卡共用 #68741

Closed
opened 2026-05-04 15:03:10 -05:00 by GiteaMirror · 1 comment

Originally created by @maotia on GitHub (Apr 9, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/10189

I'm deploying a large model and want a single model to use the GPUs of two servers at the same time. Is there any way to do this?

GiteaMirror added the feature request label 2026-05-04 15:03:10 -05:00

@rick-github commented on GitHub (Apr 9, 2025):

Distributed computing is not supported, see #6729 for an effort using the llama.cpp backend.
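For reference, the llama.cpp effort mentioned above uses its RPC backend, which lets one inference process offload layers to GPU workers on other hosts. A rough sketch of that setup (hostnames, ports, and the model path below are placeholders, not anything from this issue; check the llama.cpp docs for the current flags):

```shell
# On each GPU server: build llama.cpp with RPC support and start a worker.
cmake -B build -DGGML_RPC=ON && cmake --build build --config Release
./build/bin/rpc-server -p 50052

# On the client: point inference at both workers and offload layers to them.
./build/bin/llama-cli -m model.gguf \
    --rpc server1:50052,server2:50052 \
    -ngl 99 -p "Hello"
```

Note this is llama.cpp directly, not ollama; ollama itself has no equivalent multi-server option.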


Reference: github-starred/ollama#68741