[GH-ISSUE #6977] To configure Ollama to run multiple models simultaneously #4416

Closed
opened 2026-04-12 15:21:20 -05:00 by GiteaMirror · 2 comments

Originally created by @DavidAlpha007 on GitHub (Sep 26, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/6977

Originally assigned to: @dhiltgen on GitHub.

Can the design of Ollama support calling multiple models simultaneously? For example, can it be used in evaluation scenarios? Thanks for your support.

GiteaMirror added the question label 2026-04-12 15:21:20 -05:00
<!-- gh-comment-id:2376440358 --> @rick-github commented on GitHub (Sep 26, 2024):

https://github.com/ollama/ollama/blob/main/docs/faq.md#how-does-ollama-handle-concurrent-requests
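[Editor's note] As an illustration of the concurrency the FAQ above describes, here is a minimal sketch (not part of the original thread) of fanning one evaluation prompt out to several models through Ollama's `/api/generate` endpoint. The model names are placeholders; whether the models actually run in parallel depends on available memory and the `OLLAMA_MAX_LOADED_MODELS` / `OLLAMA_NUM_PARALLEL` server settings covered in the FAQ.

```python
import json
import urllib.request
from concurrent.futures import ThreadPoolExecutor

# Default Ollama endpoint; adjust if the server runs elsewhere.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> bytes:
    """Serialize a non-streaming /api/generate request body for one model."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()


def query_model(model: str, prompt: str) -> dict:
    """Send one generate request to Ollama and return the parsed JSON reply."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


def evaluate_all(models: list[str], prompt: str) -> dict[str, str]:
    """Fan the same prompt out to every model concurrently (one thread each)."""
    with ThreadPoolExecutor(max_workers=len(models)) as pool:
        futures = {m: pool.submit(query_model, m, prompt) for m in models}
        return {m: f.result().get("response", "") for m, f in futures.items()}


# Usage (requires a running Ollama server with both models pulled):
# answers = evaluate_all(["llama3.1", "mistral"], "Explain RAG in one sentence.")
```

Each request names its own model, so the server decides per-request which loaded model serves it; the client side only needs ordinary concurrent HTTP calls.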

<!-- gh-comment-id:2377337729 --> @dhiltgen commented on GitHub (Sep 26, 2024):

@DavidAlpha007 if the docs above don't clear it up, please clarify your question on what you're trying to accomplish.

Reference: github-starred/ollama#4416