[GH-ISSUE #9334] need QVQ-72B-Preview #68151

Closed
opened 2026-05-04 12:38:51 -05:00 by GiteaMirror · 1 comment

Originally created by @OnceCrazyer on GitHub (Feb 25, 2025).
Original GitHub issue: https://github.com/ollama/ollama/issues/9334

Does Ollama already support the multimodal model QVQ-72B-Preview?
When I use joefamous/QVQ-72B-Preview:latest, the error message is as follows:
[500] Internal Server Error - {"error":"POST predict: Post \"http://127.0.0.1:63855/completion\": EOF"}
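For context on what a request to a multimodal model looks like: Ollama's REST API exposes `POST /api/generate`, which accepts base64-encoded images for multimodal models via an `images` field in the JSON body. The sketch below only builds that request body (the model name is the one from this report; the image bytes are a placeholder, and no server call is made):

```python
import base64
import json


def build_generate_request(model: str, prompt: str, image_bytes: bytes) -> str:
    """Build the JSON body for Ollama's POST /api/generate endpoint.

    Multimodal models receive images as base64 strings in the "images" list.
    """
    payload = {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
    }
    return json.dumps(payload)


# Placeholder bytes stand in for real image data; this only shows the body shape.
body = build_generate_request(
    "joefamous/QVQ-72B-Preview:latest", "Describe this image.", b"\x00"
)
print(json.loads(body)["model"])
```

The `EOF` in the error above is the Go HTTP client reporting that the backend runner process closed the connection mid-request (typically a runner crash), not a malformed request body.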

GiteaMirror added the model label 2026-05-04 12:38:51 -05:00

@olumolu commented on GitHub (Feb 25, 2025):

https://github.com/ollama/ollama/issues/8362


Reference: github-starred/ollama#68151