[GH-ISSUE #3620] Mixtral 8x22b - v0.1 #27990

Closed
opened 2026-04-22 05:41:37 -05:00 by GiteaMirror · 1 comment

Originally created by @igorschlum on GitHub (Apr 12, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3620

### What model would you like?

BTW, Mistral released a new model: https://huggingface.co/mistral-community/Mixtral-8x22B-v0.1

The Mixtral-8x22B Large Language Model (LLM) is a pretrained generative Sparse Mixture of Experts.


@pdevine commented on GitHub (Apr 16, 2024):

You can find the 8x22B model in the [registry](https://ollama.com/library/mixtral/tags). :-D

`ollama run mixtral:8x22b`


Reference: github-starred/ollama#27990