[GH-ISSUE #3707] What is the difference between these two models? I think this is a bug #28043

Closed
opened 2026-04-22 05:46:49 -05:00 by GiteaMirror · 2 comments

Originally created by @olumolu on GitHub (Apr 17, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/3707

What are you trying to do?

![Screenshot_20240418-000543~2](https://github.com/ollama/ollama/assets/162728301/e2a47c3b-800c-4688-a6d2-ec0cfa705af7)
![Screenshot_20240418-000553~2](https://github.com/ollama/ollama/assets/162728301/a035bd3c-9494-4a8c-b167-eda5aad361fa)
One 8x7b is 4.1 GB and the other is 26 GB. What is the difference? Both appear to be the 8x7b Mixtral model. Can anyone fix this, if it is a bug?

How should we solve this?

No response

What is the impact of not solving this?

No response

Anything else?

No response


@mchiang0610 commented on GitHub (Apr 17, 2024):

Hey @userforsource, sorry for the confusion. One is the Mistral model by MistralAI in 7-billion-parameter form, and this is 4.1 GB.

The other model you have is Mixtral, by MistralAI in 8x7B parameters as an MoE (mixture of experts) model.

They are two different models. Sorry for the confusion!
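A quick way to see where the size gap comes from is `ollama show`, which on recent Ollama versions prints a model's architecture, parameter count, and quantization. The byte-per-parameter figure below is a rough assumption for 4-bit (Q4_0-style) quantization, not an exact number:

```shell
# Inspect each model's metadata: architecture, parameter count, quantization.
ollama show mistral    # ~7B parameters, 4-bit quantized   -> ~4.1 GB download
ollama show mixtral    # 8x7B MoE, ~47B total parameters   -> ~26 GB download

# Rough back-of-the-envelope, assuming ~4.5 bits (~0.56 bytes) per parameter:
#   7e9  params * 0.56 bytes/param ≈ 3.9 GB  (plus overhead -> ~4.1 GB)
#   47e9 params * 0.56 bytes/param ≈ 26 GB
```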


@olumolu commented on GitHub (Apr 17, 2024):

Thanks for the clarification.
Don't be sorry, but I think that should be written out: instead of both reading as "mixtral", one should be labeled "mixtral moe" and the other "mixtral".

Reference: github-starred/ollama#28043