[GH-ISSUE #2446] Ollama server stuck using Mixtral on M3 #47941

Closed
opened 2026-04-28 05:59:24 -05:00 by GiteaMirror · 6 comments
Owner

Originally created by @galleon on GitHub (Feb 11, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/2446

Ollama stopped serving my requests after %hours. Part of the log is [here](https://gist.github.com/galleon/d538c6d7df7f276bf93861422eb71605).

The prompt is large but quite the same every time.
Quick and dirty code if you want to reproduce it is [there](https://gist.github.com/galleon/9c7e4f42e58e4ab686c461b514f60080).

Let me know if you need more information.
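[Editor's note] The linked reproduction gist is external, so here is a hypothetical minimal sketch of what it appears to do per the description: loop a large, mostly identical prompt against the local Ollama HTTP API until the server stops answering. The model name, prompt text, and helper names below are illustrative assumptions, not taken from the gist.

```python
# Hypothetical reproduction sketch (assumption: the original gist loops a
# large, mostly identical prompt against the local Ollama HTTP API).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint


def build_payload(prompt: str, model: str = "mixtral") -> bytes:
    """Serialize a non-streaming /api/generate request body."""
    return json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()


def ask(prompt: str, timeout: float = 300.0) -> str:
    """Send one generation request; a request that never returns is the stall."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())["response"]


def reproduce(iterations: int = 10_000) -> None:
    """Hammer the server with the same large prompt, as the issue describes."""
    prompt = "Summarize the following text:\n" + "lorem ipsum " * 2000
    for i in range(iterations):
        print(i, len(ask(prompt)))
```

Running `reproduce()` against a live `ollama serve` would, if the bug is present, eventually block on a request that never completes rather than raising an error.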

@igorschlum commented on GitHub (Feb 11, 2024):

Hi @galleon,
How much memory do you have? Did you try models other than Mixtral? I have an M2 with 192GB and will try to reproduce the issue.
Thank you for the shared code.


@igorschlum commented on GitHub (Feb 11, 2024):

The script is running; I will have to wait six hours or more to see if it crashes. I will let you know.


@galleon commented on GitHub (Feb 11, 2024):

Hi @igorschlum, thanks for your help. My Mac has the maximum memory possible, i.e. 128GB.

The program will not crash, it will just stop. Ah ah ah, and if it does not … I am interested in the outcome :-)


@galleon commented on GitHub (Feb 11, 2024):

Also wondering if it is possible to get a more verbose log.

<!-- gh-comment-id:1937662521 --> @galleon commented on GitHub (Feb 11, 2024): Also wondering if it is possible to have a log more verbose
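[Editor's note] The thread never answers this, but Ollama does support debug-level server logging via the `OLLAMA_DEBUG` environment variable (an aside from the editor, not confirmed by the participants here):

```shell
# Enable debug-level server logs before starting Ollama
# (OLLAMA_DEBUG=1 switches the server's log level to debug).
OLLAMA_DEBUG=1 ollama serve
```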

@stevengans commented on GitHub (Feb 12, 2024):

This is a duplicate of: https://github.com/ollama/ollama/issues/2339


@galleon commented on GitHub (Feb 12, 2024):

Closing, as it seems to have been resolved. I will test ASAP.

Reference: github-starred/ollama#47941