[GH-ISSUE #4066] Support IPEX-LLM #64562

Open
opened 2026-05-03 18:08:06 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @shawnshi on GitHub (Apr 30, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/4066

Will IPEX-LLM support be added by default?

GiteaMirror added the model label 2026-05-03 18:08:06 -05:00
Author
Owner

@oatmealm commented on GitHub (Oct 18, 2024):

Wondering what the status of this is... I was able to get it started with HD Graphics 620, though `ollama run` fails with a few of the models I've tried.

Author
Owner

@oatmealm commented on GitHub (Oct 18, 2024):

I stand corrected. Setting layers to 24 seems to load the model.

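(For readers hitting the same load failures: "set layers to 24" most likely refers to Ollama's `num_gpu` option, which caps how many model layers are offloaded to the GPU. A sketch of two ways to set it, assuming a model name of `llama3` for illustration:)

```shell
# In the interactive REPL, limit GPU-offloaded layers for the session:
ollama run llama3
# >>> /set parameter num_gpu 24

# Or bake it into a custom model via a Modelfile:
#   FROM llama3
#   PARAMETER num_gpu 24
ollama create llama3-gpu24 -f ./Modelfile
ollama run llama3-gpu24
```

Lowering `num_gpu` trades speed for memory headroom, which can help on integrated GPUs like the HD Graphics 620 where full offload exhausts available VRAM.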

Reference: github-starred/ollama#64562