[GH-ISSUE #7366] Add AirLLM or similar to allow running big models with low RAM #4682

Closed
opened 2026-04-12 15:36:25 -05:00 by GiteaMirror · 2 comments
Owner

Originally created by @danividalg on GitHub (Oct 25, 2024).
Original GitHub issue: https://github.com/ollama/ollama/issues/7366

I came across this project and it seems very interesting.
Could you please take a look at it and consider adding this or a similar feature to Ollama?
Thanks a lot 😊

https://github.com/lyogavin/airllm
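
For context, the core idea behind AirLLM is layer-by-layer inference: instead of holding the whole model in RAM, each layer's weights are loaded from disk, applied, and freed before the next layer is touched, so peak memory is roughly one layer rather than the full model. Below is a minimal, hypothetical sketch of that pattern (not AirLLM's actual API; the helper names and the simple ReLU layers are illustrative only):

```python
import os
import tempfile

import numpy as np


def save_layers(weights, dirpath):
    """Persist each layer's weight matrix to its own file on disk."""
    paths = []
    for i, w in enumerate(weights):
        path = os.path.join(dirpath, f"layer_{i}.npy")
        np.save(path, w)
        paths.append(path)
    return paths


def layered_inference(x, layer_paths):
    """Forward pass holding only one layer's weights in RAM at a time."""
    for path in layer_paths:
        w = np.load(path)           # pull just this layer off disk
        x = np.maximum(x @ w, 0.0)  # apply it (toy matmul + ReLU layer)
        del w                       # drop the weights before loading the next
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    weights = [rng.standard_normal((8, 8)) for _ in range(4)]
    with tempfile.TemporaryDirectory() as d:
        paths = save_layers(weights, d)
        out = layered_inference(np.ones(8), paths)
    print(out.shape)
```

The trade-off, of course, is latency: every token's forward pass re-reads each layer from disk, which is why this approach targets "big model, low RAM" setups rather than throughput.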

GiteaMirror added the feature request label 2026-04-12 15:36:25 -05:00
Author
Owner

@danividalg commented on GitHub (Oct 25, 2024):

I'm closing this because it seems to be the same as #6294.

Author
Owner

@danividalg commented on GitHub (Oct 25, 2024):

Duplicate of #6294.
Sorry!


Reference: github-starred/ollama#4682