Mirror of https://github.com/vinta/awesome-python.git, synced 2026-05-08 06:38:26 -05:00
[PR #2807] [CLOSED] Adds RamaLama #4168
📋 Pull Request Information
Original PR: https://github.com/vinta/awesome-python/pull/2807
Author: @ieaves
Created: 11/24/2025
Status: ❌ Closed
Base: master ← Head: master

📝 Commits (1)
9a2e171 adds ramalama

📊 Changes
1 file changed (+1 addition, -0 deletions)

Changed files:
📝 README.md (+1 -0)

📄 Description
What is this Python project?
RamaLama is an open-source CLI tool and runtime helper that simplifies running inference over AI models by leveraging container-based workflows. It treats models similarly to container images: you can pull models from various registries, run them in containers (with the correct runtime for your hardware), serve them via REST or chat interfaces, and manage them with familiar container commands.
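The workflow described above can be sketched as a short CLI session. This is a hedged illustration based on the subcommands the project documents (pull, run, serve, list); the model name is purely illustrative and may not exist in any registry:

```shell
# Sketch of the container-image-style model workflow described above.
# Assumes RamaLama is installed; "tinyllama" is an illustrative model name.

ramalama pull tinyllama          # fetch a model, much like pulling a container image
ramalama list                    # manage local models with familiar container-style commands
ramalama run tinyllama           # run inference in a container with a hardware-appropriate runtime
ramalama serve tinyllama         # expose the model via a REST/chat interface
```

The parallel with `podman pull` / `podman run` / `podman images` is deliberate on the project's part: models are handled with the same verbs as container images.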
What's the difference between this Python project and similar ones?
RamaLama differs from projects like Ollama and other local-model runners by being container-native, registry-agnostic, and runtime-agnostic. Instead of shipping a bespoke runtime, RamaLama orchestrates OCI containers that encapsulate llama.cpp, vLLM, TensorRT-LLM, or any other backend, and it auto-selects the correct image for your hardware (CPU, CUDA, ROCm, Metal, etc.). Because models are treated as OCI “transports,” it can pull from Hugging Face, ModelScope, generic registries, or Ollama itself using one interface. It also runs everything with hardened defaults (rootless, network-off, capability-dropped).
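The registry-agnostic "transports" mentioned above surface as URI-style prefixes on the model name. A hedged sketch, with illustrative model names that are not guaranteed to exist:

```shell
# One pull interface across registries via transport prefixes.
# Prefixes are from the project's documentation; model paths are illustrative.

ramalama pull huggingface://org/model-GGUF          # Hugging Face
ramalama pull modelscope://org/model                # ModelScope
ramalama pull ollama://llama3.2                     # Ollama registry
ramalama pull oci://quay.io/example/mymodel:latest  # generic OCI registry
```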
--
Anyone who agrees with this pull request can submit an Approve review.
🔄 This issue represents a GitHub Pull Request. It cannot be merged through Gitea due to API limitations.