Self-Hosted Intelligence
Choose your model tier: Lite, Standard, or Premium. Every tier runs locally via llama.cpp, so your cluster metadata never leaves the VPC.
No SaaS. No telemetry egress. No success tax.
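To give a sense of what local serving via llama.cpp typically looks like, here is a minimal sketch; the model file, quantization, and port are placeholder assumptions, not Gerty's actual configuration:

```shell
# Serve a quantized GGUF model bound to localhost only,
# so no inference traffic leaves the machine.
# ./models/model-q4_k_m.gguf and port 8080 are illustrative placeholders.
llama-server \
  -m ./models/model-q4_k_m.gguf \
  --host 127.0.0.1 \
  --port 8080
```

llama.cpp's `llama-server` exposes an OpenAI-compatible HTTP API on the bound address, which is what makes a fully in-VPC deployment straightforward.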
COMING SOON
Gerty is in active development. The first public release is not yet available. Star the repo to get notified when it ships.