Self-hosted, OpenAI-compatible LLM inference with a setup wizard. Deploy any HuggingFace model in minutes.
About
Self-hosted, OpenAI-compatible LLM inference with a guided setup wizard. Deploy any model from HuggingFace and expose it on your network in minutes, with no command-line tuning required.
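To illustrate what "OpenAI-compatible" means in practice, here is a minimal sketch of the Chat Completions request body such a server accepts. The base URL, port, and model ID below are hypothetical and depend on your wizard choices; any OpenAI SDK or plain HTTP client pointed at the server's `/v1` base URL should work the same way.

```python
import json

# Hypothetical endpoint; the actual host and port are chosen during setup.
BASE_URL = "http://localhost:8000/v1"

def chat_request(model: str, user_message: str) -> str:
    """Build an OpenAI-style chat-completions request body as JSON.

    POST this to f"{BASE_URL}/chat/completions" with
    Content-Type: application/json to query the local server.
    """
    payload = {
        "model": model,  # the HuggingFace model ID being served
        "messages": [
            {"role": "user", "content": user_message},
        ],
        "temperature": 0.7,
    }
    return json.dumps(payload)

# Example body for a locally served model (model name is illustrative):
body = chat_request("mistralai/Mistral-7B-Instruct-v0.2", "Hello!")
print(body)
```

Because the wire format matches OpenAI's, existing tooling (the official `openai` client, LangChain, curl scripts) can target the self-hosted server simply by overriding its base URL.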