Open Source · AGPLv3

Run open-source AI on
your own hardware.

One command to go from bare hardware to a fully working local AI API and management dashboard. No cloud required. No API keys. No data leaving your network.

curl -fsSL https://warphost.io/install | bash

How it works

Three steps from bare hardware to a working AI API. No PhD required.

1

Install

Run the one-line installer. WarpHost detects Docker, checks for NVIDIA GPUs, and sets up everything automatically.

2

Detect & Recommend

WarpHost scans your hardware — GPU model, VRAM, CPU, RAM — and recommends the best models for your setup.

3

Run

Pull a model and start serving. You get an OpenAI-compatible API and a management dashboard instantly.

Everything you need to run AI locally

OpenAI-Compatible API

Drop-in replacement for OpenAI's API. Point any client at localhost:8811/v1 and it just works.
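For instance, here is a minimal stdlib-only Python sketch of what a request to the local endpoint looks like. It assumes the standard OpenAI chat-completions request shape; the model name `llama-3.1-8b` is illustrative — use whichever model you've pulled:

```python
import json
from urllib import request

# WarpHost serves an OpenAI-compatible endpoint on localhost:8811.
BASE_URL = "http://localhost:8811/v1"

def chat_request(messages, model="llama-3.1-8b"):
    """Build an OpenAI-style chat completion request (not yet sent)."""
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},  # no API key needed
        method="POST",
    )

req = chat_request([{"role": "user", "content": "Hello!"}])
# To send it (requires WarpHost running): request.urlopen(req)
```

Because the request format matches OpenAI's, existing OpenAI client libraries work too — point their base URL at `http://localhost:8811/v1` and use any placeholder API key.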

Hardware Auto-Detection

Automatically detects your NVIDIA GPU, VRAM, and system specs. Recommends the best models for your hardware.

Management Dashboard

Clean web UI to monitor your system, manage models, and test with a built-in chat playground.

One-Click Model Management

Browse a curated catalog, pull models with one click, switch between them instantly.

Docker Native

Runs in Docker with NVIDIA Container Toolkit for GPU passthrough. Clean, isolated, easy to update.

100% Local & Private

No data leaves your network. No API keys. No cloud dependency. Your hardware, your models, your data.

Supported Models

A curated selection of the best open-source models, optimized for local hardware.

Llama 3.1 8B

8 GB

Meta's flagship open model. Great all-rounder.

Qwen 2.5 (3B-32B)

4-24 GB

Alibaba's model family. Excellent at coding and multilingual tasks.

Mistral 7B

8 GB

Efficient with a massive 32K context window.

DeepSeek R1 Distill

24 GB

Exceptional reasoning. Chain-of-thought built in.

Phi-3 Mini

4 GB

Microsoft's small powerhouse. Runs on anything.

Llama 3.3 70B

48 GB

Top-tier quality for those with the hardware.

Ready to get started?

WarpHost is free, open source, and ready to run on your hardware today.