Air-gapped deployment (Ollama)

For regulated environments, Autopilot can run against local models served by Ollama. This keeps all reasoning on-prem while preserving the same approval and evidence-pack workflow.

Approach

Tip: start with a smaller local model to keep resource costs down, then move to a larger model once your workflow is stable.

Install without Tailscale

sudo ./install/install.sh --skip-tailscale

Then set your local provider details in the OpenClaw config.
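As a rough illustration, the provider settings might look like the fragment below. The key names (provider, baseUrl, model) and the file layout are assumptions for the sketch, not OpenClaw's documented schema; check the config reference for your installed version. The URL shown is Ollama's default local API endpoint.

```json
{
  "provider": "ollama",
  "baseUrl": "http://localhost:11434",
  "model": "llama3.1"
}
```

Before pointing Autopilot at it, you can confirm the Ollama daemon is reachable with `curl http://localhost:11434/api/tags`, which lists the models you have pulled locally.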

Still safe