Quickstart
Deploy Autopilot on Linux or with Docker. By the end you'll have a running runtime service exposing Prometheus metrics.
Prerequisites
- Wazuh Manager installed and running (4.8.0+)
- Wazuh MCP Server for Wazuh API access
- OpenClaw agent framework
- Node.js 18+ (runtime service)
- LLM API key (Claude / GPT / Groq / Mistral / Gemini), or Ollama for air-gapped deployments
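As a quick sanity check before installing, the Node.js 18+ requirement above can be verified with a short snippet. This is a sketch: the `node_major` helper is illustrative and not part of the Autopilot repo.

```shell
# Sketch: verify the Node.js 18+ prerequisite.
# node_major is a hypothetical helper, not something the repo ships.
node_major() {
  v=${1#v}          # strip the leading "v" (node --version prints e.g. v18.19.0)
  echo "${v%%.*}"   # keep only the major component
}

if command -v node >/dev/null 2>&1; then
  major=$(node_major "$(node --version)")
  if [ "$major" -ge 18 ]; then
    echo "Node.js OK (major version $major)"
  else
    echo "Node.js $major is too old; 18+ required" >&2
  fi
else
  echo "Node.js not found" >&2
fi
```

The same pattern extends to any other versioned prerequisite you want to gate on.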
Install (recommended)
Run the hardened installer. It sets up Tailscale (optional), installs the MCP Server, configures OpenClaw, deploys agents, and can also configure Slack.
```shell
git clone https://github.com/gensecaihq/Wazuh-Openclaw-Autopilot.git
cd Wazuh-Openclaw-Autopilot
sudo ./install/install.sh
```
For bootstrap or environments without Tailscale:

```shell
sudo ./install/install.sh --skip-tailscale
```

Configure
Edit the runtime environment file (path may vary by deployment method):
```shell
sudo nano /etc/wazuh-autopilot/.env
```

```
# Wazuh connection
WAZUH_HOST=localhost
WAZUH_PORT=55000
WAZUH_USER=wazuh-wui
WAZUH_PASS=your-password

# At least one LLM provider
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=sk-...

# Optional: Slack
SLACK_APP_TOKEN=xapp-...
SLACK_BOT_TOKEN=xoxb-...
```
Tip: Start with one provider, then add fallbacks once the workflow is stable.
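Before starting the service, you can confirm the file actually defines at least one provider key. A sketch, reusing the variable names from the example above; extend the pattern if you add other providers:

```shell
# Sketch: check that an env file defines at least one LLM provider key.
# Variable names match the .env example above; this helper is illustrative.
has_llm_key() {
  grep -Eq '^(ANTHROPIC_API_KEY|OPENAI_API_KEY)=..+' "$1"
}

if has_llm_key /etc/wazuh-autopilot/.env; then
  echo "LLM provider configured"
else
  echo "no LLM provider key found" >&2
fi
```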
Docker deployment
```shell
docker-compose up -d
```
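If you are composing your own stack rather than using the repo's file, a minimal `docker-compose.yml` for the runtime service alone might look like the following. This is a sketch only: the service name, build path, and env-file location are assumptions, not the repo's actual file.

```yaml
# Sketch only: service name, build path, and env_file are illustrative.
services:
  autopilot:
    build: ./runtime/autopilot-service
    env_file: .env
    ports:
      - "127.0.0.1:9090:9090"   # health/metrics, bound to localhost
    restart: unless-stopped
```

Binding the port to 127.0.0.1 mirrors the `docker run` example below, keeping the metrics endpoint off public interfaces.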
Or build/run the runtime service container:
```shell
cd runtime/autopilot-service
docker build -t wazuh-autopilot .
docker run -d -p 127.0.0.1:9090:9090 --env-file .env wazuh-autopilot
```
Verify installation
```shell
./scripts/health-check.sh
curl http://localhost:9090/health
curl http://localhost:9090/metrics
```
If /metrics responds, Prometheus can scrape it and you can start measuring impact.
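Once /metrics responds, the endpoint can be registered as a scrape target. A minimal sketch; the job name and your `prometheus.yml` layout are assumptions to adapt:

```yaml
# Sketch: a minimal scrape job for the Autopilot metrics endpoint.
# Merge under the scrape_configs: key of your existing prometheus.yml.
scrape_configs:
  - job_name: wazuh-autopilot
    static_configs:
      - targets: ["localhost:9090"]
```

Prometheus scrapes `/metrics` on each target by default, so no `metrics_path` override is needed here.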