Self-hosted AI hosting in 2026 spans everything from free Cloudflare Workers to a €5/month VPS, a €1,899 Mac Mini, and a €4,500 Mac Studio. Each fits a specific shape of workload. This hub maps which is which.
The free tier first. Cloudflare Workers covers 100K requests per day at zero recurring cost — enough for a small Moltworker-based agent serving a single user. Workers runtime constraints rule out browser automation and native binaries; everything else fits.
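To make the constraint concrete, here is a sketch of the kind of handler that fits inside the Workers runtime: pure JavaScript, no native binaries, no headless browser. The route names and response shapes are hypothetical illustrations, not Moltworker's actual API.

```javascript
// A minimal Workers-style fetch handler. Everything here runs within
// the free tier's constraints; anything needing a native binary or a
// browser would not. Routes ("/health", "/task") are placeholders.
const worker = {
  async fetch(request) {
    const url = new URL(request.url);
    if (url.pathname === "/health") {
      return new Response("ok");
    }
    if (url.pathname === "/task") {
      // Single-user agent endpoint (hypothetical): acknowledge a named task.
      const name = url.searchParams.get("name");
      if (!name) {
        return new Response("missing task name", { status: 400 });
      }
      return new Response(JSON.stringify({ accepted: true, task: name }), {
        headers: { "content-type": "application/json" },
      });
    }
    return new Response("not found", { status: 404 });
  },
};
// In a real Worker this object would be the module's default export.
```

At 100K requests per day, a handler like this serves one user's agent traffic with room to spare.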
The €5-15/month tier is where most self-hosters should sit. Hetzner's CX22 (€5, 2 vCPU, 4 GB RAM) runs Hermes Agent comfortably without the browser tool. Add the browser tool and you want the CCX13 (€38, dedicated CPU, 16 GB RAM) instead. Hostinger and Contabo offer similar tiers.
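On a 4 GB box, the main operational risk is the agent OOM-killing the host. A minimal Docker Compose sketch sized for a CX22-class VPS might look like the following; the image name, port, and limits are placeholder assumptions, not Hermes Agent's actual values.

```yaml
# Sketch of a Compose file for a 4 GB VPS. Image and port are placeholders.
services:
  agent:
    image: example/hermes-agent:latest   # placeholder, not the real image
    restart: unless-stopped
    ports:
      - "127.0.0.1:3000:3000"   # bind locally; front with Caddy or Tailscale
    mem_limit: 2g                # leave headroom for the OS on 4 GB RAM
    cpus: "1.5"                  # keep one shared vCPU partly free
```

Binding to 127.0.0.1 keeps the agent off the public interface, which pairs naturally with the Caddy and Tailscale setups covered elsewhere in this hub.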
The €30-100/month tier is where managed-hosting services compete with VPS. ClawRift, NitroClaw, DeployHermes and similar all sit here. For anyone who wants the agent without the sysadmin work, these are reasonable. For anyone who values the control of self-hosting, they're not.
Above that, you're in workstation territory. Mac Mini M4 (€1,099-1,899) and Mac Studio M3 Ultra (€4,500+) are the small-form-factor leaders for local-LLM-heavy workloads. Custom workstations with discrete GPUs go further but give up the unified-memory advantage.
Pick the tier that matches your workload, not the tier that matches your aspirations.
Guides
- Edge AI hardware buyer's guide 2026 — Raspberry Pi 5 vs Mini PC vs Mac Mini vs Framework — Honest hands-on hardware buyer's guide for self-hosted AI agents in 2026. Raspberry Pi 5, Intel NUC and clones, Mac Mini M4, Framework Laptop, Orange Pi 5 Plus — real benchmarks, real bills, concrete recommendations by budget.
- Pocket AI 2026 — the complete guide to running self-hosted AI on portable hardware — The reference guide on Pocket AI: running self-hosted AI agents and local LLMs on Raspberry Pi, Mac Mini, mini PCs, Framework laptops and edge devices. Hardware comparison, agent compatibility, real-world benchmarks, and the manifesto.
Agents on this topic
- Hermes Agent — Post-OpenClaw safe default. Docker-sandboxed by default, multi-LLM, opinionated. The agent we'd hand a colleague today.
- Moltworker — Self-hosted AI agent on Cloudflare Workers. Free at low volume. Workers runtime constraints apply.
- ZeroClaw — Privacy-first. Local LLMs only. Network egress denied at iptables. AGPL-3.0.
Hardware on this topic
- Raspberry Pi 5 — The default starting point for pocket AI in 2026. 4–8 GB of LPDDR4X, ARM Cortex-A76, sub-€100, runs Hermes Agent (no browser tool) or Nanobot comfortably.
- Intel NUC 13 / Mini PC — Mini PCs at €300–600 with i5/i7 + 16–32 GB RAM. The sweet spot for self-hosted AI agents that need browser automation and decent local model performance.
- Mac Mini M4 / M4 Pro — The single best small-form-factor host for local LLMs in 2026. Apple Silicon unified memory makes 70B-class models tractable on a desk-sized machine.
- Mac Studio M3 Ultra — 192 GB unified memory ceiling. The local-LLM workstation. Llama 3.3 70B at 22 tok/s. €4,500+.
- Lenovo ThinkCentre M75q (used) — €150–250 used Ryzen mini PC. 16 GB RAM, NVMe slot, runs Hermes Agent comfortably. The budget winner.
Providers
- Hetzner — German VPS provider. CX22 plan at €5/month is our standard test rig. The cheapest credible EU VPS in 2026.
- Cloudflare Workers — Edge serverless platform. Free tier covers serious traffic. The reference deployment target for Moltworker.
- Tailscale — Mesh VPN built on WireGuard. The standard 2026 way to keep self-hosted dashboards off the public internet.
Terms
VPS · Hetzner · Cloudflare Workers · Tailscale · Caddy · Docker Compose