Hardware
Everything we've published on hardware across guides, agents, hardware reviews and glossary entries — 12 entries in total.
Guides (3)
- Pocket AI 2026 — the complete guide to running self-hosted AI on portable hardware · AI Agents · 2026-04-29
The reference guide on Pocket AI: running self-hosted AI agents and local LLMs on Raspberry Pi, Mac Mini, mini PCs, Framework laptops and edge devices. Hardware comparison, agent compatibility, real-world benchmarks, and the manifesto.
- Edge AI hardware buyer's guide 2026 — Raspberry Pi 5 vs Mini PC vs Mac Mini vs Framework · AI Agents · 2026-04-30
Honest hands-on hardware buyer's guide for self-hosted AI agents in 2026. Raspberry Pi 5, Intel NUC and clones, Mac Mini M4, Framework Laptop, Orange Pi 5 Plus — real benchmarks, real bills, concrete recommendations by budget.
- GPU vs CPU for self-hosted AI inference — when each genuinely wins in 2026 · AI Agents · 2026-05-26
When does a GPU actually pay for itself in self-hosted AI inference, and when is a modern CPU genuinely the better answer? Real benchmarks across Mac Mini M4, Intel NUC 13, Raspberry Pi 5, and a single-GPU box. Watts per token, euros per million tokens, and the surprising places CPU wins.
Agents (2)
- ZeroClaw
Privacy-first. Local LLMs only. All network egress denied via iptables. AGPL-3.0.
- ZeroClaw Lite
Stripped-down ZeroClaw fork for resource-constrained hosts. Phi-3 mini default, runs comfortably on a Pi 5.
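ZeroClaw's description mentions denying network egress via iptables. Its actual ruleset isn't reproduced here, but a minimal sketch of the technique — a default-deny OUTPUT policy with a loopback exception so local inference stays reachable — might look like this (the specific rules and the example port are illustrative assumptions, not ZeroClaw's configuration):

```shell
# Illustrative default-deny egress policy (requires root; NOT ZeroClaw's actual rules).

# Allow loopback traffic so a local LLM server (e.g. one listening on
# 127.0.0.1:11434) remains reachable from the agent.
iptables -A OUTPUT -o lo -j ACCEPT

# Allow replies on connections established from outside (e.g. inbound SSH),
# without permitting the host to initiate new outbound connections.
iptables -A OUTPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT

# Drop everything else leaving the host: no cloud APIs, no telemetry.
iptables -P OUTPUT DROP
```

With the default policy set to DROP, any outbound packet that doesn't match an earlier ACCEPT rule is silently discarded, which is what makes the "local LLMs only" guarantee enforceable at the network layer rather than just in application code.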
Hardware (4)
- Framework Laptop 13 / 16
Repairable, modular laptop with strong Linux support. The right choice if you want a portable agent host that doubles as a daily driver.
- Geekom IT13 / generic Intel mini PC
Sub-€500 mini PC with i7-13620H, 32 GB RAM, 1 TB SSD. The pragmatic alternative to the Intel NUC.
- Mac Studio M3 Ultra
192 GB unified memory ceiling. The local-LLM workstation. Llama 3.3 70B at 22 tok/s. €4,500+.
- Khadas Edge2 Pro
Premium SBC with 16 GB RAM and the same RK3588 as Orange Pi 5 Plus. Better build, smaller community.
Glossary (3)
- Self-hosted AI — AI software you install on hardware you control, rather than consuming as a hosted product.
- Local LLM — Language model running on local hardware rather than via cloud API. Llama, Qwen, Mistral local variants.
- Pocket AI — Self-hosted AI agents and language models running on portable, low-power hardware you own.