PocketClaw vol. 1 · 2026
Hobbyist reviewed

Old Android phone (Moto E2 et al.)

The PocketClaw origin story device. €15–30 used. Genuinely runs lightweight agents via Termux + proot, with serious caveats.

Specs at a glance

CPU: Snapdragon 410 / 615 era — ARM Cortex-A53
RAM options: 1 GB (Moto E2) — up to 8 GB on newer used phones
Storage: 8–32 GB internal + microSD
Power draw: 1–3 W
Form factor: fits in a pocket (the original)
Local LLM capability: No local LLM
Agent score: 3/10
Price point: €15–30 used

Overview

Repurposing a 2015-era Android phone as an agent host is the project that started PocketClaw. It works, with severe constraints: Termux + proot-distro for a Linux userland, custom Node.js builds because the official binaries don't target old ARMv8, V8 heap tuning for the 1 GB RAM ceiling. The result is a phone that runs a stripped-down Nanobot agent that calls a cloud LLM API. It is a hobbyist project — not a viable production setup. We document the journey under /archive/ for the engineers who'll find it useful.
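The Termux route described above boils down to a handful of commands. This is a hedged sketch, not the archived walkthrough: `proot-distro` and its subcommands are real Termux tooling, but the Node.js install path (`/opt/node`) and the agent entry point (`agent.js`) are placeholders, and the heap cap is an illustrative value for a 1 GB device.

```shell
# Inside Termux: install proot-distro and a Debian userland.
pkg install -y proot-distro
proot-distro install debian

# Enter the Debian rootfs.
proot-distro login debian

# Inside Debian: a self-built Node.js is assumed at /opt/node,
# since official binaries may not run on old ARMv8 cores.
export PATH=/opt/node/bin:$PATH

# Cap the V8 old-space heap (MB) so the agent fits the 1 GB RAM
# ceiling, leaving headroom for Android itself.
node --max-old-space-size=256 agent.js
```

The heap flag matters more than it looks: without it, V8 sizes its heap from total system memory and the Android low-memory killer will reap the process long before Node notices pressure.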

Best for

  • Hobbyist proof-of-concept
  • Demonstrations of low-power AI hosting
  • Educational projects

Not for

  • Anything production
  • Browser automation
  • Local LLMs (period)
  • Multi-user workloads

Compatible self-hosted agents

Tested working on the Moto E2 and similar old Android phones (with the caveats from “Best for” / “Not for” above):

Nanobot
Specialist · MIT

See: all pocket AI hardware · edge AI hardware buyer's guide · how we test.

Other devices to consider

Raspberry Pi 5 (default)
The default starting point for pocket AI in 2026. 4–8 GB of LPDDR4X, ARM Cortex-A76, sub-€100, runs Hermes Age…

Intel NUC 13 / Mini PC (default)
Mini PCs at €300–600 with i5/i7 + 16–32 GB RAM. The sweet spot for self-hosted AI agents that need browser aut…

Mac Mini M4 / M4 Pro (specialist)
The single best small-form-factor host for local LLMs in 2026. Apple Silicon unified memory makes 70B-class mo…