sed.
Everything we've published on sed across guides, agents, hardware reviews and glossary entries — 15 entries in total.
Guides (1)
- Git Wrapper sed Rewrite Runtime · 2026-02-18
Using sed in the git wrapper to rewrite all three SSH URL formats (git+ssh://, ssh://, git@) to HTTPS in a single pass, since proot has no SSH agent for GitHub authentication.
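The single-pass rewrite described above can be sketched roughly as follows. This is a minimal illustration, not the guide's actual script: `github.com` is assumed as the only host, and the `ssh://` form is assumed to carry no port.

```shell
# Hypothetical sketch: fold all three SSH remote formats
# (git+ssh://, ssh://, git@) into HTTPS with one sed expression.
# Optional groups cover the scheme prefix; [:/] matches either the
# scp-style ':' separator or the URL-style '/' after the host.
to_https() {
  printf '%s\n' "$1" | sed -E \
    's#^((git[+])?ssh://)?git@github\.com[:/]#https://github.com/#'
}

to_https 'git+ssh://git@github.com/user/repo.git'  # https://github.com/user/repo.git
to_https 'ssh://git@github.com/user/repo.git'      # https://github.com/user/repo.git
to_https 'git@github.com:user/repo.git'            # https://github.com/user/repo.git
```

Anchoring at `^` and making both the scheme and the separator part of the match is what lets one expression handle all three formats without rewriting a URL twice.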
Agents (2)
- ZeroClaw
Privacy-first. Local LLMs only. Network egress denied at iptables. AGPL-3.0.
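An egress policy like the one described could look roughly like this default-deny OUTPUT chain. This is a hypothetical illustration, not ZeroClaw's actual ruleset:

```shell
# Hypothetical default-deny egress in the spirit of ZeroClaw's policy.
# Loopback stays open so the agent can reach its local LLM server;
# all other outbound traffic is dropped at the OUTPUT chain.
iptables -P OUTPUT DROP                 # default: drop all outbound packets
iptables -A OUTPUT -o lo -j ACCEPT     # allow localhost (local LLM API)
iptables -A OUTPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT  # replies to inbound sessions
```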
- ZeroClaw Lite
Stripped-down ZeroClaw fork for resource-constrained hosts. Phi-3 mini default, runs comfortably on a Pi 5.
Hardware (5)
- Raspberry Pi 5
The default starting point for pocket AI in 2026. 4–8 GB of LPDDR4X, ARM Cortex-A76, sub-€100, runs Hermes Agent (no browser tool) or Nanobot comfortably.
- Mac Mini M4 / M4 Pro
The single best small-form-factor host for local LLMs in 2026. Apple Silicon unified memory makes 70B-class models tractable on a desk-sized machine.
- Mac Studio M3 Ultra
192 GB unified memory ceiling. The local-LLM workstation. Llama 3.3 70B at 22 tok/s. €4,500+.
- Lenovo ThinkCentre M75q (used)
€150–250 used Ryzen mini PC. 16 GB RAM, NVMe slot, runs Hermes Agent comfortably. The budget winner.
- Dell OptiPlex 3070 / 5070 Micro (used)
€100–180 used Intel mini PC. 8th gen i5, 8–16 GB RAM, NVMe. The cheapest credible mini PC for self-hosted AI.
Glossary (7)
- Hermes Agent — Open-source self-hosted AI agent from Nous Research, released February 2026. Sandboxed by default, multi-LLM.
- Moltworker — Cloudflare Workers-based self-hosted AI agent reference implementation.
- VPS — Virtual private server — a virtualised slice of a physical server, typically rented by the month.
- Edge AI — AI workloads running close to where data is generated rather than in centralised cloud datacenters.
- Qwen — Alibaba's open-weight LLM family. Qwen 2.5 Coder 7B is widely used for self-hosted code-focused agents.
- GGUF — File format for storing quantised LLM weights, used by llama.cpp and Ollama.
- Vector database — Database optimised for similarity search over high-dimensional embeddings.