PocketClaw · vol. 1 · 2026

Self-hosted AI Docker Compose stack

A full self-hosted AI stack in a single docker-compose file: Hermes Agent + Ollama + Qdrant + Caddy. Once the files are in place, the whole rig comes up in two commands.

Prerequisites

  • A VPS or local box with Docker 24+ and Compose v2 (a quick version check follows this list)
  • Minimum 8 GB RAM (16 GB strongly recommended)
  • A domain pointing at the box
  • An Anthropic API key (the compose file below wires only Anthropic; other providers such as OpenRouter can be added the same way)
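
A quick sanity check before starting (the version strings are illustrative; anything at or above the minimums is fine):

    docker --version           # needs 24.x or newer
    docker compose version     # needs v2.x
    free -h                    # look for 8 GB+ total RAM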

Steps

  1. Create the project tree

    Keep volumes under one directory so backups are simple.

    mkdir -p ~/agent-stack/{data,logs}
    cd ~/agent-stack
  2. Write the compose file

    This stack runs four services: Hermes Agent (orchestration), Ollama (local LLM), Qdrant (vector store), Caddy (TLS + routing). Save as docker-compose.yml.

    cat > docker-compose.yml <<'EOF'
    services:
      hermes:
        image: nousresearch/hermes-agent:2026.4.4
        container_name: hermes
        restart: unless-stopped
        environment:
          ANTHROPIC_API_KEY: ${ANTHROPIC_API_KEY}
          OLLAMA_HOST: http://ollama:11434
          QDRANT_HOST: http://qdrant:6333
          HERMES_BIND_ADDR: 0.0.0.0:8765
          HERMES_BROWSER_TOOL: "true"
        volumes:
          - ./data/hermes:/data
        networks: [stack]
        depends_on: [ollama, qdrant]
    
      ollama:
        image: ollama/ollama:0.4.7
        container_name: ollama
        restart: unless-stopped
        volumes:
          - ./data/ollama:/root/.ollama
        networks: [stack]
    
      qdrant:
        image: qdrant/qdrant:v1.10.0
        container_name: qdrant
        restart: unless-stopped
        volumes:
          - ./data/qdrant:/qdrant/storage
        networks: [stack]
    
      caddy:
        image: caddy:2-alpine
        container_name: caddy
        restart: unless-stopped
        ports:
          - "80:80"
          - "443:443"
        volumes:
          - ./Caddyfile:/etc/caddy/Caddyfile:ro
          - ./data/caddy:/data
        networks: [stack]
        depends_on: [hermes]
    
    networks:
      stack:
        driver: bridge
    EOF
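
    Before the first boot, it is worth letting Compose parse and render the file; config --quiet exits non-zero on YAML errors. Note that .env is not written until step 4, so expect a warning that ANTHROPIC_API_KEY is unset for now.

    docker compose config --quiet && echo "compose file OK"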
  3. Caddyfile

    Caddy handles TLS automatically via Let's Encrypt. Replace agent.example.com with your domain.

    cat > Caddyfile <<'EOF'
    agent.example.com {
        reverse_proxy hermes:8765
        encode gzip
    }
    EOF
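
    To catch Caddyfile syntax errors before exposing ports, caddy validate can check the mounted config from a one-off container (this uses the caddy binary bundled in the image; the adapter is auto-detected for a file named Caddyfile):

    docker compose run --rm --no-deps caddy caddy validate --config /etc/caddy/Caddyfile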
  4. Set environment variables

    Keep API keys out of the compose file. Compose reads .env from the project directory automatically and substitutes ${ANTHROPIC_API_KEY} into the service environment.

    cat > .env <<'EOF'
    ANTHROPIC_API_KEY=sk-ant-...
    EOF
    chmod 600 .env
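
    If the project directory is a git repo, make sure the key file can never be committed:

    grep -qxF '.env' .gitignore 2>/dev/null || echo '.env' >> .gitignore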
  5. Bring it up

    First boot pulls all four images (~3 GB). Subsequent restarts are fast.

    docker compose up -d
    docker compose logs -f hermes
    # Pull a default Ollama model:
    docker compose exec ollama ollama pull qwen2.5-coder:3b-instruct-q4_K_M
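
    Once the logs settle, a quick smoke test (the exact health endpoint depends on the Hermes build, so this just proves DNS, TLS, and the Caddy-to-Hermes hop):

    docker compose ps                              # all four services should show "running"
    curl -sI https://agent.example.com | head -n1  # expect an HTTP status line via Caddy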

Troubleshooting

Hermes can't reach Ollama
Inside the stack network, services resolve by container name. Check that the OLLAMA_HOST env var is http://ollama:11434, not http://localhost:11434.
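A quick way to test resolution and the Ollama API from inside the stack network is a throwaway curl container (Compose names networks <project>_<network>, so from the agent-stack directory it is agent-stack_stack; /api/version is a standard Ollama endpoint):

    docker run --rm --network agent-stack_stack curlimages/curl -s http://ollama:11434/api/version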
Caddy can't get a TLS cert
Confirm DNS A/AAAA records are correct and ports 80 and 443 reach the box. ufw or cloud security groups may be blocking inbound traffic.
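To watch the ACME exchange and confirm reachability (dig and nc are illustrative; any DNS and port-check tools work):

    docker compose logs caddy | grep -i acme
    dig +short agent.example.com    # should print this box's public IP
    nc -zv agent.example.com 443    # run from a machine outside the box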
Qdrant fails to start with 'permission denied' on volume
Set ownership: sudo chown -R 1000:1000 ./data/qdrant. Some hosts default to root-owned bind mounts.

Where to go from here

Add a Watchtower service to the stack for automatic security updates (a sketch follows below), but be cautious with auto-updating Hermes itself; pin minor versions.
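
A sketch of that Watchtower service, scoped with labels so only opted-in containers auto-update (the image tag is an example; the env vars and label are Watchtower's documented defaults):

      watchtower:
        image: containrrr/watchtower:1.7.1
        container_name: watchtower
        restart: unless-stopped
        environment:
          WATCHTOWER_LABEL_ENABLE: "true"    # only touch containers that opt in
          WATCHTOWER_CLEANUP: "true"         # remove superseded images
          WATCHTOWER_POLL_INTERVAL: "86400"  # check once a day
        volumes:
          - /var/run/docker.sock:/var/run/docker.sock

Then add the label com.centurylinklabs.watchtower.enable: "true" under the services you trust to auto-update (caddy is a good candidate) and leave hermes unlabeled.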
