ollama.

Everything we've published on ollama across guides, agents, hardware reviews, and glossary entries: 7 entries in total.

Guides (2)

Agents (1)

  • ZeroClaw

Privacy-first. Local LLMs only. Network egress denied at the iptables level. AGPL-3.0.

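    "Egress denied at the iptables level" describes a default-deny outbound firewall. A minimal sketch of that pattern (assumed rules for illustration, not ZeroClaw's actual ruleset; the loopback allowance keeps local services such as an Ollama runtime on 127.0.0.1 reachable):

    ```shell
    # Sketch of a default-deny egress policy (assumed rules, not
    # ZeroClaw's actual configuration). Requires root.
    # Allow loopback so local services stay reachable...
    iptables -A OUTPUT -o lo -j ACCEPT
    # ...then drop all other outbound traffic.
    iptables -A OUTPUT -j DROP
    ```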
Hardware (1)

  • Intel NUC 13 / Mini PC

    Mini PCs at €300–600 with Core i5/i7 CPUs and 16–32 GB RAM. The sweet spot for self-hosted AI agents that need browser automation and decent local-model performance.

Glossary (3)

  • Ollama Local LLM runtime that exposes an OpenAI-compatible API over local model weights.
  • llama.cpp C++ inference engine for LLMs. The runtime under most local-LLM setups, including Ollama.
  • GGUF File format for storing quantised LLM weights, used by llama.cpp and Ollama.