
#18 Lazy Loading v3 (Proxy-based deferred require)

Problem: OpenClaw loads 1547 modules at startup. Most are never used (the Discord SDK when using Telegram, etc.).

Solution: Proxy-based deferred require in hijack.js. 37 package prefixes return an ES6 Proxy that loads the real module on first property access.

Lesson: The Proxy's getOwnPropertyDescriptor trap must return configurable: true for __esModule, or bundler interop breaks.

Context

OpenClaw loads every channel SDK (Discord, Telegram, Slack, WhatsApp), every AI provider SDK (Anthropic, OpenAI, Cohere, Mistral, Google), and every utility library at startup. On a standard server this is invisible — 1547 modules load in 2 seconds. On the Moto E2 with 1 GB RAM, those modules consume ~40 MB of heap before a single message is processed. Most are never used: if you connect Telegram, the Discord SDK sits in memory doing nothing.

This is v3 of the lazy loading system. v1 used regex-based require path interception (fragile, broke on deep imports). v2 used a Module._load hook (worked but couldn't handle ESM interop correctly — Webpack-style __esModule checks failed). v3 uses ES6 Proxy objects that intercept property access, deferring the real require() until the first actual property is accessed. The Proxy approach handles all access patterns: destructuring, property reads, method calls, and ESM interop checks.
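To see why the __esModule handling matters, this is roughly the shape of the CommonJS interop helper that Babel/Webpack-style bundlers emit (a sketch with a hypothetical name, not OpenClaw code):

```js
// Sketch of a bundler's CJS interop helper. It probes __esModule on the
// required value; the Proxy must answer that probe without loading the
// real module, or every bundled import pays the full load cost up front.
function _interopRequireDefault(mod) {
  return mod && mod.__esModule ? mod : { default: mod };
}

console.log(_interopRequireDefault(42));
// { default: 42 }
console.log(_interopRequireDefault({ __esModule: true, default: 42 }).default);
// 42
```

The v2 Module._load hook returned plain module objects, so these probes forced a load; the v3 Proxy can answer `__esModule` from its get trap without touching the real package.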

Implementation

The lazy loading system lives in hijack.js, which is preloaded before the gateway via Node's -r (--require) flag. It patches Module.prototype.require to intercept requires of known-deferrable packages.

```js
// hijack.js — lazy loading v3 (proxy-based)
const Module = require("module");
const _origRequire = Module.prototype.require;

// 37 package prefixes that can be safely deferred
const _LAZY_PKGS = [
  "@anthropic-ai", "@aws-sdk", "@buape/carbon", "@google/generative-ai",
  "@larksuiteoapi", "@mistralai", "@slack/web-api", "@whiskeysockets/baileys",
  "axios", "cheerio", "cohere-ai", "discord.js", "form-data",
  "google-auth-library", "googleapis", "groq-sdk", "libsignal",
  "node-fetch", "openai", "pdf-parse", "pdfjs-dist", "puppeteer",
  "sharp", "tesseract.js", "undici", "ws",
  // ... 37 total
];

function _isLazy(id) {
  return _LAZY_PKGS.some(p => id === p || id.startsWith(p + "/"));
}

function _makeLazyProxy(id) {
  let _real = null;
  let _loaded = false; // separate flag, so a falsy export is not re-required
  const _load = () => {
    if (!_loaded) {
      const before = process.memoryUsage().rss;
      const t0 = Date.now();
      _real = _origRequire.call(this, id);
      _loaded = true;
      const after = process.memoryUsage().rss;
      const delta = ((after - before) / 1048576).toFixed(1);
      console.log(`[lazy] ${id} +${delta}MB (${Date.now() - t0}ms)`);
    }
    return _real;
  };

  // The target must be callable: the apply/construct traps only fire
  // when the proxy target is itself a function.
  return new Proxy(function () {}, {
    get(_, prop) {
      // __esModule check: return true without loading the module
      if (prop === "__esModule") return true;
      if (prop === Symbol.toPrimitive) return undefined;
      return _load()[prop];
    },
    getOwnPropertyDescriptor(target, prop) {
      if (prop === "__esModule") {
        // CRITICAL: must be configurable, or Object.defineProperty throws
        return { value: true, writable: false, enumerable: false, configurable: true };
      }
      // Fall back to the target's own descriptor (e.g. "prototype") so the
      // Proxy invariant for non-configurable target properties holds
      return Object.getOwnPropertyDescriptor(_load(), prop)
          || Object.getOwnPropertyDescriptor(target, prop);
    },
    has(target, prop) {
      if (prop === "__esModule") return true;
      if (prop in _load()) return true;
      // invariant: must not report the target's non-configurable own
      // properties (a function's "prototype") as absent
      const d = Object.getOwnPropertyDescriptor(target, prop);
      return !!d && !d.configurable;
    },
    ownKeys(target) {
      // invariant: the result must include every non-configurable own key
      // of the target, so merge those in with the real module's keys
      const keys = new Set(Reflect.ownKeys(_load()));
      for (const k of Reflect.ownKeys(target)) {
        if (!Object.getOwnPropertyDescriptor(target, k).configurable) keys.add(k);
      }
      return [...keys];
    },
    apply(_, thisArg, args) {
      return Reflect.apply(_load(), thisArg, args);
    },
    construct(_, args) {
      return Reflect.construct(_load(), args);
    }
  });
}

Module.prototype.require = function (id) {
  if (_isLazy(id)) return _makeLazyProxy.call(this, id);
  return _origRequire.call(this, id);
};
```
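Stripped of the require plumbing, the core deferral trick can be exercised on its own. The makeLazy helper below is a hypothetical reduction for illustration, wrapping an arbitrary factory instead of a module id:

```js
// Minimal sketch of the same pattern: defer factory() until first real access.
function makeLazy(factory) {
  let real, loaded = false;
  const load = () => {
    if (!loaded) { real = factory(); loaded = true; }
    return real;
  };
  return new Proxy(function () {}, {
    get(_, prop) {
      if (prop === "__esModule") return true;           // answered without loading
      if (prop === Symbol.toPrimitive) return undefined;
      return load()[prop];
    },
    apply(_, thisArg, args) { return Reflect.apply(load(), thisArg, args); },
  });
}

let calls = 0;
const lazy = makeLazy(() => { calls += 1; return { greet: () => "hi" }; });
console.log(calls);            // 0, nothing built yet
console.log(lazy.__esModule);  // true, still nothing built
console.log(lazy.greet());     // "hi", the factory ran on this access
console.log(calls);            // 1, and only once
```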

The gateway launches with the hijack preloaded:

```bash
HIJACK=/data/data/com.termux/files/home/root/hijack.js
LD_PRELOAD=$PREFIX/lib/libapi23compat.so \
  node22-icu -r $HIJACK --max-old-space-size=112 --max-semi-space-size=2 \
  $PREFIX/lib/node_modules/openclaw/dist/gateway.js
```

Verification

```bash
# Watch the gateway log during startup — deferred packages show no [lazy] lines.
# Only when a Telegram message triggers an AI call:
#   [lazy] openai +8.2MB (340ms)
#   [lazy] axios +3.1MB (120ms)

# Check heap usage at startup (should be ~93 MB, not ~133 MB):
curl -s http://localhost:9000/api/status | grep heap
```

```js
// Test the proxy manually in a Node REPL:
const proxy = _makeLazyProxy("discord.js");
console.log(typeof proxy);        // inspecting the type does not load
console.log(proxy.__esModule);    // true — no load yet
console.log(proxy.Client);        // [lazy] discord.js +12MB — loads now
```

Gotchas

  • The getOwnPropertyDescriptor trap for __esModule MUST return configurable: true. Webpack and Rollup interop code calls Object.defineProperty(exports, "__esModule", ...) after checking the existing descriptor. If the descriptor is non-configurable, Object.defineProperty throws a TypeError and the import fails silently.
  • Symbol.toPrimitive must return undefined rather than trigger a load. Some logging frameworks call String(value), which looks up Symbol.toPrimitive. Loading a 12 MB SDK just to print a debug string defeats the purpose.
  • The apply and construct traps are needed for packages that export a function as the default (e.g., const fetch = require("node-fetch")), and they only fire when the proxy target is itself callable. With a plain-object target, calling the proxy throws "proxy is not a function" no matter what traps are installed.
  • Logging on load is essential for debugging. Without it, RAM spikes are invisible and you cannot tell which code path triggered the load.
  • Some packages with native addons (sharp, tesseract.js) take 500ms+ to load. The lazy approach turns a 45s startup into a 25s startup with occasional 500ms hitches on first use.
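The first gotcha reproduces in isolation: redefining a configurable property succeeds, while redefining a non-configurable one throws. A plain object stands in for the proxy here:

```js
// With configurable: true, the bundler's later defineProperty call succeeds.
const good = {};
Object.defineProperty(good, "__esModule", {
  value: true, writable: false, enumerable: false, configurable: true,
});
Object.defineProperty(good, "__esModule", { value: true }); // ok

// With configurable: false, the same redefinition throws a TypeError.
const bad = {};
Object.defineProperty(bad, "__esModule", { value: true, configurable: false });
try {
  Object.defineProperty(bad, "__esModule", { value: false });
} catch (e) {
  console.log(e instanceof TypeError); // true
}
```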

Result

| Metric | Without Lazy Loading | With v3 Lazy Loading |
|---|---|---|
| Modules loaded at startup | 1547 | ~400 |
| Deferred packages | 0 | 37 prefixes (~8 loaded on-demand) |
| Startup RSS | ~195 MB | ~155 MB |
| RAM deferred | 0 | ~40 MB |
| Startup time | ~45s | ~25s |