I think it works now, but it's too slow on CPU to test properly without waiting ages. Note that Ollama will only work with a model that supports tool use, so no TinyLlama.
1. Node.js version override
The Cloudron base image ships Node 22.14, but OpenClaw requires 22.16+. We symlink /usr/local/node-22.14.0/bin/node → /usr/bin/node (NodeSource 22.22) so both the gateway and the openclaw CLI use the correct version.
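The override can be sketched as below. This is a scratch-directory demo, not the actual start.sh: the fake binary stands in for the NodeSource install, and the real paths are the ones named above.

```shell
# Demo of the symlink trick in a throwaway directory.
root="$(mktemp -d)"
mkdir -p "$root/usr/local/node-22.14.0/bin" "$root/usr/bin"

# Stand-in for the newer NodeSource binary at /usr/bin/node
printf '#!/bin/sh\necho v22.22.0\n' > "$root/usr/bin/node"
chmod +x "$root/usr/bin/node"

# Point the base image's bundled node at the newer binary
ln -sf "$root/usr/bin/node" "$root/usr/local/node-22.14.0/bin/node"

# Anything invoking the old path now gets the NodeSource version
"$root/usr/local/node-22.14.0/bin/node" --version
```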
2. Gateway config for reverse proxy
OpenClaw doesn't expect to run behind a proxy by default. start.sh creates /app/data/config/openclaw.json on first run with:
- gateway.mode: "local" — skips the pairing requirement
- gateway.trustedProxies: ["172.18.0.0/16"] — trusts Cloudron's Docker network
- gateway.controlUi.dangerouslyAllowHostHeaderOriginFallback: true — allows the web UI behind Cloudron's reverse proxy
- gateway.controlUi.dangerouslyDisableDeviceAuth: true — skips device pairing for the web UI
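Assuming the dotted keys nest the way they read, the generated /app/data/config/openclaw.json would look roughly like:

```json
{
  "gateway": {
    "mode": "local",
    "trustedProxies": ["172.18.0.0/16"],
    "controlUi": {
      "dangerouslyAllowHostHeaderOriginFallback": true,
      "dangerouslyDisableDeviceAuth": true
    }
  }
}
```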
3. Ollama auth proxy
OpenClaw's Ollama model discovery (/api/tags) doesn't send auth headers, but Cloudron's Ollama package requires Bearer token auth. start.sh therefore starts a Node.js HTTP proxy on localhost:11434 that:
- Intercepts all requests to Ollama
- Injects an Authorization: Bearer <OLLAMA_API_KEY> header
- Forwards to the real OLLAMA_BASE_URL
- Overrides OLLAMA_BASE_URL to http://127.0.0.1:11434 so OpenClaw talks to the proxy
The proxy is only activated when both OLLAMA_API_KEY and OLLAMA_BASE_URL are set in /app/data/.env.
4. Auth auto-provisioning
start.sh auto-creates /app/data/agents/main/agent/auth-profiles.json from:
- CLAUDE_SETUP_TOKEN → type: "token" profile
- ANTHROPIC_API_KEY → type: "api_key" profile
Not sure if this is actually valuable for anyone, but it should work.
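In start.sh style, the provisioning step might look roughly like the sketch below. The JSON schema of auth-profiles.json is a guess based on the two profile types above, and the demo values stand in for the operator-supplied env vars.

```shell
# Hypothetical sketch; real path is /app/data/agents/main/agent/auth-profiles.json
AUTH_DIR="$(mktemp -d)"        # demo dir instead of /app/data/agents/main/agent
CLAUDE_SETUP_TOKEN="sk-demo"   # demo value; normally read from the environment

AUTH_FILE="$AUTH_DIR/auth-profiles.json"
# Only provision on first run, and prefer the setup token over the API key
if [ ! -f "$AUTH_FILE" ]; then
  if [ -n "${CLAUDE_SETUP_TOKEN:-}" ]; then
    printf '{"profiles":[{"type":"token","token":"%s"}]}\n' "$CLAUDE_SETUP_TOKEN" > "$AUTH_FILE"
  elif [ -n "${ANTHROPIC_API_KEY:-}" ]; then
    printf '{"profiles":[{"type":"api_key","key":"%s"}]}\n' "$ANTHROPIC_API_KEY" > "$AUTH_FILE"
  fi
fi
cat "$AUTH_FILE"
```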