

OpenClaw on a Dedalus Machine. No SSH. Everything goes through the execution API. Full script: dedalus-labs/openclaw-ddls.
npm install dedalus-labs dotenv
.env
DEDALUS_API_KEY=<your-dedalus-key>
ANTHROPIC_API_KEY=<your-anthropic-key>

1. Create the machine

import "dotenv/config";
import Dedalus from "dedalus-labs";

const client = new Dedalus({ xAPIKey: process.env.DEDALUS_API_KEY });

const ws = await client.machines.create({ vcpu: 2, memory_mib: 4096, storage_gib: 10 });

// Poll until the machine reports running before executing anything on it.
let m = ws;
while (m.status.phase !== "running") {
  await new Promise((s) => setTimeout(s, 2000));
  m = await client.machines.retrieve({ machine_id: ws.machine_id });
}
const mid = ws.machine_id;
exec(mid, cmd), used below, is a thin wrapper around client.machines.executions.create that polls until the execution reaches succeeded or failed and returns stdout. See the reference repo for the 15-line helper.
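A hypothetical sketch of that helper, parameterized over the slice of the SDK it touches so the polling logic stands alone. The field names (execution_id, status, stdout, stderr) and the create/retrieve argument shapes are assumptions; the actual 15-line version lives in dedalus-labs/openclaw-ddls.

```typescript
// Assumed shape of an execution record — check the real SDK types.
interface Execution {
  execution_id: string;
  status: "pending" | "running" | "succeeded" | "failed";
  stdout?: string;
  stderr?: string;
}

// The two calls the helper needs from client.machines.executions.
interface ExecApi {
  create(args: { machine_id: string; command: string }): Promise<Execution>;
  retrieve(args: { machine_id: string; execution_id: string }): Promise<Execution>;
}

async function execOn(api: ExecApi, mid: string, cmd: string, pollMs = 1000): Promise<string> {
  let e = await api.create({ machine_id: mid, command: cmd });
  // Poll until the execution settles.
  while (e.status !== "succeeded" && e.status !== "failed") {
    await new Promise((r) => setTimeout(r, pollMs));
    e = await api.retrieve({ machine_id: mid, execution_id: e.execution_id });
  }
  if (e.status === "failed") throw new Error(e.stderr || `failed: ${cmd}`);
  return e.stdout ?? "";
}
```

Bound to the client, `exec(mid, cmd)` is then just `execOn(client.machines.executions, mid, cmd)` (assuming the SDK exposes that shape).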

2. Install OpenClaw into persistent storage

Root fs is small and ephemeral. Redirect everything npm writes (prefix, cache, tmp) into /home/machine/.
await exec(mid, "curl -fsSL https://deb.nodesource.com/setup_22.x | bash - && apt-get install -y nodejs");

await exec(mid,
  "mkdir -p /home/machine/{.npm-global,.npm-cache,.tmp,.openclaw} && " +
  "NPM_CONFIG_PREFIX=/home/machine/.npm-global " +
  "NPM_CONFIG_CACHE=/home/machine/.npm-cache " +
  "TMPDIR=/home/machine/.tmp " +
  "npm install -g openclaw@latest"
);

3. Configure, then launch the gateway

Enable the HTTP endpoint before starting the gateway, or you will have to restart it. setsid is how you detach: the execution API waits on foreground processes, and nohup with & alone leaves the process in the same session, so the execution never returns.
const ENV = "export PATH=/home/machine/.npm-global/bin:$PATH HOME=/home/machine OPENCLAW_STATE_DIR=/home/machine/.openclaw";

await exec(mid, `${ENV} && openclaw config set gateway.mode local`);
await exec(mid, `${ENV} && openclaw config set env.vars.ANTHROPIC_API_KEY "${process.env.ANTHROPIC_API_KEY}"`);
await exec(mid, `${ENV} && openclaw config set gateway.http.endpoints.chatCompletions.enabled true`);

await exec(mid,
  `pgrep -f openclaw-gateway > /dev/null || ` +
  `(setsid bash -c '${ENV} && exec openclaw gateway run --auth none > /home/machine/.openclaw/gateway.log 2>&1' </dev/null &>/dev/null & disown; sleep 10)`
);
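The fixed sleep 10 above is a race. A minimal readiness poll, sketched here, waits until the gateway actually answers; the probe is any async boolean check, and probing the gateway port with curl through exec is one option (the /v1/models path is an assumption, any cheap HTTP request to port 18789 works).

```typescript
// Generic poll-until-ready helper: retries a probe until it returns true
// or the timeout expires.
async function waitFor(
  probe: () => Promise<boolean>,
  timeoutMs = 60_000,
  intervalMs = 2_000,
): Promise<void> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    if (await probe()) return; // ready
    await new Promise((r) => setTimeout(r, intervalMs));
  }
  throw new Error("timed out waiting for gateway");
}

// Example usage with the exec helper (hypothetical health path):
// await waitFor(async () =>
//   (await exec(mid, "curl -s -o /dev/null -w '%{http_code}' http://127.0.0.1:18789/v1/models || echo 000"))
//     .trim().startsWith("2"));
```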

4. Chat

const r = await exec(mid,
  `curl -sS http://127.0.0.1:18789/v1/chat/completions ` +
  `-H 'Content-Type: application/json' ` +
  `-d '{"model":"openclaw/default","messages":[{"role":"user","content":"Hello!"}]}'`
);
console.log(JSON.parse(r).choices[0].message.content);
model is an agent target, not an LLM name. Override the backing LLM with the x-openclaw-model header, e.g. x-openclaw-model: anthropic/claude-sonnet-4-6. Enable streaming with "stream": true. Sessions share history per user.
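Putting those overrides together, a sketch of the request from step 4 with the backing LLM swapped and streaming on (the header semantics are as described above; treat the exact behavior as something to verify against the gateway docs):

```typescript
// Build a chat-completions request: optional x-openclaw-model header
// overrides the backing LLM; stream=true requests SSE chunks.
function chatRequest(prompt: string, backingModel?: string, stream = false) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      ...(backingModel ? { "x-openclaw-model": backingModel } : {}),
    },
    body: JSON.stringify({
      model: "openclaw/default", // agent target, not the LLM itself
      stream,
      messages: [{ role: "user", content: prompt }],
    }),
  };
}

// fetch("http://127.0.0.1:18789/v1/chat/completions",
//   chatRequest("Hello!", "anthropic/claude-sonnet-4-6", true));
```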

Full reference

openclaw.ts runs the steps above end-to-end; chat.ts replays the chat call.
Last modified on May 2, 2026