Research Project: This is a free AI research project, with no warranties, SLAs, or company associations.
Install

One Command

# npm
npm install -g openclaw

# pnpm
pnpm add -g openclaw

# bun
bun add -g openclaw

The package is published as openclaw for backward compatibility; it is the official BrainstormRouter CLI.


Commands

What You Can Do

openclaw send

Send a chat completion request through the router with model selection, streaming, and full header output.

openclaw send "Explain TCP" \
  --model auto \
  --stream

openclaw models

List all available models across providers with health status and stats.

openclaw models
openclaw models --provider anthropic
openclaw models --format json

openclaw gateway

Start the self-hosted gateway server. Full routing, memory, and MCP in your infrastructure.

openclaw gateway --port 3000
openclaw gateway --watch

openclaw config

View and modify router configuration. Set model preferences, budget limits, and provider keys.

openclaw config set routing.default auto
openclaw config get budget
openclaw config list

Self-Hosted

Run the Gateway Locally

The CLI includes the full BrainstormRouter gateway. Run it locally for development, or deploy it in your infrastructure for air-gapped environments.

# Start with your provider keys
export ANTHROPIC_API_KEY=sk-ant-...
export OPENAI_API_KEY=sk-...
openclaw gateway --port 3000

# Now route through localhost
curl http://localhost:3000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"auto","messages":[{"role":"user","content":"Hello"}]}'
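The same request can be made from application code. A minimal Python sketch, assuming a gateway started locally as above on port 3000; the endpoint path and payload shape are taken from the curl example, while the response schema is not documented here, so the raw JSON is returned as-is:

```python
import json
import urllib.request

# Assumes `openclaw gateway --port 3000` is running locally.
GATEWAY_URL = "http://localhost:3000"

def build_body(prompt: str, model: str = "auto") -> dict:
    # Same payload shape as the curl example above.
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def chat(prompt: str, model: str = "auto") -> dict:
    # POST to the gateway's chat completions endpoint and
    # return the parsed JSON response unchanged.
    req = urllib.request.Request(
        f"{GATEWAY_URL}/v1/chat/completions",
        data=json.dumps(build_body(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Any OpenAI-style client library pointed at the localhost base URL should work the same way, since the gateway exposes a /v1/chat/completions route.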

Get Started

npm install -g openclaw