IronClaw is built on a simple principle: your AI assistant should work for you, not against you.
In a world where AI systems are increasingly opaque about data handling and aligned with corporate interests, IronClaw takes a different approach:
IronClaw is the AI assistant you can actually trust with your personal and professional life.
Visit the Releases page to see the latest updates.
Install via Windows Installer (Windows):
Download the Windows Installer and run it.
Install via PowerShell script (Windows):
irm https://github.com/nearai/ironclaw/releases/latest/download/ironclaw-installer.ps1 | iex
Install via shell script (macOS, Linux, Windows/WSL):
curl --proto '=https' --tlsv1.2 -LsSf https://github.com/nearai/ironclaw/releases/latest/download/ironclaw-installer.sh | sh
Install via Homebrew (macOS/Linux):
brew install ironclaw
Compile the source code (Cargo on Windows, Linux, macOS):
You can build and install it with Cargo; make sure Rust is installed on your machine first.
# Clone the repository
git clone https://github.com/nearai/ironclaw.git
cd ironclaw
# Build
cargo build --release
# Run tests
cargo test
For a full release build (after modifying channel sources), run ./scripts/build-all.sh first to rebuild the channels.
# Create database
createdb ironclaw
# Enable pgvector
psql ironclaw -c "CREATE EXTENSION IF NOT EXISTS vector;"
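Later steps connect to this database through a connection string (DATABASE_URL). A minimal sketch, assuming a local PostgreSQL on the default port (the exact user, host, and port depend on your setup):

```shell
# Illustrative connection string for the database created above;
# adjust user, host, and port to match your PostgreSQL installation.
DATABASE_URL="postgres://localhost:5432/ironclaw"

# The database name is the final path segment of the URL:
db="${DATABASE_URL##*/}"
echo "$db"
```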
Run the setup wizard to configure IronClaw:
ironclaw onboard
The wizard handles database connection, NEAR AI authentication (via browser OAuth), and secrets encryption (using your system keychain). Settings are persisted in the connected database; bootstrap variables (e.g. DATABASE_URL, LLM_BACKEND) are written to ~/.ironclaw/.env so they are available before the database connects.
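For illustration only, a bootstrap ~/.ironclaw/.env might look like the following; the exact keys and values depend on your onboarding choices, and DATABASE_URL and LLM_BACKEND are the two variables this section names (the values shown are assumptions):

```shell
# ~/.ironclaw/.env — bootstrap variables read before the database connects
DATABASE_URL=postgres://localhost:5432/ironclaw
LLM_BACKEND=nearai   # illustrative value; see the provider section for alternatives
```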
IronClaw defaults to NEAR AI but works with any OpenAI-compatible endpoint. Popular options include OpenRouter (300+ models), Together AI, Fireworks AI, Ollama (local), and self-hosted servers like vLLM or LiteLLM.
Select "OpenAI-compatible" in the wizard, or set environment variables directly:
LLM_BACKEND=openai_compatible
LLM_BASE_URL=https://openrouter.ai/api/v1
LLM_API_KEY=sk-or-...
LLM_MODEL=anthropic/claude-sonnet-4
See docs/LLM_PROVIDERS.md for a full provider guide.
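As another example, a local Ollama server can be configured the same way; Ollama exposes an OpenAI-compatible API under /v1. The model name and key below are illustrative (Ollama does not check the key, but the variable must be set for OpenAI-style clients):

```shell
LLM_BACKEND=openai_compatible
LLM_BASE_URL=http://localhost:11434/v1
LLM_API_KEY=ollama       # placeholder; Ollama ignores the key
LLM_MODEL=llama3.1       # any model you have pulled locally
```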
IronClaw implements defense in depth to protect your data and prevent misuse.
All untrusted tools run in isolated WebAssembly containers:
WASM ──► Allowlist ──► Leak Scan ──► Credential ──► Execute ──► Leak Scan ──► WASM
         Validator     (request)     Injector      Request     (response)
External content passes through multiple security layers:
┌────────────────────────────────────────────────────────────────┐
│ Channels │
│ ┌──────┐ ┌──────┐ ┌─────────────┐ ┌─────────────┐ │
│ │ REPL │ │ HTTP │ │WASM Channels│ │ Web Gateway │ │
│ └──┬───┘ └──┬───┘ └──────┬──────┘ │ (SSE + WS) │ │
│ │ │ │ └──────┬──────┘ │
│ └─────────┴──────────────┴────────────────┘ │
│ │ │
│ ┌─────────▼─────────┐ │
│ │ Agent Loop │ Intent routing │
│ └────┬──────────┬───┘ │
│ │ │ │
│ ┌──────────▼────┐ ┌──▼───────────────┐ │
│ │ Scheduler │ │ Routines Engine │ │
│ │(parallel jobs)│ │(cron, event, wh) │ │
│ └──────┬────────┘ └────────┬─────────┘ │
│ │ │ │
│ ┌─────────────┼────────────────────┘ │
│ │ │ │
│ ┌───▼─────┐ ┌────▼────────────────┐ │
│ │ Local │ │ Orchestrator │ │
│ │Workers │ │ ┌───────────────┐ │ │
│ │(in-proc)│ │ │ Docker Sandbox│ │ │
│ └───┬─────┘ │ │ Containers │ │ │
│ │ │ │ ┌───────────┐ │ │ │
│ │ │ │ │Worker / CC│ │ │ │
│ │ │ │ └───────────┘ │ │ │
│ │ │ └───────────────┘ │ │
│ │ └─────────┬───────────┘ │
│ └──────────────────┤ │
│ │ │
│ ┌───────────▼──────────┐ │
│ │ Tool Registry │ │
│ │ Built-in, MCP, WASM │ │
│ └──────────────────────┘ │
└────────────────────────────────────────────────────────────────┘
| Component | Purpose |
|---|---|
| Agent Loop | Main message handling and job coordination |
| Router | Classifies user intent (command, query, task) |
| Scheduler | Manages parallel job execution with priorities |
| Worker | Executes jobs with LLM reasoning and tool calls |
| Orchestrator | Container lifecycle, LLM proxying, per-job auth |
| Web Gateway | Browser UI with chat, memory, jobs, logs, extensions, routines |
| Routines Engine | Scheduled (cron) and reactive (event, webhook) background tasks |
| Workspace | Persistent memory with hybrid search |
| Safety Layer | Prompt injection defense and content sanitization |
# First-time setup (configures database, auth, etc.)
ironclaw onboard
# Start interactive REPL
cargo run
# With debug logging
RUST_LOG=ironclaw=debug cargo run
# Format code
cargo fmt
# Lint
cargo clippy --all --benches --tests --examples --all-features
# Run tests
createdb ironclaw_test
cargo test
# Run specific test
cargo test test_name
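The format, lint, and test commands above can be bundled into a Git pre-push hook; a minimal sketch (the hook path is standard Git, the flags are copied from this section):

```shell
#!/bin/sh
# .git/hooks/pre-push — run checks before pushing (chmod +x to enable)
set -e
cargo fmt --check          # fail on unformatted code
cargo clippy --all --benches --tests --examples --all-features
createdb ironclaw_test 2>/dev/null || true   # tests expect this database
cargo test
```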
Run ./channels-src/telegram/build.sh before cargo build so the updated WASM is bundled.

IronClaw is a Rust reimplementation inspired by OpenClaw. See FEATURE_PARITY.md for the complete tracking matrix.
Key differences:
Last modified 22 March 2026