Key Features
- Durable Object Execution Layer: Each agent runs as one Durable Object instance, providing stateful micro-servers with global addressability and millisecond-scale cold starts.
- Built-in Persistence: An embedded SQLite database per agent via this.sql plus key/value state via this.setState(), with state changes automatically synchronized to connected WebSocket clients (see the sketch after this list).
- Real-time WebSocket Support: Native WebSocket handlers (onConnect, onMessage, onClose) for real-time agent interactions.
- Durable Scheduling: Built-in scheduling via this.schedule() for cron, delayed, and one-off tasks; long-running jobs can be handed off to Cloudflare Workflows.
- Multi-LLM Integration: Plug in any provider (OpenAI, Workers AI, Anthropic) via ai-sdk, with React hooks included.
- Model Context Protocol Support: McpAgent & McpServer classes expose agents as MCP endpoints for integration with Claude Desktop and other MCP clients.
- Multi-Agent Composability: Agents can fetch() each other via Durable Object ID, enabling complex multi-agent workflows (see the Multi-Agent Composition example under Code Examples).
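
A minimal sketch of the persistence and sync model, assuming the State type parameter, initialState, setState(), and onStateUpdate() hooks of the agents package; the ChatRoom class, RoomState shape, and messages table are purely illustrative:

```typescript
import { Agent, type Connection } from "agents";

// Illustrative state shape; every connected client receives updates to it.
interface RoomState {
  topic: string;
  messageCount: number;
}

export class ChatRoom extends Agent<Env, RoomState> {
  initialState: RoomState = { topic: "general", messageCount: 0 };

  async onStart() {
    // Embedded per-agent SQLite: create the table this agent relies on.
    await this.sql`CREATE TABLE IF NOT EXISTS messages (body TEXT)`;
  }

  async onMessage(conn: Connection, message: string) {
    await this.sql`INSERT INTO messages (body) VALUES (${message})`;
    // setState persists the new state and pushes it to connected WebSocket clients.
    this.setState({ ...this.state, messageCount: this.state.messageCount + 1 });
  }

  onStateUpdate(state: RoomState, source: Connection | "server") {
    console.log("state updated by", source === "server" ? "server" : "client", state);
  }
}
```
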
Code Examples
Quick Start Setup
```bash
npm create cloudflare@latest my-agent -- --template=cloudflare/agents-starter
cd my-agent
npm install
cp .dev.vars.example .dev.vars   # add OPENAI_API_KEY
npm run dev                      # local dev on http://localhost:8787
npx wrangler deploy              # global deploy
```
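
The starter template wires the agent into a standard Worker entry point so each named instance is reachable over HTTP and WebSocket. A minimal sketch, assuming the routeAgentRequest helper exported by the agents package:

```typescript
import { routeAgentRequest } from "agents";

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Forward matching requests (by convention /agents/:agent/:name)
    // to the corresponding Durable Object instance; 404 otherwise.
    return (
      (await routeAgentRequest(request, env)) ??
      new Response("Not found", { status: 404 })
    );
  },
};
```
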
Minimal Agent Skeleton
```typescript
import { Agent, type Connection } from "agents";

export class MyAgent extends Agent<Env> {
  async onStart() {
    await this.sql`CREATE TABLE IF NOT EXISTS notes (body TEXT, created TEXT DEFAULT (datetime('now')))`;
    // schedule() takes a cron expression (or delay/date) and the name of the method to invoke.
    await this.schedule("0 9 * * *", "morningReport");
  }

  async onConnect(conn: Connection) {
    const [row] = await this.sql<{ n: number }>`SELECT count(*) AS n FROM notes`;
    conn.send(`Hello! Notes in memory: ${row.n}`);
  }

  async morningReport() {
    // The embedded database is SQLite, so use SQLite date functions rather than Postgres intervals.
    const notes = await this.sql`SELECT * FROM notes WHERE created > datetime('now', '-1 day')`;
    // send digest …
  }
}
```
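
Multi-Agent Composition

The multi-agent composability noted under Key Features uses the standard Durable Object namespace API: one agent derives another's ID from a name and calls fetch() on the resulting stub. A rough sketch; the SpecialistAgent binding, the instance name, and the /task route are illustrative, and the agents package also ships helpers (e.g. getAgentByName) that wrap this lookup:

```typescript
import { Agent } from "agents";

// Illustrative environment: SpecialistAgent is bound as a Durable Object
// namespace in this project's wrangler configuration.
interface TriageEnv {
  SpecialistAgent: DurableObjectNamespace;
}

export class TriageAgent extends Agent<TriageEnv> {
  async onRequest(request: Request): Promise<Response> {
    // Address a specific specialist instance by name and forward the work.
    const id = this.env.SpecialistAgent.idFromName("billing");
    const stub = this.env.SpecialistAgent.get(id);
    return stub.fetch("https://agent.internal/task", {
      method: "POST",
      body: await request.text(),
    });
  }
}
```
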
Use Cases
- Real-time chatbots with persistent memory & sub-100 ms latency
- Cron-based background workers (e.g., nightly ETL)
- Multi-agent systems (triage → specialist agents)
- Human-in-the-loop approvals via Knock or custom UI
- Serve MCP tool servers to Claude Desktop or other MCP clients
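
For the MCP use case above, the McpAgent class pairs with an McpServer from the MCP TypeScript SDK; tools registered in init() become callable from Claude Desktop or any other MCP client. A rough sketch, assuming the agents/mcp module path plus the @modelcontextprotocol/sdk and zod packages; the tool name and schema are illustrative:

```typescript
import { McpAgent } from "agents/mcp";
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

export class NotesMcp extends McpAgent {
  server = new McpServer({ name: "notes", version: "1.0.0" });

  async init() {
    // Register a tool that MCP clients can discover and call.
    this.server.tool(
      "add_note",
      { body: z.string().describe("Text of the note to store") },
      async ({ body }) => ({
        content: [{ type: "text", text: `Saved note: ${body}` }],
      })
    );
  }
}
```

The class is then exposed from the Worker entry point; recent SDK versions provide static helpers on McpAgent for mounting an SSE or streamable-HTTP endpoint.
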
Pros & Cons
Advantages
- Persistent state out-of-the-box (Durable Objects)
- Global edge deployment (< 50 ms for 95% of users)
- Built-in scheduler & WebSockets – zero infra glue
- Starter template + React hooks reduce boilerplate
Disadvantages
- Tied to Cloudflare platform (vendor lock-in)
- Cold-start cost for DOs (≈ 5–15 ms)
- SQL dialect differences vs. Postgres
- Pricing scales with DO requests & storage
Future Outlook & Integrations
- WebRTC Audio/Video [In Development]: Branch webrtc-media targeting August 2025
- Email Integration [Available]: Inbound/outbound email via the Workers MailChannels binding
- Deeper OpenAI Synergy [Planned]: Roadmap issue #78 'OpenAI Agents SDK native adapter'
- Vectorize DB Integration [Available]: Already usable today; upcoming helper for embedding pipelines
- Jetstream Scheduler [Private Alpha]: Support for long-running (> 24 h) tasks, currently in private alpha