AI Agents · 4 min read

Cloudflare Ships Dynamic Workers for AI Code Execution

Cloudflare shipped Dynamic Workers, an isolate-based sandbox that starts in milliseconds and uses a fraction of container memory, now in open beta.

Cloudflare has released Dynamic Workers, an isolate-based sandboxing system for executing AI-generated code, now in open beta for all paid Workers users. Dynamic Workers start in milliseconds and use only a few megabytes of memory, making them roughly 100x faster to boot and 10x-100x more memory-efficient than Linux containers.

The core idea: a Cloudflare Worker can spin up another Worker at runtime, with code specified on the fly, in its own isolated sandbox. The sandbox runs on V8 isolates, the same engine behind Chrome and the entire Workers platform for the past eight years.

How It Works

Dynamic Workers support two loading modes. load(code) creates a fresh sandbox for one-time execution. get(id, callback) caches a sandbox by ID so it stays warm across requests.
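The difference between the two modes can be sketched in plain JavaScript. This is a simulation of the caching semantics only, not the real Worker Loader API: `createLoader` and `makeSandbox` are invented names, and the "sandbox" here is just a counter object.

```javascript
// Minimal simulation of the two loading modes described above.
// `makeSandbox` stands in for actual isolate creation (hypothetical).
function createLoader(makeSandbox) {
  const cache = new Map();
  return {
    // load(): always a fresh sandbox, suited to one-shot execution
    load(code) {
      return makeSandbox(code);
    },
    // get(): reuse a warm sandbox keyed by id; the callback supplies
    // the code only when the sandbox is not already cached
    get(id, callback) {
      if (!cache.has(id)) cache.set(id, makeSandbox(callback()));
      return cache.get(id);
    },
  };
}

// Demo with a trivial "sandbox" that just counts instantiations
let made = 0;
const loader = createLoader((code) => ({ id: ++made, code }));

const a = loader.get("agent-1", () => "return 42");
const b = loader.get("agent-1", () => "return 42"); // cache hit, no new sandbox
const c = loader.load("return 42");                 // always a fresh sandbox

console.log(a === b, made); // true 2
```

Note that `get()` only invokes the code-supplying callback on a cache miss, which is what lets a warm sandbox be reused across requests without re-shipping the code.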

A parent Worker provides the code, chooses which bindings the Dynamic Worker can access, and controls network access through a globalOutbound option. Setting globalOutbound to null blocks all outbound requests. You can also intercept outbound HTTP to inject credentials, rewrite requests, or enforce access policies without the agent code ever seeing secrets.
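A credential-injecting interceptor of the kind described could look roughly like the following. This is a hedged sketch, not Cloudflare's API: the handler shape, `makeOutboundGate`, and the host allowlist are illustrative, standing in for whatever the parent Worker passes as `globalOutbound`.

```javascript
// Sketch: an outbound gate the parent Worker could supply so sandboxed
// agent code never sees the real credentials. Names are illustrative.
const ALLOWED_HOSTS = new Set(["api.example.com"]);

function makeOutboundGate(apiToken) {
  return async function outbound(request) {
    const url = new URL(request.url);
    // Enforce the access policy: only approved hosts get out
    if (!ALLOWED_HOSTS.has(url.hostname)) {
      return new Response("Forbidden by outbound policy", { status: 403 });
    }
    // Inject credentials the agent code never had access to
    const authed = new Request(request, {
      headers: {
        ...Object.fromEntries(request.headers),
        Authorization: `Bearer ${apiToken}`,
      },
    });
    return fetch(authed);
  };
}
```

The agent's `fetch` calls route through this function, so secrets live only in the parent Worker's scope; a blocked host fails locally with a 403 and never touches the network.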

let worker = env.LOADER.load({
  compatibilityDate: "2026-03-01",
  mainModule: "agent.js",
  modules: { "agent.js": agentCode },
  env: { CHAT_ROOM: chatRoomRpcStub },
  globalOutbound: null,
});

await worker.getEntrypoint().myAgent(param);

Here `env.LOADER` is the Worker Loader binding, `agentCode` is the untrusted source to run, and passing an RPC stub as `CHAT_ROOM` is what gives the sandbox its only capability: everything else, including network access, is closed off by `globalOutbound: null`.

The sandbox runs on the same machine and often the same thread as the parent Worker, so there is no cross-network latency. Dynamic Workers are available in every Cloudflare location globally, with no limits on concurrent sandboxes or creation rate.

Code Mode and the Cloudflare MCP Server

Alongside Dynamic Workers, Cloudflare released a unified MCP server that exposes the entire Cloudflare API (2,500+ endpoints) through just two tools: search() and execute(), consuming around 1,000 tokens total.

This is built on Code Mode, a pattern where agents write code against typed APIs instead of making sequential tool calls. The agent writes JavaScript that chains multiple API calls together, runs it in a Dynamic Worker, and returns only the final result. Cloudflare reports this cuts token usage by up to 81% in general use and 99.9% for their full API surface compared to exposing each endpoint as a separate MCP tool.
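The token savings come from where intermediate data flows. The sketch below mocks the idea with invented API names (`listZones`, `listDnsRecords` are not the real bindings): the agent-written script chains calls inside the sandbox, so only the small final value would go back through the model rather than every intermediate payload.

```javascript
// Illustrative Code Mode sketch with a mocked API surface.
// `api` stands in for the typed bindings a Dynamic Worker would receive.
const api = {
  listZones: async () => [{ id: "z1", name: "example.com" }],
  listDnsRecords: async (zoneId) => [
    { zoneId, type: "A", name: "example.com", content: "203.0.113.7" },
  ],
};

// Agent-written code: chains two calls inside the sandbox and returns
// only the final result, instead of round-tripping each response
// through the model as a separate tool-call result.
async function agentScript(api) {
  const zones = await api.listZones();
  const records = await api.listDnsRecords(zones[0].id);
  return records.filter((r) => r.type === "A").map((r) => r.content);
}

agentScript(api).then((out) => console.log(out)); // [ '203.0.113.7' ]
```

With sequential tool calls, both full API responses would be serialized into the model's context; here they stay inside the sandbox.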

The MCP server uses OAuth 2.1 via Workers OAuth Provider, scoping tokens to the permissions the user explicitly grants.

Helper Libraries

Cloudflare released three companion libraries:

- @cloudflare/codemode: Wraps MCP servers into Code Mode with a DynamicWorkerExecutor, or builds an MCP server from an OpenAPI spec with built-in search() and execute()
- @cloudflare/worker-bundler: Bundles TypeScript, resolves npm dependencies, and returns module maps for the Worker Loader
- @cloudflare/shell: Gives agents a virtual filesystem inside a Dynamic Worker with search, replace, diff, glob, and transactional batch writes backed by SQLite + R2

The @cloudflare/shell library is notable for agent workflows that need file manipulation. It provides typed methods on a state object (read, write, search, replace, diff, glob, JSON query/update, archive) with structured inputs and outputs rather than string parsing. Batch writes are transactional by default: if any write fails, earlier writes roll back.
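The rollback behavior can be modeled in a few lines. This is not the @cloudflare/shell API, just a sketch of the transactional semantics described above, with an in-memory Map standing in for the SQLite/R2-backed store.

```javascript
// Model of transactional batch writes: apply all, or roll back all.
function batchWrite(fs, writes) {
  const backup = new Map(fs); // snapshot for rollback
  try {
    for (const { path, content } of writes) {
      if (content === undefined) throw new Error(`bad write: ${path}`);
      fs.set(path, content);
    }
  } catch (err) {
    fs.clear();
    for (const [k, v] of backup) fs.set(k, v); // undo earlier writes
    throw err;
  }
}

const files = new Map([["a.txt", "old"]]);
try {
  batchWrite(files, [
    { path: "a.txt", content: "new" },
    { path: "b.txt", content: undefined }, // fails -> whole batch rolls back
  ]);
} catch {}

console.log(files.get("a.txt"), files.has("b.txt")); // old false
```

The failed second write undoes the first, so `a.txt` keeps its original content and `b.txt` never appears.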

Why Isolates Over Containers

The tradeoff is language support. Dynamic Workers run JavaScript (and Python/WebAssembly), not arbitrary container workloads. But for AI-generated code, this is the intended use case. LLMs are strong JavaScript generators, and JavaScript is designed to be sandboxed.

Cloudflare argues the security model is stronger than containers for this purpose. V8 isolates have a narrower attack surface when combined with their defense-in-depth layers: automatic V8 security patches deployed within hours, a custom second-layer sandbox with dynamic tenant cordoning, hardware-level protections (MPK), and novel Spectre defenses developed with TU Graz researchers.

Containers still make sense when you need full OS-level environments. Cloudflare offers both: their container runtime for heavier workloads, and Dynamic Workers for lightweight, high-frequency code execution.

Pricing

Dynamic Workers cost $0.002 per unique Worker loaded per day, plus standard Workers CPU time and invocation pricing. The per-load charge is waived during the beta period.

For one-off code execution (the typical AI agent pattern), this means $0.002 per execution. Cloudflare notes this is typically negligible compared to the inference cost of generating the code.
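A quick back-of-envelope calculation using the $0.002 figure from the article (CPU time and invocation charges are billed separately and not modeled here):

```javascript
// Per-load charge at different daily volumes of unique Workers.
const PER_LOAD_USD = 0.002;
const dailyCost = (uniqueWorkersPerDay) => uniqueWorkersPerDay * PER_LOAD_USD;

console.log(dailyCost(1_000).toFixed(2));   // "2.00"   -> $2/day
console.log(dailyCost(100_000).toFixed(2)); // "200.00" -> $200/day
```

So even at a hundred thousand one-off executions a day, the per-load charge stays in the low hundreds of dollars, typically small next to the inference spend that produced the code.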

Getting Started

Dynamic Workers are available now on Workers Paid plans. Add a worker_loaders binding to your wrangler.jsonc and use env.LOADER.load() or env.LOADER.get(). Cloudflare provides a starter template and a playground for testing.
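The binding configuration might look like the fragment below. The `worker_loaders` key comes from the article; the exact shape and the `LOADER` binding name are assumptions chosen to match the code sample earlier in the piece, so check Cloudflare's docs for the authoritative schema.

```jsonc
// wrangler.jsonc (fragment; shape is an assumption)
{
  "worker_loaders": [
    { "binding": "LOADER" }
  ]
}
```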

The unified Cloudflare MCP server is available at https://mcp.cloudflare.com/mcp and supports both user tokens and API tokens for CI/CD.
