Cloudflare Ships Dynamic Workers as AI Code Sandboxes
Cloudflare Dynamic Workers is now in open beta, letting paid Workers users run AI-generated code in isolated sandboxes at runtime.
Cloudflare has moved Dynamic Workers into open beta for paid Workers users, giving AI applications a native way to run generated code inside a separate Worker isolate at runtime. The March 24 launch matters if you build coding agents, tool-using assistants, or browserless execution backends, because it turns code sandboxing into a first-class Workers primitive instead of a container orchestration problem.
Runtime sandboxing inside Workers
Dynamic Workers let a parent Worker create another Worker from code supplied at request time. The core API splits cleanly between one-shot and reusable execution.
`env.LOADER.load(code)` creates a fresh sandbox for each call. `env.LOADER.get(id, callback)` reuses a Dynamic Worker by ID so the same code path can stay warm across requests.
That distinction matters for agent design. If your model emits disposable snippets for data transforms or scraping logic, `load()` maps directly to per-task execution. If your system has stable tools or long-lived mini-apps, `get()` gives you cacheable runtime instances closer to application hosting than ephemeral code eval.
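The split above can be sketched as a parent Worker that routes between the two modes. This is a hedged sketch based on the `load()`/`get()` shapes described here: the return types, the callback shape, and the `LOADER` binding usage are assumptions, and a real Worker module would also `export default parentWorker`.

```javascript
// Sketch of a parent Worker choosing between one-shot and warm execution.
const parentWorker = {
  async fetch(request, env) {
    const { code, toolId } = await request.json();

    if (!toolId) {
      // Disposable snippet: a fresh isolate per call via load().
      const sandbox = await env.LOADER.load(code);
      return sandbox.fetch(request);
    }

    // Stable tool: reuse a Dynamic Worker by ID via get(), so the
    // same code path stays warm across requests.
    const sandbox = await env.LOADER.get(toolId, () => code);
    return sandbox.fetch(request);
  },
};
```

The routing decision (one-off versus reusable) belongs in the parent, not in the generated code, because the parent is the trust boundary.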
Why Cloudflare is positioning this against containers
Cloudflare’s performance claim is explicit. A Dynamic Worker isolate starts in a few milliseconds, uses a few megabytes of memory, and is positioned as roughly 100x faster to start and 10x to 100x more memory efficient than a typical container.
This is a product segmentation move as much as a feature launch. Cloudflare Containers remain the option for full Linux-like environments, arbitrary languages, and filesystem-heavy workloads. Dynamic Workers target short-lived, untrusted, AI-generated code where startup latency dominates the user experience.
| Runtime option | Best fit | Startup profile | Environment |
|---|---|---|---|
| Dynamic Workers | AI-generated code, short-lived tool execution, JS-first paths | Few milliseconds | Workers isolate |
| Containers | Arbitrary runtimes, Linux dependencies, filesystem needs | Heavier lifecycle | Linux-like container |
If you are already comparing hosted code sandboxes for stateful AI agents, this launch sharpens the tradeoff. Use isolates when the task is brief and capability-scoped. Use containers when the code needs an operating system.
Security controls are the real product
The most important part of Dynamic Workers is not the loader API. It is the capability boundary.
Each Dynamic Worker can be configured with `globalOutbound`, which controls network egress. If you leave it unset, the sandbox inherits the parent's network access. If you set it to `null`, both `fetch()` and `connect()` throw. You can also route outbound traffic through a service binding so every request is intercepted by a gateway Worker.
For agent platforms, this is the practical pattern: block the internet, then expose only the tools the sandbox needs through bindings or RPC. Cloudflare also supports Workers RPC (Cap'n Web), so the sandbox can receive typed capability objects instead of raw HTTP endpoints.
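The "block the internet, expose only tools" posture can be sketched as two option shapes plus a gateway-side allowlist check. The option shapes here are assumptions inferred from the behavior described above, not confirmed API signatures, and the `GATEWAY` binding name and hosts are placeholders.

```javascript
// Fully offline sandbox: fetch() and connect() throw inside it.
const lockedDownOptions = {
  globalOutbound: null,
};

// Gated sandbox: outbound traffic flows through a gateway Worker
// via a service binding ("GATEWAY" is a hypothetical binding name).
const gatedOptions = {
  globalOutbound: { service: "GATEWAY" },
};

// Inside the gateway Worker, an allowlist check like this decides
// which hosts the sandbox may reach (the host is a placeholder).
const ALLOWED_HOSTS = new Set(["api.example.com"]);

function isAllowedEgress(urlString) {
  return ALLOWED_HOSTS.has(new URL(urlString).hostname);
}
```

The gateway pattern also gives you a single place to inject credentials, so secrets never enter the sandbox at all.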
This is the same architectural direction showing up across agent systems more broadly. The useful boundary is not “tool access” in the abstract, but narrowly scoped capabilities, which fits closely with how function calling and agent skills are becoming the stable interface for production agents.
Language support and developer workflow
Dynamic Workers support JavaScript, including ES modules and CommonJS, plus Python. There is no build step in the runtime loader itself, so TypeScript must be compiled before loading. Cloudflare points developers to `@cloudflare/worker-bundler` for bundling TypeScript and npm dependencies into something `load()` or `get()` can execute.
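Because the loader takes JavaScript source as a string, an agent pipeline can wrap a model-emitted snippet into a complete ES module before loading it. `wrapSnippet` below is a hypothetical helper, not part of the platform API, and it assumes the snippet is already plain JavaScript (TypeScript would need compiling first).

```javascript
// Wrap a model-emitted statement list into a loadable ES module that
// runs the snippet and returns its result as JSON.
function wrapSnippet(snippet) {
  return [
    "export default {",
    "  async fetch(request) {",
    "    const run = async () => {",
    `      ${snippet}`,
    "    };",
    "    const result = await run();",
    "    return new Response(JSON.stringify(result), {",
    '      headers: { "content-type": "application/json" },',
    "    });",
    "  },",
    "};",
  ].join("\n");
}
```

A real pipeline would also validate or lint the snippet before loading, since the string is executed as code.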
Cloudflare is also steering AI-generated snippets toward JavaScript. That is the path with the lowest startup and execution overhead in this model.
If your coding agent currently emits Python by default, this launch gives you a reason to reconsider the target language for short sandboxed tasks. Runtime ergonomics now depend on platform startup behavior, not just model familiarity.
Pricing and beta availability
Dynamic Workers are available on the Workers Paid plan. At launch, Cloudflare said the $0.002 charge per unique Worker loaded per day is waived during beta, with normal invocation and CPU charges still applying.
The standing pricing model now includes 1,000 unique Dynamic Workers per month, 10 million requests per month, and 30 million CPU ms per month, then overages of $0.30 per million requests, $0.02 per million CPU ms, and $0.002 per Dynamic Worker per day.
| Metric | Included | Overage |
|---|---|---|
| Unique Dynamic Workers | 1,000 / month | $0.002 per worker per day |
| Requests | 10M / month | $0.30 / million |
| CPU time | 30M ms / month | $0.02 / million ms |
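The quoted numbers are enough for a rough monthly estimator. This is a simplified sketch using the rates in the table above; it assumes each unique worker beyond the included 1,000 is billed for every day of the month, and real proration may differ.

```javascript
// Estimate monthly overage cost from the quoted beta pricing.
function estimateMonthlyCost({ uniqueWorkers, requests, cpuMs, days = 30 }) {
  // $0.002 per unique worker per day beyond 1,000 included.
  const workerOverage = Math.max(0, uniqueWorkers - 1_000) * 0.002 * days;
  // $0.30 per million requests beyond 10M included.
  const requestOverage = (Math.max(0, requests - 10_000_000) / 1_000_000) * 0.3;
  // $0.02 per million CPU ms beyond 30M included.
  const cpuOverage = (Math.max(0, cpuMs - 30_000_000) / 1_000_000) * 0.02;
  return workerOverage + requestOverage + cpuOverage;
}
```

Run the numbers for an agent that churns out many one-off code variants and the unique-worker term dominates, which is exactly the point made below.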
For agent workloads, the billing unit to watch is not only requests. It is unique worker creation. If your system generates many one-off code variants, sandbox identity becomes part of cost control, much like token and tool budgets in agent evaluation.
Observability and agent integration
Dynamic Workers support console capture, exceptions, and request metadata, with Tail Workers collecting logs asynchronously after execution. Cloudflare also shipped this alongside a broader agent-runtime push. `@cloudflare/codemode` v0.1.0 already includes `DynamicWorkerExecutor`, which defaults to `globalOutbound: null`, captures console output, and uses a default 30-second timeout.
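The bounded-runtime part of that stack is a generic pattern you can approximate in plain JavaScript with `Promise.race`. This is a hedged sketch of the idea behind the 30-second default, not the codemode implementation itself.

```javascript
// Race a sandbox execution against a deadline, rejecting on timeout.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(() => reject(new Error(`timed out after ${ms}ms`)), ms);
  });
  // Clear the timer either way so no handle is left dangling.
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}
```

A wall-clock timeout like this caps latency but not CPU; the platform's own CPU limits still apply underneath.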
That combination is important. Cloudflare is not only offering a lower-level sandbox primitive. It is threading the primitive into an agent execution stack where generated code can run with default isolation, bounded runtime, and post-run telemetry.
If you are building code-executing agents on Workers, the immediate design change is straightforward: move generated code out of the host Worker, default network access to off, expose tools through bindings, and decide early whether your workload needs `load()` for one-off execution or `get()` for warm reuse.