IT Brief Canada - Technology news for CIOs & IT decision-makers

Cloudflare expands Agent Cloud for long-running AI agents

Mon, 13th Apr 2026

Cloudflare has expanded its Agent Cloud with new tools for building and running AI agents, broadening its push to provide infrastructure for long-running autonomous software.

The update adds compute, storage, sandboxing and development tools intended to move AI agents from early experiments into broader operational use. It targets developers building software that can carry out multi-step tasks over longer periods, rather than responding only to single prompts.

The announcement reflects a broader shift in the AI market from chatbots toward agents that can write code, interpret context and take actions across systems. That shift is raising new questions about where these systems run, how they store state over time and how developers manage costs as usage grows.

Compute model

One of the main additions is Dynamic Workers, an isolate-based runtime for running AI-generated code in a sandboxed environment. It is intended for shorter tasks such as calling an API, transforming data or linking several tool calls together before shutting down.
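That short-lived lifecycle can be pictured with a small sketch. Cloudflare has not published the Dynamic Workers API in this article, so the names below (the tool registry and `runShortTask`) are illustrative stand-ins for the call-an-API, transform-data, chain-tool-calls pattern described above:

```typescript
// Illustrative sketch only; not Cloudflare's actual Dynamic Workers API.
type Tool = (input: string) => string;

// Hypothetical tool registry that agent-generated code might call into.
const tools: Record<string, Tool> = {
  // Stand-in for "calling an API": returns a record as JSON.
  fetchRecord: (id) => JSON.stringify({ id, status: "active" }),
  // Stand-in for "transforming data": extracts and normalises one field.
  extractStatus: (json) => JSON.parse(json).status.toUpperCase(),
};

// A short-lived task chains a few tool calls, returns a result, and exits,
// rather than holding a long-lived container or VM.
function runShortTask(recordId: string): string {
  const record = tools.fetchRecord(recordId);
  return tools.extractStatus(record);
}

console.log(runShortTask("agent-42")); // "ACTIVE"
```

The point of the isolate model is that each such task spins up, runs to completion, and releases its resources, rather than occupying a container between invocations.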

Cloudflare positions this model as an alternative to running each agent inside its own container or on an always-on virtual server. Those approaches can become costly when developers want large numbers of personal or workplace agents running at the same time.

Cloudflare is also adding Artifacts, a Git-compatible storage system for code and files created by agents. Developers will be able to create large numbers of repositories, fork from remote sources and make stored code and data available through standard Git clients.

Sandboxes are also now generally available. These persistent Linux environments include a shell, file system and background processes, and are designed for more involved jobs where an agent needs a fuller operating environment to clone repositories, install packages, run builds and continue working across sessions.

Longer tasks

Another addition is Think, a framework within the Agents SDK designed to support persistence for longer-running work. It is intended to help developers build agents that can handle multi-step tasks over time rather than short, one-off interactions.

That focus on persistence addresses a practical constraint in current AI systems. Many agents can generate responses or carry out limited actions, but often struggle to retain state, resume work and operate reliably across different tools and extended workflows.
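The checkpoint-and-resume pattern that such a framework targets can be sketched in a few lines. The names here (`AgentState`, `checkpoint`, `resume`) are assumptions for illustration, not the Think API, and the in-memory map stands in for whatever durable storage the platform provides:

```typescript
// Hypothetical sketch of agent state persistence; not Cloudflare's Think API.
interface AgentState {
  step: number;
  completed: string[];
}

// Stand-in for durable storage (on a real platform this would not be
// an in-memory map, which is lost when the process ends).
const store = new Map<string, string>();

function checkpoint(taskId: string, state: AgentState): void {
  store.set(taskId, JSON.stringify(state));
}

function resume(taskId: string): AgentState {
  const saved = store.get(taskId);
  return saved ? JSON.parse(saved) : { step: 0, completed: [] };
}

// Each invocation resumes from the last checkpoint, does one unit of
// work, and checkpoints again - so the task survives interruption.
function runStep(taskId: string, steps: string[]): AgentState {
  const state = resume(taskId);
  if (state.step < steps.length) {
    state.completed.push(steps[state.step]);
    state.step += 1;
    checkpoint(taskId, state);
  }
  return state;
}

const plan = ["clone repo", "run tests", "open PR"];
runStep("task-1", plan);                // first session
const after = runStep("task-1", plan);  // a later session resumes at step 2
console.log(after.step);                // 2
```

Whatever the eventual API looks like, this is the constraint being addressed: work survives across sessions because progress lives in storage, not in the process.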

Cloudflare is also expanding its model catalogue following its acquisition of Replicate, giving developers access to both proprietary and open-source models through one platform. The update includes models from OpenAI, including GPT-5.4, alongside open-source options, all accessible through a single interface.

The aim is to let customers switch providers with minimal code changes instead of committing to a single model vendor. Cloudflare said developers can move between models by changing a single line of code, avoiding the need to manage multiple vendors as model performance continues to evolve.
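The single-line switch Cloudflare describes implies that the model identifier is the only provider-specific piece of a call. A minimal sketch of that idea, with an invented client shape and model IDs (the article names GPT-5.4; everything else here is an assumption):

```typescript
// Illustrative only; the client interface and model IDs are not the
// actual Cloudflare/Replicate API.

// Switching providers means editing only this one line:
const MODEL = "openai/gpt-5.4"; // or an open-source model ID

interface ChatClient {
  complete(model: string, prompt: string): string;
}

// Stand-in client; in practice this would call the platform's
// inference endpoint with the chosen model identifier.
const client: ChatClient = {
  complete: (model, prompt) => `[${model}] echo: ${prompt}`,
};

console.log(client.complete(MODEL, "hello"));
```

Because the rest of the call site never mentions a vendor, swapping the constant is the whole migration, which is the portability claim being made.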

"Cloud agents are quickly becoming a foundational building block for how work gets done, and with Cloudflare, we're making it dramatically easier for developers to deploy production-ready agents powered by GPT-5.4 and Codex to run real enterprise workloads at scale," said Rohan Varma, Product, Codex, at OpenAI.

The commercial logic behind the move is clear. As businesses test AI agents for software engineering, operations and internal automation, infrastructure providers are competing to become the place where these systems are built, run and monitored.

Cloudflare has been arguing that its existing Workers platform gives it a head start in that race. The company says the architecture it has built over several years is suited to large numbers of lightweight, distributed workloads, which it now wants to apply to AI agent execution.

"The way people build software is fundamentally changing. We are entering a world where agents are the ones writing and executing code," said Matthew Prince, Co-founder and Chief Executive Officer of Cloudflare.

He added that the company sees a need for infrastructure that can support security, scale and persistence for those systems. "But agents need a home that is secure by default, scales to millions instantly, and persists across long-running tasks. We've spent nine years building the foundation for this with Cloudflare Workers. Today, we are making Cloudflare the definitive platform for the agentic web," Prince said.

Cloudflare's pitch comes as cloud providers, model companies and developer platforms all try to define the software stack for the next stage of AI adoption. The contest is no longer only about access to models, but also about runtime environments, storage, orchestration and the economics of operating autonomous systems at scale.

For developers, the practical question is whether these tools reduce the complexity of running agents in production while keeping costs predictable. For Cloudflare, the opportunity is to turn that need into a larger role in the AI software infrastructure layer, with Sandboxes now generally available and Dynamic Workers, Artifacts, Think and an expanded model catalogue at the core of that effort.