The artificial intelligence (AI) storm triggered by the launch of OpenAI's ChatGPT caught plenty of companies off guard. Economic moats went from being rock solid to questionable. Even Alphabet, dominant in search for decades, found itself facing the prospect of real competition.

Cloudflare (NET 1.44%) was not caught off guard. The cloud company, which specializes in content delivery and security but has been expanding its platform over the years to include serverless computing and other services, has taken its time rolling out AI products. Workers AI, which allows customers to run AI models on Cloudflare's network, is still in open beta and not quite ready for prime time. But the foundation for Cloudflare's AI products started being laid six years ago, an act of foresight and a demonstration of the optionality built into the company's global network.

Leaving space for AI

Cloudflare operates data centers in 310 cities spanning more than 120 countries around the world. This fleet of data centers has expanded over the years as the company brought more internet users closer to its network. Every Cloudflare service runs in every data center, allowing the company to deliver a consistent experience everywhere.

Six years ago, Cloudflare began purposely leaving at least one expansion slot empty in every server it installed in its data centers. As CEO Matthew Prince explained in the third-quarter earnings call, the company expected that it would eventually make sense to fill those slots with graphics processing units, which are widely used to accelerate AI workloads. Cloudflare knew it would someday offer AI services, so it left the door open to simply add that capability to its existing data center footprint down the road.

That day has now come. Demand for AI cloud services is exploding, and Cloudflare is in the process of installing GPUs into those empty slots. The company had GPUs running in 75 cities at the end of October, and it plans to hit 100 cities by the end of 2023. By the end of 2024, essentially all of Cloudflare's locations will feature GPUs optimized for AI workloads.

There are two big benefits to Cloudflare's decision six years ago to leave space for GPUs. First, it allows the company to rapidly roll out GPUs across its network. In the first five weeks of offering AI services, customers used them to process 18 million requests. The long-term potential is enormous: Cloudflare has customers interested in bringing hundreds of billions of AI tasks per month to its platform.

Second, the ability to install GPUs in existing servers has allowed Cloudflare to roll out its AI services without needing to dramatically ramp up capital spending. The company doesn't need to build new servers or new data centers. Instead, it just needs to buy GPUs and pop them in. "Right now, there are members of the Cloudflare team traveling the world with suitcases full of GPUs, installing them throughout our network," said Prince during the earnings call.

Focusing on inference

Cloudflare is optimizing its infrastructure to handle AI inference workloads, which involve running an AI model that has already been trained. The company is betting that many inference tasks will be too heavy to run on end-user devices. These heavier workloads, which require GPUs but still need to be completed quickly, are the sweet spot for Cloudflare.
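To make the training/inference distinction concrete, here's a toy sketch in plain NumPy (no Cloudflare APIs involved; the linear model and data are illustrative assumptions): training fits a model's parameters once, a compute-heavy job, while inference just applies those fixed parameters to each new request, which is the lighter, latency-sensitive work Cloudflare is targeting.

```python
import numpy as np

# "Training": fit parameters from data. Compute-heavy, done once, offline.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 1.0          # ground-truth relationship in this toy example
w, b = np.polyfit(x, y, 1)  # recovers w ≈ 3.0, b ≈ 1.0

# "Inference": apply the already-trained parameters to a new input.
# Cheap relative to training, but run once per request and latency-sensitive.
def predict(x_new: float) -> float:
    return w * x_new + b

print(predict(2.0))  # ≈ 7.0
```

The asymmetry here is the point: training touched 100 data points to produce two numbers, while each inference call is a single multiply-add, which is why older, cheaper GPUs can still serve inference well.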

Cloudflare isn't focused on AI training workloads partly because it believes inference will be the larger opportunity in the long run. This focus on inference brings a key benefit: Cloudflare doesn't need to scramble to buy top-of-the-line GPUs at exorbitant prices. The company can instead spend far less on older GPUs and still deliver solid performance. Speaking of Nvidia's latest H100 data center GPU and the older A100 GPU, Prince said during the second-quarter earnings call: "And maybe we don't need the H100. Maybe we can live with A100 or, you know, whatever is, again, a generation or two behind."

By not needing to build out new servers to offer AI services, and by leveraging older, cheaper GPUs, Cloudflare can quickly scale its AI capacity without greatly increasing its capital spending. The company hasn't been as quick as others to roll out AI products, but it's been far more methodical. Cloudflare built optionality into its network years ago, and now it's leveraging it to go after the portion of the AI market where it can gain a competitive advantage.

Cloudflare's AI offerings are still young, but they represent a potentially enormous long-term growth opportunity for the cloud company.