Running the most advanced AI models generally requires powerful AI accelerators installed in cloud data centers; the memory and compute requirements are enormous.

The market for such AI accelerators is soaring, and Intel (INTC) has multiple ways to tap into that growth. Its specialized Gaudi AI chips are garnering plenty of interest, and in some workloads they can outperform Nvidia's market-leading GPUs. Intel also sells data center GPUs that can accelerate a wider variety of workloads.

But Intel CEO Pat Gelsinger doesn't believe that AI will remain in the data center. The company is putting AI into everything, and the long-term AI opportunity for Intel extends far beyond AI accelerators.

Pervasive AI

Beyond dedicated AI accelerators, Intel is going after data center AI workloads through its server CPU business. Intel's Sapphire Rapids chips, which launched earlier this year after multiple delays, include built-in AI acceleration (Intel's Advanced Matrix Extensions, or AMX) aimed at inference tasks. While powerful dedicated AI chips are needed to train AI models, Sapphire Rapids provides a way to run some of those trained models efficiently on a data center CPU. Intel estimates that as much as one-third of Sapphire Rapids sales are being driven by AI use cases.
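To make that concrete, here is a minimal sketch of what CPU-side inference acceleration looks like in practice, assuming PyTorch and Intel's Extension for PyTorch are installed; the toy model is illustrative, not a real production workload:

```python
# Sketch: accelerating CPU inference via Intel Extension for PyTorch.
# On Sapphire Rapids, bfloat16 matrix multiplies can be dispatched to
# the AMX tile units. Assumes `torch` and `intel_extension_for_pytorch`
# are installed; the model below is a stand-in for a real one.
import torch
import intel_extension_for_pytorch as ipex

# Any eval-mode model works; a small classifier stands in for a real workload.
model = torch.nn.Sequential(
    torch.nn.Linear(1024, 4096),
    torch.nn.ReLU(),
    torch.nn.Linear(4096, 10),
).eval()

# ipex.optimize applies CPU-specific optimizations; dtype=bfloat16
# enables the low-precision path that AMX accelerates.
model = ipex.optimize(model, dtype=torch.bfloat16)

inputs = torch.randn(8, 1024)
with torch.no_grad(), torch.cpu.amp.autocast(dtype=torch.bfloat16):
    outputs = model(inputs)
print(outputs.shape)  # torch.Size([8, 10])
```

The point of the sketch is that no discrete accelerator appears anywhere: the same server CPU that runs the rest of the application can run the model, which is exactly the pitch Intel is making to buyers who don't want to pay for dedicated AI chips.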

Intel is competing for data center AI workloads, but the company is also betting that AI will move closer to end users. In an interview at Deutsche Bank's 2023 Technology Conference, Gelsinger pointed out some downsides of relying on the cloud for AI processing:

Are you going to do real-time human tracking in stores and manufacturing and supply chain locations with AI? Could you run that in the cloud? Absolutely. Are you going to? Absolutely not. The laws of economics, the laws of physics, the laws of privacy will force it to the edge.

Another example Gelsinger gave was real-time language translation in video calls. Running the translation model in a cloud data center would introduce enough latency to make for a bad user experience, and someone would have to pay for the cloud resources to run it. It would be better and cheaper to do that AI work right on each user's device.

Intel is set to launch its Meteor Lake PC CPUs this year, and AI is a focal point. Meteor Lake will feature a dedicated neural processing unit (NPU) capable of accelerating the types of AI workloads that make sense to run directly on a user's PC. You won't be able to run the largest language models on your laptop, but plenty of lighter-duty AI workloads can be offloaded from the CPU cores and sped up considerably.

Intel's AI hardware is only useful if software takes advantage of it, but software providers have a strong incentive to leverage Meteor Lake's AI capabilities. A video conferencing provider, for example, can save money by using a customer's device to power AI features rather than relying on pricey AI accelerators in the cloud.
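As a rough illustration of how an application might route such a feature to on-device hardware, here is a sketch using Intel's OpenVINO runtime, which exposes the NPU as a selectable device; the model file name is hypothetical:

```python
# Sketch: route an AI feature (e.g., call noise suppression) to the
# on-device NPU when the platform exposes one, falling back to the CPU.
# Assumes the `openvino` package is installed; the model path below
# ("noise_suppression.xml") is a hypothetical placeholder.
import openvino as ov

core = ov.Core()

# Prefer the NPU if present (as on Meteor Lake), otherwise use the CPU.
device = "NPU" if "NPU" in core.available_devices else "CPU"

model = core.read_model("noise_suppression.xml")  # hypothetical model file
compiled = core.compile_model(model, device_name=device)
# request = compiled.create_infer_request()  # then run per audio frame
```

The fallback logic is the business story in miniature: the software vendor ships one code path, and wherever Intel's AI hardware is present, the cloud bill for that feature drops to zero.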

This is also a numbers game. The installed base of desktops and laptops in the United States alone is around 300 million, and Intel remains the dominant CPU supplier. As Intel brings AI hardware to the PC with Meteor Lake and future chip families, the sheer volume of devices will push developers and software providers to take advantage of the new hardware.

While rival AMD offers dedicated AI hardware in some of its latest laptop chips, Intel's share of the x86 CPU market, excluding game consoles and IoT, currently tops 80%. For software providers, supporting Intel's AI hardware will make the most sense.

AI is a multifaceted opportunity

While Nvidia is benefiting the most from the surge in demand for AI accelerators, Intel has more ways to tap into AI demand in the long run. The company sells its own AI accelerators and data center CPUs with AI capabilities, it will soon ship PC CPUs with dedicated AI hardware, and its foundry business could be manufacturing AI chips for other companies within a few years.

There will be many winners in AI, and Intel looks destined to be one of them.