All eyes have been on Nvidia this year as the GPU company has been providing the raw computing power for the artificial intelligence (AI) revolution. Its ultra-powerful chips have become the gold standard for training and running the most advanced AI models. The company reportedly sold half a million AI GPUs in the third quarter, mostly to hyperscale cloud companies racing to boost their AI capacity.

Nvidia isn't the only AI stock that has rewarded investors this year. Intel (INTC), best known for its PC and server CPUs, is tapping into the booming AI market in numerous ways. The company hasn't yet recorded blockbuster growth from the AI market like Nvidia has, but it's in a great position to be a major player in all things AI in the years ahead.

Investors are starting to see that potential -- they've sent Intel stock up more than 80% so far in 2023. That's the best performance since 2003 for the iconic chip stock.

AI everywhere

Intel is betting that enabling AI workloads to run anywhere, from powerful cloud servers to laptops, will be its path to victory in the AI market.

The company already competes in the AI accelerator market. While Nvidia's A100 and H100 GPUs are by far the most widely used accelerators, Intel has drummed up plenty of interest in its Gaudi2 AI chip. Intel acquired Habana Labs, the company behind Gaudi, in 2019, and it has had the good sense to keep the product line around as it killed off non-core businesses left and right. Intel's sales pipeline for AI accelerators, which should eventually translate into revenue, stood at nearly $2 billion at the end of the third quarter.

Gaudi2 can't match Nvidia's chips in performance, but it does offer a solid mix of performance and efficiency. Gaudi3, the follow-up to Gaudi2, is officially due sometime in 2024 and should deliver market share gains for Intel. Following Gaudi3, Intel plans to merge its Gaudi and AI GPU lineups.

Intel's powerful AI accelerators are a promising line of business, but the company is betting that AI inference (the act of running an already-trained AI model) will be the biggest opportunity. "Fundamentally, the inference market is where the game will be at," CEO Pat Gelsinger said at a recent event. 

Intel is now baking AI hardware into its CPUs. In the server market, its Sapphire Rapids chips and now its Emerald Rapids chips include built-in AI accelerators. While the largest AI models still require powerful GPUs or other stand-alone accelerators, smaller models can run capably on Intel's CPUs alone.

Its PC chips are also getting in on the action. Meteor Lake, the latest generation of Intel's PC CPUs, which launched this month, includes a dedicated neural processing unit (NPU) that can take on inference workloads like blurring the background in a video call or powering advanced features in creative software. By offloading that work from the CPU and GPU, Meteor Lake can run AI workloads without bogging down the system or draining the battery.

Outside of making its own AI chips, Intel's foundry business is positioned to manufacture AI chips for third parties. The company remains on track to bring its advanced Intel 18A manufacturing process online by the end of 2024. If Intel can pull it off and surpass foundry leader TSMC in terms of process technology, AI chips ranging from powerful accelerators to smartphone SoCs with built-in AI smarts could be rolling off Intel's production lines in 2025 and beyond.

Many ways to win

For investors looking to profit from the AI revolution, Intel has more ways to win than Nvidia, AMD, or any other AI chip company. Nvidia will be tough to beat in the AI accelerator market, and AMD is making plenty of noise with its newest AI chips. But a few years from now, both companies could be seriously considering using Intel's foundries to manufacture their chips if Intel takes the technological lead over TSMC.

Built-in AI hardware could also trigger a PC refresh cycle or shift some data center spending back to CPUs capable of running AI workloads. Regardless of how the AI market evolves, Intel will enjoy an AI tailwind for years to come.