While Nvidia has become synonymous with artificial intelligence (AI) chips, both Intel (INTC) and Advanced Micro Devices (AMD) are potent competitors that will challenge the market leader in the coming years. These stocks have produced incredible returns since they went public decades ago, and they could continue to deliver for investors as the market for AI accelerators explodes.

Intel

While chip giant Intel has faced significant challenges in the past few years, including a tumbling stock price, the company has still delivered incredible returns to shareholders over its decades as a public company. Intel went public in 1971 at a split-adjusted price of just $0.02 per share. If an investor had been lucky enough to buy $10,000 of Intel stock at that price and hold on for more than 50 years, that stake would be worth about $17.5 million today.
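As a rough sanity check on that figure, here is a minimal back-of-the-envelope calculation in Python. It assumes the split-adjusted $0.02 price above and a recent Intel share price of roughly $35; the exact result shifts with the stock price, and dividends are ignored.

```python
# Rough check of the Intel return figure (dividends ignored).
initial_investment = 10_000      # dollars put in at the 1971 IPO
split_adjusted_price = 0.02      # split-adjusted price per share
assumed_recent_price = 35.0      # assumed recent price per share

shares = initial_investment / split_adjusted_price    # 500,000 shares
value_today = shares * assumed_recent_price           # about $17.5 million
print(f"{shares:,.0f} shares, worth about ${value_today:,.0f}")
```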

Intel is going after the artificial intelligence market in multiple ways. The company sells AI accelerators that rival those of market leader Nvidia, including data center GPUs and its specialized Gaudi chips. Intel's Gaudi 2 chip is garnering plenty of interest from potential customers, with the company's AI accelerator pipeline expanding sixfold in the second quarter to top $1 billion. While Gaudi 2 can't match Nvidia's H100 in raw performance, it's an economical option for those struggling to acquire enough Nvidia GPUs.

Intel is also building AI capabilities into both its server and PC chips. Intel's Sapphire Rapids server CPUs, which launched earlier this year, include AI hardware capable of handling inference workloads. While training something like ChatGPT requires powerful GPUs, plenty of AI models can run efficiently on less powerful hardware.

In the PC market, Intel's upcoming Meteor Lake chips will be its first with AI hardware built in. Meteor Lake chips will include a Neural Processing Unit capable of running AI inference tasks directly on the device. Software developers who support the hardware can use it to speed up AI tasks, avoid overloading the CPU or GPU, and save money by eliminating the need to call out to cloud services to process AI workloads.
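For a sense of how developers might target that kind of on-device hardware, here is a hypothetical sketch using Intel's OpenVINO toolkit, which selects hardware back ends by device name. The "NPU" device string and the "model.xml" file are assumptions for illustration only; actual device names and supported models depend on the OpenVINO release and the machine.

```python
import numpy as np
from openvino.runtime import Core  # OpenVINO inference API

core = Core()
# Shows which back ends this machine exposes; an NPU-equipped laptop is
# assumed to report a device such as "NPU" alongside "CPU" and "GPU".
print(core.available_devices)

# Compile a pre-converted OpenVINO IR model for the NPU instead of the
# CPU or GPU ("model.xml" is a hypothetical placeholder path).
compiled = core.compile_model("model.xml", "NPU")
request = compiled.create_infer_request()
result = request.infer({0: np.zeros((1, 3, 224, 224), dtype=np.float32)})
```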

On top of building its own AI-enabled chips, Intel's foundry business could one day manufacture AI accelerators for other companies. By the end of 2024, Intel expects to have regained its manufacturing advantage over Taiwan Semiconductor Manufacturing (TSMC) with its Intel 18A process. The company is also greatly expanding its advanced packaging capacity, which is critical for the most advanced chips. Any of the tech giants working on their own AI chips will have to seriously consider Intel for manufacturing in 2025 and beyond.

While Intel stock has been stumbling, the company is in a great position to benefit from the AI revolution.

Advanced Micro Devices

AMD's fortunes over the years have ebbed and flowed. From about 2011 through 2017, AMD was in dire straits as its CPUs were simply not competitive with Intel's. The Zen architecture, along with plenty of missteps on Intel's part, ushered in a renaissance for the company. At the moment, AMD has a slightly higher market capitalization than Intel.

While the story would have looked different a decade ago when the company was struggling to survive, AMD has generated incredible returns for shareholders since it first sold stock to the public in 1972 at a split-adjusted price of $0.57. If an investor had loaded up with a $10,000 stake at that time and held on tight, their holdings would now be worth approximately $1.8 million.
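The same rough arithmetic checks out for AMD, assuming the split-adjusted $0.57 price above and a recent share price of around $103 (again, the exact figure moves with the stock, and dividends are ignored):

```python
# Rough check of the AMD return figure (dividends ignored).
initial_investment = 10_000      # dollars put in at the 1972 IPO
split_adjusted_price = 0.57      # split-adjusted price per share
assumed_recent_price = 103.0     # assumed recent price per share

shares = initial_investment / split_adjusted_price    # about 17,500 shares
value_today = shares * assumed_recent_price           # about $1.8 million
print(f"{shares:,.0f} shares, worth about ${value_today:,.0f}")
```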

Like Intel, AMD finds itself playing catch-up with Nvidia in the AI accelerator market. The company's upcoming line of MI300 accelerators, particularly the ultra-powerful MI300X aimed at generative AI workloads, has the potential to be in high demand. Nvidia is selling every high-end data center GPU it can, as tech giants and start-ups snap up GPUs to train advanced AI models. AMD should be able to make some headway once these products become widely available.

One of Nvidia's biggest advantages is the vast ecosystem of software around its GPUs. The company launched its CUDA platform back in 2007, and since then, its GPUs have become the de facto standard for accelerating data center workloads. AMD is working on an open alternative called ROCm, which already has support for popular AI frameworks, but it will take time for AMD to chip away at Nvidia's software advantage.
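As a rough illustration of what that framework support looks like in practice, here is a minimal PyTorch sketch. On a ROCm build of PyTorch, AMD GPUs are addressed through the same torch.cuda interface that CUDA code uses, so much existing code can run unchanged; the device availability shown here naturally depends on the machine and the PyTorch build installed.

```python
import torch

# On a ROCm build of PyTorch, AMD GPUs show up through the familiar
# torch.cuda API, so CUDA-oriented code often runs without changes.
print(torch.cuda.is_available())   # True if a supported GPU is present
print(torch.version.hip)           # set on ROCm builds, None on CUDA builds

x = torch.randn(1024, 1024, device="cuda")  # lands on the AMD GPU under ROCm
y = x @ x                                   # matrix multiply runs on the accelerator
```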

While long-term demand for AI accelerators is mostly a guessing game, some estimates put annual growth over the next decade as high as 38%. There will be multiple winners in this potentially massive market, and AMD has a good chance of being one of them.
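To put that estimate in perspective, 38% compound annual growth sustained for a decade implies a market roughly 25 times its current size. The quick calculation below uses the 38% figure cited above purely as an illustration, not a forecast:

```python
# What 38% compound annual growth implies over ten years.
cagr = 0.38                     # annual growth estimate cited above
years = 10
growth_factor = (1 + cagr) ** years
print(f"about {growth_factor:.0f}x the starting market size")  # roughly 25x
```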