Nvidia (NVDA 1.28%) has been the biggest beneficiary of the tremendous demand for artificial intelligence (AI) chips. This is not surprising, as the parallel computing ability of its graphics processing units (GPUs) makes them ideal for AI model training and inference.

It is worth noting that Nvidia has been running away with the data center GPU market, with an estimated market share of 92% at the end of last year. That's miles ahead of second-place Advanced Micro Devices' (AMD -0.45%) share of just 4%. However, it won't be surprising to see AMD taking market share away from Nvidia thanks to its upcoming chips.

Let's see why that may be the case.

[Image: The letters "AI" written on a circuit board. Image source: Getty Images.]

AMD is aggressively closing the gap

AMD recently unveiled its AI chip roadmap at an investor event. The chipmaker used the event to launch its MI350 series of data center GPUs, promising a "4x generation-on-generation AI compute improvement and up to 35x leap in inferencing performance." This new line of chips is manufactured on a 3-nanometer (nm) process node. That gives it an advantage over Nvidia's latest generation of Blackwell processors, which are reportedly manufactured using a 4nm node.

Chips manufactured on a smaller process node are theoretically more powerful and more power-efficient, since they pack more transistors into the same area. Electrical signals travel shorter distances between those transistors, so the chip can perform computing tasks faster while generating less heat. So, AMD's latest generation of AI accelerators could be better than Nvidia's offerings, at least on paper.

AMD says its MI350 series processors have stronger specifications than Nvidia's Blackwell offerings, packing 1.6 times the memory. The company also claims the MI355X processor is 1.2 to 1.3 times faster than Nvidia's Blackwell chips when running inference on models such as DeepSeek R1 and Llama 3.1.

More importantly, AMD is aiming to hurt Nvidia big time with its MI400 series of accelerators, set to launch next year. The company plans to increase high-bandwidth memory (HBM) capacity to 432 gigabytes (GB) on the MI400, up from 288GB on the MI350 series chips. What's more, AMD intends to more than double the memory bandwidth on the MI400 processors, to 19.6 terabytes per second.

AMD says the MI400 will mark a significant leap in AI compute performance, one far greater than the previous generational jump. Meanwhile, Nvidia's next generation of AI GPUs is expected to pack similar memory capacity to the current-generation Blackwell processors, which means AMD could enjoy a spec advantage on paper.

A bigger memory stack allows AMD's chips to hold larger AI models and more data close to the compute, while the higher bandwidth moves that data faster, at least in theory. This potential advantage could spur stronger sales of AMD's AI GPUs, helping the company capture a bigger share of the market. Now, AMD doesn't need to overtake Nvidia to supercharge its growth. It simply needs to capture a double-digit share of the AI GPU market in the next three years.

Assuming AMD can capture 10% of the AI GPU market by 2028, its revenue from this segment could hit $50 billion (based on the projected $500 billion annual revenue that the AI accelerator market is expected to hit by 2028). This could be enough to help it deliver more upside than Nvidia.

Why AMD seems primed for more upside

AMD finished 2024 with just under $26 billion in total revenue. The company says it sold more than $5 billion worth of AI data center GPUs last year, which means the rest of its business contributed roughly $21 billion. Assuming those other segments don't grow at all over the next three years (so they still generate about $21 billion in annual revenue in 2028), but AMD manages to hit $50 billion in AI GPU revenue, its annual top line could reach $71 billion after three years.
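For readers who want to trace the math, here is a quick back-of-the-envelope sketch of that projection, written in Python purely for illustration. The market size, market share, and flat-growth inputs are the assumptions laid out above, not forecasts.

```python
# Back-of-the-envelope sketch of the revenue projection above.
# All inputs are the article's assumptions, not forecasts.

ai_accelerator_market_2028 = 500e9   # projected AI accelerator market size by 2028 ($)
assumed_amd_share = 0.10             # assumed AMD share of that market

ai_gpu_revenue_2028 = ai_accelerator_market_2028 * assumed_amd_share        # $50 billion

total_revenue_2024 = 26e9            # just under $26 billion in total 2024 revenue
ai_gpu_revenue_2024 = 5e9            # more than $5 billion of AI data center GPU sales
other_revenue = total_revenue_2024 - ai_gpu_revenue_2024                    # ~$21 billion

# Assume the rest of the business stays flat through 2028.
projected_revenue_2028 = other_revenue + ai_gpu_revenue_2028                # ~$71 billion

print(f"Projected 2028 AI GPU revenue: ${ai_gpu_revenue_2028 / 1e9:.0f} billion")
print(f"Projected 2028 total revenue:  ${projected_revenue_2028 / 1e9:.0f} billion")
```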

The shares are currently trading at 8.4 times sales, which is almost in line with the U.S. technology sector's average sales multiple of 8.1. Assuming that AMD trades in line with the technology sector's multiple after three years, its market capitalization could hit $575 billion based on the revenue it is expected to generate in 2028. That points toward potential gains of 150% from current levels.
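Continuing the sketch under the same assumptions, the implied valuation math looks like this. The trailing revenue figure used to approximate AMD's current market cap is a rough placeholder consistent with the quoted 8.4 times sales multiple, not a reported number.

```python
# Implied valuation under the sales-multiple assumption above.
# The trailing revenue used for the current market cap is a rough
# placeholder consistent with the quoted 8.4x multiple, not reported data.

projected_revenue_2028 = 71e9        # from the revenue sketch above
sector_ps_multiple = 8.1             # U.S. tech sector average price-to-sales

implied_market_cap_2028 = projected_revenue_2028 * sector_ps_multiple       # ~$575 billion

assumed_trailing_revenue = 27.5e9    # placeholder trailing-12-month revenue (assumption)
current_ps_multiple = 8.4
approx_current_market_cap = assumed_trailing_revenue * current_ps_multiple  # ~$230 billion

upside = implied_market_cap_2028 / approx_current_market_cap - 1
print(f"Implied 2028 market cap: ${implied_market_cap_2028 / 1e9:.0f} billion")
print(f"Potential upside: {upside:.0%}")  # roughly 150%
```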

Nvidia, on the other hand, trades at a far richer price-to-sales ratio of 26. However, with the company's top-line growth rate expected to slow over the next couple of years, it won't be surprising to see that multiple contract. As such, the possibility of AMD delivering more gains than Nvidia thanks to its upcoming chips cannot be ruled out. That's why AMD looks like a solid AI stock to buy right now.