Nvidia's (NASDAQ: NVDA) development of high-end data center GPUs for machine learning and artificial intelligence (AI) workloads has turned it into a hypergrowth stock over the past couple of years. All of the world's leading AI companies -- including OpenAI, the developer of ChatGPT, as well as Microsoft, Alphabet's Google, and Amazon -- use Nvidia GPUs.

Nvidia's first-mover advantage and its lack of meaningful competitors made it one of the easiest ways to profit from the expansion of the AI market. But Advanced Micro Devices (NASDAQ: AMD), Nvidia's biggest competitor in the gaming GPU market, is also turning its gaze toward the AI market. Could AMD eventually dethrone Nvidia as the leader in AI chips?


The rivalry between AMD and Nvidia

AMD entered the GPU market by acquiring ATI for $5.6 billion in 2006. It subsequently expanded ATI's Radeon brand to challenge Nvidia in discrete GPUs for gaming PCs, and it leveraged that technology to create APUs -- which bundle CPUs and GPUs on a single die -- for PCs and gaming consoles.

AMD's main strategy against Nvidia, mirroring its approach to challenging Intel (NASDAQ: INTC) in the CPU market, was to consistently undercut the market leader on price. AMD's GPUs often delivered gaming performance comparable to Nvidia's chips, but they usually consumed more power because of their less efficient designs.

Nvidia still controlled about 80% of the discrete GPU market last year, according to Jon Peddie Research, while AMD remained a distant second with a 17% share. The remaining small share belongs to Intel, which returned to the discrete GPU market in 2022.

Nvidia didn't make the same mistakes as Intel, which ceded a large portion of the PC CPU market to AMD through its own blunders over the past decade. Instead, Nvidia held AMD at bay with cheaper chips that targeted lower-end gamers.

AMD's advantages in the AI market

Unlike Intel, which manufactures most of its chips at its own foundries, AMD and Nvidia are both fabless chipmakers that outsource their production to the contract chipmaker Taiwan Semiconductor Manufacturing (NYSE: TSM). As a result, their chip designs (in terms of size and power efficiency) are ultimately constrained by TSMC's technical limitations.

Those constraints might make it easier for AMD to catch up to Nvidia with comparable GPUs for data centers. AMD launched its first batch of Instinct GPUs (the MI6, MI8, and MI25) in 2017, followed by more 7nm and 6nm MI chips over the subsequent years. AMD rolled out its newest MI300 Instinct chips, which were built on TSMC's 5nm and 6nm nodes, in late 2023.

In the latest industry benchmarks, its top-tier MI300X actually beats Nvidia's H100 -- which is widely used for processing generative AI tasks -- in terms of raw processing power and memory bandwidth.

More importantly, data center operators can buy four MI300 GPUs for the price of a single H100 -- which costs more than $40,000 as Nvidia struggles to deliver enough chips to meet demand.

Nvidia claims the H100 still outperforms the MI300X when it's running on optimized software, but the massive price gap, the narrow performance gap, and the persistent supply chain constraints could drive more of its customers toward AMD.

AMD is well-positioned to carve out its own niche

Tech giants like Microsoft, Meta Platforms, Oracle, Dell, and Hewlett Packard Enterprise have already been testing or running Instinct GPUs as cost-efficient alternatives to Nvidia's chips.

AMD also sells its Epyc server CPUs, which compete against Intel's Xeon CPUs, as well as programmable chips (FPGAs, from its 2022 acquisition of Xilinx). It can bundle those chips with its Instinct GPUs in cost-efficient packages for data centers.

During AMD's fourth-quarter conference call in January, CEO Lisa Su said its MI300 deployments were "accelerating" and on track to be the "fastest revenue ramp of any product" in the company's history. Su also pointed out that AMD's combined sales of data center and embedded chips (including its Epyc, Instinct, Xilinx, and gaming APUs) grew by $1.2 billion in 2023 and accounted for over 50% of its full-year revenue.

Those healthy growth rates suggest AMD can carve out a high-growth niche in the AI market. AMD's share of the global data center CPU market already grew from 12% in 2021 to 20% in 2022, according to Counterpoint Research, and in an interview last year, Su estimated that its share had risen to about 25%.

But can it actually dethrone Nvidia as the AI leader?

AMD's Instinct business will likely expand over the next few years, but it probably won't overtake Nvidia as the market leader. Nvidia's prices should stabilize as it resolves its supply chain constraints, and many data center operators could use a mix of chips from the two rivals alongside their own in-house chip designs rather than committing to a single vendor.

Therefore, I believe AMD will carve out a high-growth niche in the AI market in the same way it expanded into the gaming GPU and data center CPU markets. But I don't think it will replace Nvidia as the AI kingpin.