Semiconductor stocks have taken off in 2023 thanks to artificial intelligence (AI), as chips play a mission-critical role in the proliferation of this technology. That helps explain why the PHLX Semiconductor Sector index has gained an impressive 37% so far this year.

Training large AI models requires a lot of computing power, which is currently provided by graphics processing units (GPUs) and central processing units (CPUs). Similarly, AI servers need fast memory for both storage and computing. All this explains why the market for AI chips is expected to grow from just $15 billion last year to nearly $384 billion in 2032, according to Allied Market Research.
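For context, a jump from $15 billion to $384 billion over a decade works out to a compound annual growth rate of roughly 38%. A quick back-of-the-envelope sketch of that arithmetic (the function name is mine, and the figures are the rounded ones quoted above):

```python
def implied_cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate implied by growing from
    start_value to end_value over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Allied Market Research's forecast: ~$15B in 2022 to ~$384B in 2032.
cagr = implied_cagr(15, 384, 2032 - 2022)
print(f"Implied CAGR: {cagr:.1%}")  # roughly 38% per year
```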

Not surprisingly, investors have been piling into AI stocks such as Nvidia (NVDA -0.09%) and Advanced Micro Devices (AMD -2.27%) this year. While Nvidia stock has surged a massive 220% in 2023, AMD is up 70%. The wide gap between the two stocks' performances isn't surprising: Nvidia is already delivering outstanding growth thanks to huge demand for its AI chips, while AMD's AI products have yet to gain traction.

Does this mean Nvidia is the better AI stock of these two chipmakers? Let's find out.

The case for Nvidia

Nvidia has quickly moved to capture the lion's share of the market for chips that are used for training AI models. According to one estimate, Nvidia controls over 90% of the AI training market. The company's A100 data center GPUs were deployed in the thousands to train the popular chatbot ChatGPT last year. And now, the demand for its latest flagship H100 data center GPU is so strong that Nvidia's foundry partner Taiwan Semiconductor Manufacturing, popularly known as TSMC, is finding it hard to make enough chips.

Nvidia's AI GPUs are in such high demand that TSMC says they could remain in short supply for the next year and a half. The good news is that Nvidia is working to secure additional supply, and availability is improving.

At the same time, the company is diversifying its presence in the AI chip space by moving into the AI server processor market. As a result, don't be surprised to see Nvidia sustaining the jaw-dropping pace of growth that it has attained thanks to AI. Its revenue doubled last quarter on a year-over-year basis. The current quarter's revenue forecast of $16 billion points toward a jump of 170% from the year-ago period's figure of $5.9 billion.
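The guided jump can be checked against the rounded figures quoted above (a back-of-the-envelope sketch; the helper name is mine, and the small difference from the stated 170% comes from rounding the year-ago figure to $5.9 billion):

```python
def yoy_growth(current: float, year_ago: float) -> float:
    """Year-over-year growth rate between two revenue figures."""
    return current / year_ago - 1

# $16B current-quarter guidance vs. the year-ago period's ~$5.9B.
growth = yoy_growth(16.0, 5.9)
print(f"Guided growth: {growth:.0%}")  # ~171% on these rounded figures,
# in line with the roughly 170% jump cited above
```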

Analysts were originally expecting Nvidia to report $12.6 billion in revenue for the current quarter, but they have been scrambling to raise their forecasts. Toshiya Hari of Goldman Sachs, for instance, has raised his Nvidia revenue projection for this year by 42%. He has also lifted his top-line forecasts for the following two years by 43% and 41%, respectively, suggesting that this hot AI stock could continue delivering explosive gains to investors.

The case for AMD

AMD's standing in AI chips is currently nowhere near Nvidia's; as we have seen, the latter has a near-monopoly in the AI training chip market. It is worth noting, however, that chips used for training AI models are only one part of the overall AI chip market.

Training chips are estimated to make up just 10% to 20% of the overall AI chip market. Chips used for AI inference, the phase in which trained models are deployed to do the job they were trained for, are expected to account for the remainder of this space.

This explains why AMD is positioning its upcoming MI300X data center accelerator to target the AI inference market. AMD CEO Lisa Su claims that the MI300X can significantly lower the number of GPUs needed for AI inference applications thanks to its larger memory and higher bandwidth, thereby reducing both capital and operating costs.

According to Su, the MI300X offers 2.4 times the high-bandwidth memory (HBM) density of Nvidia's H100 and 1.6 times its memory bandwidth. The larger memory and higher bandwidth mean that a single MI300X can handle bigger models, at least in theory. AMD also points out that the chip can run large language models entirely in its own memory (fast) instead of shuttling data from an external source (slow), which should allow for faster inferencing at lower cost.

Not surprisingly, AMD is now witnessing strong interest in its AI chips: the number of AI projects it initiated or expanded grew sevenfold in the second quarter of 2023 compared with the first quarter. So there is a solid chance that AMD's AI efforts could start paying off once the company launches its MI300 accelerators in the fourth quarter.

Wells Fargo analyst Aaron Rakers is also upbeat about the potential of AMD's AI chips, believing they could send the stock to $150. That would be a 40% jump from current levels. What's more, AMD stock carries a median price target of $145 across the 40 analysts covering it, a 35% increase from current levels.
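Those two upside figures are mutually consistent: a $150 target representing a 40% gain implies a share price of about $107, and the $145 median target sits about 35% above that same level. A quick sanity check (the share price here is inferred from the article's own percentages rather than quoted directly):

```python
def upside(target: float, price: float) -> float:
    """Percentage upside from the current price to a price target."""
    return target / price - 1

# Infer the share price from the stated 40% upside to the $150 target.
implied_price = 150 / 1.40  # about $107
median_upside = upside(145, implied_price)
print(f"Median-target upside: {median_upside:.0%}")  # about 35%
```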

Given that the company's financial performance could turn around impressively in the coming year thanks to catalysts such as AI, investors may want to keep tabs on AMD, as it could win big from the huge revenue opportunity in AI chips.

The verdict

AI is fueling outstanding growth for Nvidia, but AMD is currently in a rut thanks to the tepid personal computer (PC) market. Analysts are anticipating a 3% dip in AMD's top line in 2023 to $22.8 billion, though the chipmaker is expected to regain its mojo next year. Nvidia, meanwhile, could finish the current fiscal year with a 103% spike in revenue to $54.7 billion, driven by AI.

So, investors looking for a tried and tested AI play may want to consider Nvidia for their portfolios over AMD. However, they will have to pay a rich 35 times sales to buy Nvidia, which is way higher than AMD's sales multiple of 8. Of course, Nvidia deserves the richer valuation as it is growing at a breathtaking pace while AMD has yet to take off in AI.

But at the same time, investors looking for a cheaper AI stock may be tempted to keep AMD on their radars as it could turn out to be a solid play on the proliferation of this explosive technology.