Nvidia's valuation has gone through the roof in 2023 thanks to a huge rally in the company's stock price. Shares of the chipmaker have shot up roughly 230% so far this year, driven primarily by the artificial intelligence (AI) arms race, which has created booming demand for the data center graphics cards used to train AI models.
Nvidia now commands a price-to-sales (P/S) ratio of 36. Its price-to-earnings (P/E) ratio stands at a whopping 113. For comparison, Nvidia's average five-year sales and earnings multiples stand at 19 and 73, respectively, indicating just how big a premium the stock commands right now.
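To put that premium in concrete terms, here's a quick back-of-the-envelope check using only the multiples cited above (a sketch, not a valuation model):

```python
# Quick check of Nvidia's premium to its own five-year averages,
# using only the multiples cited in this article.
ps_now, ps_avg = 36, 19
pe_now, pe_avg = 113, 73

print(f"P/S premium: {ps_now / ps_avg - 1:.0%}")   # ~89% above the five-year average
print(f"P/E premium: {pe_now / pe_avg - 1:.0%}")   # ~55% above the five-year average
```

In other words, the stock trades at nearly twice its typical sales multiple.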
Those multiples may seem justified considering Nvidia's dominant position in the market for AI chips. After all, the semiconductor giant reportedly holds more than 80% of the AI chip market, according to third-party estimates. This massive market share is expected to drive the company's data center revenue to more than $31 billion in the current fiscal year. That would be more than double Nvidia's fiscal 2023 data center revenue of $15 billion.
Given that the company got 55% of its revenue from selling data center chips last year, the big surge in this segment that's anticipated this year could give a serious boost to Nvidia's business and help it justify its valuation. However, not everyone may be comfortable paying such a rich multiple for Nvidia's anticipated growth.
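As a rough sanity check, those two figures let us back out Nvidia's total revenue last fiscal year (a sketch derived from the numbers cited above, not from Nvidia's filings):

```python
# Rough sanity check using only the figures cited above (in billions of dollars).
dc_revenue_fy2023 = 15      # data center revenue, fiscal 2023
dc_share_fy2023 = 0.55      # data center's share of total revenue
dc_revenue_fy2024 = 31      # projected data center revenue this fiscal year

implied_total_fy2023 = dc_revenue_fy2023 / dc_share_fy2023
growth = dc_revenue_fy2024 / dc_revenue_fy2023 - 1

print(f"Implied fiscal 2023 total revenue: ~${implied_total_fy2023:.0f} billion")
print(f"Projected data center growth: {growth:.0%}")
# Implied fiscal 2023 total revenue: ~$27 billion
# Projected data center growth: 107%
```

Put differently, the data center segment alone is on track to exceed what the entire company generated last fiscal year.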
That's why now would be a good time to take a closer look at Advanced Micro Devices (AMD), as developments in the semiconductor supply chain suggest it could become a big beneficiary of the growing demand for AI chips. Let's look at the reasons why.
AMD's AI chip shipments may jump significantly next year
Nvidia may have cornered a big share of the AI chip market, but Taiwan-based daily newspaper DigiTimes suggests that AMD could play a key role in this space. DigiTimes reports (via Tom's Hardware) that foundry giant Taiwan Semiconductor Manufacturing, popularly known as TSMC, has placed orders for additional tools that will be used in chip-on-wafer-on-substrate (CoWoS) packaging.
CoWoS is an advanced packaging process that allows TSMC to integrate high-bandwidth memory (HBM) -- which is used in AI servers to enable fast data transmission and greater storage capacity while keeping power consumption low -- with high-performance computing chips. Demand for this packaging solution is expected to grow between 30% and 40% in 2024, according to market research firm TrendForce.
This explains why TSMC is expected to increase its CoWoS capacity from the current 8,000 silicon wafers a month to 11,000 wafers a month by the end of this year, and then to between 14,500 and 16,600 wafers a month by the end of 2024 -- nearly double the current level. These wafers are used to make integrated circuits such as graphics processing units (GPUs) and central processing units (CPUs).
Nvidia, for instance, reportedly produces around 60 of its A100 and H100 data center GPUs from each wafer packaged using CoWoS. So, if TSMC can double its CoWoS capacity by the end of next year to 16,000 wafers a month, companies like Nvidia and AMD may be able to manufacture around 960,000 AI GPUs a month. DigiTimes' sources indicate that TSMC's CoWoS shipments to AMD, following the launch of its MI300X accelerators later this year, could be half of those that it ships to Nvidia each quarter.
If that's indeed the case, AMD could be shipping around 320,000 data center GPUs a month to Nvidia's 640,000. In other words, this assumption implies that AMD could corner a third of the data center GPU market. That would be a big deal for AMD given that it's currently a very small player next to Nvidia in the market for AI chips.
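Here's how that shipment math works out, assuming TSMC reaches roughly 16,000 CoWoS wafers a month and each wafer yields about 60 data center GPUs (both figures from the reports above; the 2-to-1 Nvidia/AMD split is DigiTimes' assumption, not a confirmed allocation):

```python
# Translating TSMC's projected CoWoS capacity into GPU output,
# using the reported figures cited above.
wafers_per_month = 16_000   # projected CoWoS capacity by the end of 2024
gpus_per_wafer = 60         # reported data center GPUs per CoWoS wafer

total_gpus = wafers_per_month * gpus_per_wafer  # 960,000 GPUs a month

# DigiTimes' assumption: AMD receives half as many CoWoS shipments as Nvidia,
# so Nvidia takes 2 of every 3 GPUs and AMD takes 1 of every 3.
nvidia_gpus = total_gpus * 2 / 3    # 640,000 a month
amd_gpus = total_gpus * 1 / 3       # 320,000 a month

print(f"Total: {total_gpus:,}/month | Nvidia: {nvidia_gpus:,.0f} | AMD: {amd_gpus:,.0f}")
# Total: 960,000/month | Nvidia: 640,000 | AMD: 320,000
```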
A massive revenue jump could be in the cards
AMD hasn't revealed the price at which it will launch its MI300X accelerators. The chip's specs, however, indicate that it could give Nvidia stiff competition. The MI300X is reportedly going to be equipped with 192GB (gigabytes) of high-bandwidth memory, far more than the 80GB found on Nvidia's H100 data center GPU. That extra memory theoretically means AMD's processor should be able to train the bigger large language models that power generative AI applications.
Now, each H100 GPU is reportedly priced at $30,000 or more. If AMD decides to undercut Nvidia and prices its AI accelerator at, say, $20,000 a chip, shipping 320,000 units a month would translate into roughly $6.4 billion a month, or close to $77 billion a year, in revenue. There is no doubt that these numbers look very optimistic given that AMD is expected to generate just under $23 billion in revenue this year.
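For transparency, here's that revenue scenario spelled out. Keep in mind the $20,000 price is this article's illustrative assumption, not an announced figure:

```python
# Hypothetical MI300X revenue scenario. The price per chip is an
# illustrative assumption, not an announced figure.
units_per_month = 320_000   # from the shipment math above
price_per_chip = 20_000     # hypothetical price undercutting the H100

monthly_revenue = units_per_month * price_per_chip  # $6.4 billion
annual_revenue = monthly_revenue * 12               # ~$77 billion

print(f"Monthly: ${monthly_revenue / 1e9:.1f}B | Annual: ${annual_revenue / 1e9:.0f}B")
print(f"A quarter of that annual figure: ${annual_revenue * 0.25 / 1e9:.0f}B")
# Monthly: $6.4B | Annual: $77B
# A quarter of that annual figure: $19B
```

Note that even the quarter-of-the-opportunity case, discussed below, approaches AMD's entire expected revenue for this year.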
But at the same time, investors shouldn't forget that AI has helped give Nvidia's top line a massive boost, and the same is likely to continue in the coming years.
So, it won't be surprising to see AMD's business get a nice shot in the arm even if it corners just a quarter of the revenue opportunity projected above. All this indicates that AMD stock could be set up for a big rally in the future thanks to AI, which is why investors should consider buying the stock right away.
AMD's 70% surge in 2023 has brought the stock's P/S ratio to 8, compared to just over 4 at the end of 2022. But that's still way cheaper than Nvidia's sales multiple. Also, AMD is trading at 39 times forward earnings, compared to Nvidia's multiple of 55. In all, investors are getting a relatively good deal on AMD right now considering how AI is likely to supercharge its business.
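To sum up the valuation comparison using the figures cited in this article (a quick sketch, not a model):

```python
# Side-by-side of the valuation multiples cited in this article.
multiples = {
    "AMD":    {"P/S": 8,  "forward P/E": 39},
    "Nvidia": {"P/S": 36, "forward P/E": 55},
}

for metric in ("P/S", "forward P/E"):
    amd, nvda = multiples["AMD"][metric], multiples["Nvidia"][metric]
    print(f"{metric}: AMD at {amd} vs. Nvidia at {nvda} "
          f"({amd / nvda:.0%} of Nvidia's multiple)")

# P/S: AMD at 8 vs. Nvidia at 36 (22% of Nvidia's multiple)
# forward P/E: AMD at 39 vs. Nvidia at 55 (71% of Nvidia's multiple)
```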