The coronavirus pandemic created a major semiconductor shortage in 2020, as shelter-in-place orders boosted demand for consumer electronics while also hampering chip production.
It took the semiconductor industry a couple of years to catch supply up with demand by bringing more capacity online, and weak demand for personal computers (PCs) and smartphones also helped ease the supply crunch. But now another chip shortage is emerging in a lucrative niche of the semiconductor industry, and one company is on track to make the most of it: Nvidia (NVDA).
Booming demand for AI chips has supercharged Nvidia's growth
There has been a massive spike in demand for artificial intelligence (AI) chips over the past year, driven by the raging popularity of Microsoft-backed OpenAI's chatbot ChatGPT, which triggered a race among companies and governments to develop AI applications. Market research firm Gartner estimates that $53 billion worth of AI chips could be sold this year, up roughly 21% from last year.
However, that number could have been higher if there had been enough supply to satisfy demand. Nvidia's flagship H100 AI chip, for instance, carries a waiting period of around six months, and it is an expensive piece of hardware, costing $40,000 on average. So if Nvidia and its foundry partner, Taiwan Semiconductor Manufacturing (popularly known as TSMC), could have produced more of this chip, the AI chip market could have generated even stronger revenue growth in 2023.
The H100 alone could have given the AI chip market a big boost because it is one of the hottest commodities in this space. The chip has played a key role in helping Nvidia control an estimated 80% to 95% of the AI chip market, according to third-party estimates, and it has been critical to the rapid growth the company has delivered this fiscal year.
Nvidia released its fiscal 2024 third-quarter results (for the three months ended Oct. 29) on Nov. 21. The company's revenue tripled year over year to $18.1 billion, crushing the consensus estimate of $16 billion. Its non-GAAP (adjusted) earnings grew even faster, jumping a whopping 600% to $4.02 per share, well ahead of the $3.37 per share Wall Street was looking for.
The company's data center business, which houses sales of its AI chips, moved the needle in a big way, generating $14.5 billion in revenue last quarter, up 279% from the year-ago period. That was 80% of the company's top line, and the segment got a big lift from the growing adoption of Nvidia's chips by cloud service providers (CSPs) and enterprises.
What's more, Nvidia CFO Colette Kress said on the latest earnings conference call that demand for its AI chips is so strong that the company will continue to increase supply:
Nvidia H100 Tensor Core GPU instances are now generally available in virtually every cloud with instances in high demand. We have significantly increased supply every quarter this year to meet strong demand and expect to continue to do so next year.
The chipmaker's focus on increasing the supply of its AI chips explains why its revenue growth is set to get even better. Nvidia anticipates $20 billion in revenue in the fourth quarter of fiscal 2024 (which will end in January next year). That would be a 233% increase over the year-ago quarter, which means Nvidia's top-line growth is set to accelerate following the 206% year-over-year jump it clocked in the previous quarter.
Nvidia is able to procure more supply because of the sway it holds in the AI chip market, which gives it solid pricing power as well as preferential treatment from its suppliers. Those same advantages are why Nvidia's data center business still has a lot of room for growth.
Why Nvidia's AI chip dominance is here to stay
Assuming Nvidia does generate $20 billion in revenue in the current quarter and 80% of that comes from the data center business, it would sell roughly $16 billion worth of AI chips this quarter. Since Nvidia's data center revenue for the first nine months of fiscal 2024 stands at just over $29 billion, that would bring its total AI chip revenue for calendar 2023 (which roughly coincides with the company's fiscal 2024) to around $45 billion.
We saw earlier that the AI chip market is anticipated to generate $53 billion in revenue this year, which means Nvidia's share of that market could stand at a whopping 85% based on the $45 billion projection. This massive revenue share explains why TSMC is working to significantly increase the advanced chip packaging capacity, known as chip-on-wafer-on-substrate (CoWoS), that's used for manufacturing AI chips.
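For readers who want to check the back-of-the-envelope math, here is a quick Python sketch using only the figures cited above. The variable names are my own, and, like the estimate itself, it treats all of Nvidia's data center revenue as AI chip revenue, so it is a rough approximation rather than a precise market-share figure.

```python
# Back-of-the-envelope estimate of Nvidia's share of 2023 AI chip revenue,
# using the figures cited in this article (all amounts in billions of dollars).

q4_revenue_guidance = 20.0         # Nvidia's fiscal Q4 2024 revenue forecast
data_center_mix = 0.80             # data center share of total revenue last quarter
nine_month_data_center = 29.0      # data center revenue, first nine months of fiscal 2024
ai_chip_market_2023 = 53.0         # Gartner's estimate for 2023 AI chip sales

q4_data_center_estimate = q4_revenue_guidance * data_center_mix        # ~$16 billion
full_year_estimate = nine_month_data_center + q4_data_center_estimate  # ~$45 billion
implied_market_share = full_year_estimate / ai_chip_market_2023        # ~0.85

print(f"Estimated fiscal Q4 data center revenue: ${q4_data_center_estimate:.0f} billion")
print(f"Estimated calendar 2023 AI chip revenue: ${full_year_estimate:.0f} billion")
print(f"Implied share of the $53 billion market: {implied_market_share:.0%}")
```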
Nvidia is anticipated to benefit substantially from TSMC's move, as third-party reports suggest it could corner more than 60% of the Taiwan-based foundry giant's advanced chip packaging capacity in 2024. In short, Nvidia is likely to remain the dominant supplier of AI chips going forward, and that should help it sustain the terrific growth it has been delivering.
Given that the AI chip market is expected to generate almost $120 billion in annual revenue in 2027, according to Gartner, Nvidia's data center business can keep getting bigger. All this explains why analysts have raised their growth expectations for Nvidia following its latest results and expect the company to hit $105 billion in revenue in fiscal 2026.
If we multiply the projected fiscal 2026 revenue of $105 billion by Nvidia's five-year average sales multiple of 20, its market cap could jump to $2.1 trillion over the next three years. That would be a 78% jump from current levels. And since Nvidia currently trades at 38 times sales, a rich valuation it is justifying with outstanding growth, don't be surprised to see this chipmaker jump even higher thanks to its grip on the AI chip market.
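Here is that valuation math as a quick sketch. The current market cap figure is an approximation (roughly $1.18 trillion, which is what a 78% upside to $2.1 trillion implies), not a precise quote of Nvidia's market value.

```python
# Rough sketch of the valuation math above (dollar amounts in billions).
# The ~$1,180 billion current market cap is an approximation implied by the
# article's 78% upside figure, not an exact quote of Nvidia's market value.

projected_fy2026_revenue = 105.0      # analyst revenue estimate for fiscal 2026
average_price_to_sales = 20.0         # Nvidia's five-year average sales multiple
assumed_current_market_cap = 1_180.0  # approximate market cap at the time of writing

projected_market_cap = projected_fy2026_revenue * average_price_to_sales  # ~$2,100 billion
implied_upside = projected_market_cap / assumed_current_market_cap - 1    # ~78%

print(f"Projected market cap: ${projected_market_cap / 1_000:.1f} trillion")
print(f"Implied upside from current levels: {implied_upside:.0%}")
```

Note that this applies the historical 20-times-sales multiple, which is well below the 38 times sales the stock fetches today; if the market keeps awarding a richer multiple, the implied upside would be larger.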