The cat's out of the bag: It's not just Nvidia's (NVDA) expensive GPUs that are needed to build massively complex generative artificial intelligence (AI) services like ChatGPT. While Nvidia chips act as the "brain" of AI, something else is needed to move all the data around. That's where fellow chip design giant Broadcom (AVGO) comes in.

Nvidia was reluctant to provide any specific guidance based on its booming AI business beyond the next quarter, but Broadcom CEO Hock Tan offered a few extra tidbits on the financial outlook for AI that extend into 2024. It means great things for Broadcom's business, but perhaps even more so for Nvidia -- and may even justify that "expensive" stock price.

Broadcom's epic AI prediction for the next year and a half

Naturally, Tan opened his remarks on Broadcom's latest earnings call by talking about generative AI. With all the buzz these days surrounding ChatGPT and the firestorm of cloud-giant AI spending it has spawned, investors want to hear about financial results pertaining to this trend.

Here's what he said:

I know you all want to hear about how we are benefiting from this strong deployment of generative AI by our customers. [To] put this in perspective, our revenue today from this opportunity represents about 15% of our semiconductor business. Having said this, it was only 10% in fiscal '22, and we believe it could be over 25% of our semiconductor revenue in fiscal '24. In fact, over the course of fiscal '23 that we're in, we are seeing a trajectory where our quarterly revenue entering the year doubles by the time we exit '23. And in fiscal third quarter '23, we expect that this revenue [will] exceed $1 billion in the quarter.  

Let's do some simple math. Broadcom's semiconductor business represents about 80% of its total revenue; the rest comes from the enterprise software segment (which was assembled via acquisitions starting in 2017). In its fiscal year 2022 (which ended in October 2022), semiconductor revenue was $25.8 billion. Using Tan's 10% AI reveal, that means Broadcom made about $2.6 billion in AI chip sales last year.
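That back-of-envelope estimate can be sketched in a couple of lines of Python. All figures come from the article (in billions of dollars); this is an illustration of the arithmetic, not a financial model:

```python
# Estimate Broadcom's fiscal 2022 AI chip revenue from Tan's disclosure
# (all dollar amounts in billions; figures as quoted in the article).
fy22_semiconductor_revenue = 25.8  # reported FY2022 semiconductor segment revenue
ai_share_fy22 = 0.10               # Tan: AI was about 10% of semiconductor revenue in FY2022

fy22_ai_revenue = fy22_semiconductor_revenue * ai_share_fy22
print(f"Estimated FY2022 AI revenue: ${fy22_ai_revenue:.1f} billion")  # ~$2.6 billion
```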

AI revenue is headed to 15% of the total semiconductor base in fiscal 2023 (which ends this October), and should generate $1 billion in sales in the third quarter alone. Tan and company said semiconductor sales should rise by a mid-single-digit percentage in Q3 of fiscal 2023, which would put the total around $7 billion (up from about $6.6 billion in Q3 2022). That $1 billion divided by $7 billion is right around the 15% mark Tan was hinting at, and applying that 15% share to a full year of semiconductor sales would put AI chip revenue at $4 billion or more this year.
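The fiscal Q3 sanity check above can be run the same way. The 5% growth figure stands in for "mid-single-digit" and is an illustrative assumption; the rest are the article's numbers (in billions of dollars):

```python
# Check the fiscal Q3 2023 math cited above (billions of dollars).
q3_fy22_semis = 6.6                          # Broadcom semiconductor revenue, fiscal Q3 2022
growth = 0.05                                # "mid-single-digit" growth, assumed 5% for illustration
q3_fy23_semis = q3_fy22_semis * (1 + growth) # ~$6.9 billion, i.e. "around $7 billion"

q3_ai_revenue = 1.0                          # Tan: AI revenue should exceed $1 billion in the quarter
ai_share = q3_ai_revenue / q3_fy23_semis     # ~14.4%, right around the 15% mark

# Apply the 15% share to a rough full-year FY2023 semiconductor base.
fy23_semis = 25.8 * (1 + growth)             # ~$27 billion
fy23_ai_revenue = 0.15 * fy23_semis          # ~$4.1 billion: "$4 billion or more"
```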

Assuming Broadcom's semiconductor revenue continues to grow at a low- to mid-single-digit percentage next year, total chip sales for the company should approach $30 billion in fiscal 2024. At 25% of revenue, this would imply nearly $8 billion in AI chip sales for Broadcom. If it all transpires as Tan predicted (he insisted multiple times on the earnings call that generative AI is in the early innings and hard to predict, suggesting this guidance was conservative), Broadcom is poised for a solid stretch of growth over the next year and a half or so.
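The fiscal 2024 projection works out the same way. The $30 billion base is this article's rough estimate, not company guidance (billions of dollars):

```python
# Project Broadcom's fiscal 2024 AI revenue from the assumptions above
# (billions of dollars; the $30 billion base is the article's estimate).
fy24_semis = 30.0        # assumed FY2024 semiconductor revenue
ai_share_fy24 = 0.25     # Tan: AI could exceed 25% of semiconductor revenue in FY2024

fy24_ai_revenue = fy24_semis * ai_share_fy24  # $7.5 billion, i.e. "nearly $8 billion"
```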

What does this mean for Nvidia?

Broadcom's networking chips are just one of three primary ingredients when cooking up a new AI system. The other two components are memory chips and processors. Currently, it's the processing power that commands the highest dollar from the big cloud computing companies (Microsoft, Alphabet's Google Cloud, Meta Platforms) making these AI system purchases.

This is what has the market so exuberant over Nvidia. After reporting $7.2 billion in sales in its last quarter, the GPU designer expects to haul in $11 billion in the next three months. That nearly $4 billion bump will come primarily from its data center segment, which houses its AI hardware sales.

Nvidia was reluctant to provide specific numbers for its data center business for the rest of this year, let alone next year. But if demand for Broadcom chips is any indication, Nvidia's data center business is headed much higher. After all, purchases of Broadcom AI chips aren't made in isolation. Those networking chips are being bought so that something can be connected together. That something is almost certainly a lot more Nvidia hardware.

Let's not go so far as to project Broadcom's expected growth rates from AI (possibly a double in AI revenue from this year to next) onto Nvidia. Nevertheless, if Broadcom is correct, it isn't hard to imagine an Nvidia data center segment rolling in about $8 billion worth of sales, or more, every quarter by the end of this year and into next year. 
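The data center run-rate implied by that guidance can be checked with the figures above. The simplifying assumption here is that essentially all of the revenue bump lands in the data center segment, which Nvidia said accounts for the majority of it (billions of dollars):

```python
# Rough read-through to Nvidia's data center run rate (billions of dollars).
last_q_total = 7.2                 # last quarter's total revenue
next_q_guide = 11.0                # next quarter's revenue guidance
revenue_bump = next_q_guide - last_q_total  # ~$3.8 billion

last_q_data_center = 4.3           # last quarter's data center revenue
# Assume (for illustration) the whole bump comes from the data center segment;
# Nvidia only said the majority of it will.
next_q_data_center = last_q_data_center + revenue_bump  # ~$8 billion per quarter
```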

[Chart: Nvidia's trending revenue schedule, showing data center sales last quarter at nearly $4.3 billion. These sales are poised to nearly double next quarter alone.]

Image source: Nvidia.

Does this seem absurd? Perhaps it is. But remember Nvidia said the majority of its extra $4 billion in sales next quarter will be attributable to its data center segment, and it had to secure more supply from manufacturing partners for the rest of this calendar year.

All told, Nvidia could be an absolute monster that pulls in over $50 billion in revenue in calendar year 2024 based on this guidance. This doesn't include possible incremental (albeit far more modest) growth from its video game and automotive segments.
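To see how the $50 billion figure could materialize, consider the run rate the guidance implies. A flat $11 billion per quarter already annualizes to $44 billion, so only modest continued growth is needed to clear $50 billion. The 6% quarterly growth rate below is purely an illustrative assumption, not guidance (billions of dollars):

```python
# Illustrative calendar-2024 revenue scenario built on Nvidia's next-quarter guidance
# (billions of dollars; the growth rate is an assumption for illustration only).
quarterly_run_rate = 11.0            # next-quarter revenue guidance
flat_year = quarterly_run_rate * 4   # $44 billion with zero further growth

growth_per_quarter = 0.06            # assumed modest sequential growth
cy24_revenue = sum(
    quarterly_run_rate * (1 + growth_per_quarter) ** q
    for q in range(1, 5)             # four quarters of compounding off the run rate
)                                     # ~$51 billion, past the $50 billion mark
```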

This explains Nvidia's premium price tag of over 50 times forward earnings for the current year, compared to 20 times forward earnings for Broadcom. At some point, this AI hype should cool off and the cloud giants should ease their spending spree. There is a lot of hype out there, so investors should be mighty cautious right now. But initiating a prudent dollar-cost-average plan into Nvidia, and especially Broadcom, for the long haul might not be a terrible idea if you think a new AI era is only just beginning. Semiconductors are running hot again, and these two companies are leading the charge higher.