Nvidia (NVDA) took the S&P 500 by storm in 2023, gaining 239% to become the best-performing stock in the index that year. The trend has continued in 2024 -- Nvidia is up 80% year to date and is again the best-performing S&P 500 component.

There's certainly a degree to which hype is now driving Nvidia's unstoppable run. But the fundamental reason the stock is soaring is earnings growth. Specifically, growth from its compute and networking segment, led by sales to data centers.

Let's consider why the exponential growth from Nvidia's data center business seemed to come out of nowhere, whether the business has room to run, and some headwinds investors should consider before buying this semiconductor stock.

A person holding a laptop while working in a data center.

Image source: Getty Images.

The backbone of AI

You've probably heard a lot about how Nvidia is powering artificial intelligence (AI) but may be wondering how that actually works in practice.

As the internet has expanded, it has demanded more and more computing power. Data transfer volumes grew so large, and data security became so critical, that localized data centers were no longer sufficient. In response, centralized data centers -- known as cloud infrastructure -- were developed to handle massive amounts of data as efficiently and securely as possible.

But AI models demand unprecedented amounts of computational power. Companies cannot develop the AI solutions they want without the processing backbone to run those models. And that processing power is exactly what Nvidia is supplying through its data center business.

Nvidia's red-hot data center business

Nvidia's data center business is the foundation of the investment thesis and the core driver of the stock's recent gains. The company reports its results under two segments. Graphics focuses on sales for gaming, PCs, visualization, software for internet applications, and more. Compute and networking includes sales for data centers, AI for the automotive industry, electric vehicle computing platforms, and more.

It wasn't long ago that graphics was Nvidia's larger segment. But in fiscal 2023 (which ended Jan. 29, 2023), compute and networking overtook graphics on both revenue and operating income. Here's a look at how the dynamic has shifted over the years. The growth will astonish you.

| Metric | Fiscal 2020 | Fiscal 2021 | Fiscal 2022 | Fiscal 2023 | Fiscal 2024 |
| --- | --- | --- | --- | --- | --- |
| Compute and networking revenue | $3.28 billion | $6.84 billion | $11.05 billion | $15.07 billion | $47.41 billion |
| Total revenue | $10.92 billion | $16.68 billion | $26.91 billion | $26.97 billion | $60.92 billion |
| Compute and networking share of revenue | 30.0% | 41.0% | 41.1% | 55.9% | 77.8% |
| Compute and networking operating income | $750 million | $2.55 billion | $4.60 billion | $5.08 billion | $32.02 billion |
| Segment operating income* | $4.02 billion | $7.16 billion | $13.09 billion | $9.64 billion | $37.86 billion |
| Compute and networking share of segment operating income | 18.7% | 35.6% | 35.1% | 52.7% | 84.5% |

Data source: Nvidia. *Segment operating income is the sum of compute and networking operating income and graphics operating income. Unlike total operating income, segment operating income doesn't reflect expenses such as stock-based compensation, corporate infrastructure, and acquisitions because they're related to the overall business, not the functions of the individual segments. Fiscal 2024 ended Jan. 28.

During the fiscal 2024 third-quarter earnings call, Nvidia said its HGX platform, based on its Hopper GPU architecture, had delivered the vast majority of its revenue in the quarter. The supercomputing platform can handle massive datasets and complex simulations, and Nvidia has upgraded the offering over the years. In November, it announced the HGX H200 with configurations of up to eight GPUs. Nvidia continues to make product improvements specifically aimed at the AI needs of its customers, and it's clearly working: Nvidia's sales are through the roof.

CFO Colette Kress said during the call:

Nvidia HGX with InfiniBand together are essentially the reference architecture for AI supercomputers and data center infrastructures. Some of the most exciting generative AI applications are built and run on Nvidia, including Adobe Firefly, ChatGPT, Microsoft 365 Copilot, CoAssist, Now Assist with ServiceNow, and Zoom AI Companion. Our data center compute revenue quadrupled from last year and networking revenue nearly tripled. Investment in infrastructure for training and inferencing large language models, deep learning recommender systems, and generative AI applications is fueling strong broad-based demand for Nvidia accelerated computing.

Since Nvidia is at the forefront of AI today, it has unique insight into the industry's trajectory, allowing it to make helpful product improvements that further sustain its growth.

Not all AI investments will pan out

There's a lot of speculation about how high Nvidia can fly; some wonder if the stock is grossly overvalued. But the investment thesis is actually beautifully simple: If Nvidia's customers are making money on AI, they will keep demanding faster and more secure computing power. As the use of AI grows, so too will demand for data centers.

The issue comes down to figuring out how much of this investment is overkill and how much will prove sticky. AI applications have to be useful and accepted by users and the marketplace. For example, if companies throw billions of dollars at their efforts to make rivals to ChatGPT, then Nvidia will benefit in the short term. But the demand for GPUs created by those efforts could prove short-lived if the returns on those AI investments fail to materialize.

Right now, the greatest risk to Nvidia isn't even competition because it has the best products. Rather, it's the possibility the chip industry will become overextended, setting it up for a sharp cyclical downturn.

Today, investors are buying the stock with the expectation of surging data center growth. If AI investments are here to stay, then Nvidia should be a long-term winner, even from its current valuation. But some investors may want to take a wait-and-see approach with Nvidia to gauge how its data center segment performs during a slowdown rather than judging it solely by how it's doing during the biggest cyclical upswing in the company's history.