Nvidia's (NASDAQ: NVDA) H100 data center graphics processing unit (GPU) has been a game changer for the company since its launch a couple of years ago, which is not surprising, as the chip arrived just in time for the beginning of the artificial intelligence (AI) revolution.

Nvidia announced the H100 processor in March 2022. The chip went into full production in September of that year and became available to customers the following month. If you had invested $1,000 in Nvidia stock in October 2022, when sales of the H100 began, that investment would now be worth just over $7,000.
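For readers who like to check the math, here is a quick back-of-the-envelope sketch in Python. The inputs are rounded from the figures above, and the exact multiple depends on your purchase date and price:

    # Rough check of the return on a $1,000 purchase made in October 2022.
    # Rounded inputs; the exact result depends on purchase price and date.
    initial_investment = 1_000   # dollars invested when H100 sales began
    current_value = 7_000        # approximate value today, per the chart below
    years_held = 1.5             # roughly a year and a half

    multiple = current_value / initial_investment
    annualized_return = multiple ** (1 / years_held) - 1

    print(f"Total return: {multiple:.0f}x the original investment")
    print(f"Approximate annualized return: {annualized_return:.0%}")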

NVDA chart (data by YCharts).

Let's look at the reasons why the H100 chip has made Nvidia a multibagger investment in the space of just a year and a half.

Nvidia's data center business has skyrocketed thanks to the H100

The H100 went on sale toward the end of calendar 2022, so Nvidia's fiscal year 2022 (which ended on Jan. 30, 2022) offers a clean picture of the business before the chip arrived. The chipmaker finished that fiscal year with $27 billion in revenue, a solid jump of 61% from the previous year, and its non-GAAP gross margin stood at almost 67%. The data center business specifically produced a record $10.6 billion in revenue for Nvidia that year, growing a solid 58% year over year.

The gaming segment was Nvidia's biggest business at that time, followed by the data center segment, which accounted for 39% of its top line. However, the arrival of the H100 changed Nvidia's revenue mix in a big way by supercharging its data center business. In fiscal 2023, for instance, the company's data center revenue increased 41% to $15 billion, accounting for 55% of Nvidia's top line and overtaking gaming as its biggest source of revenue.

Fiscal 2024, which ended in January this year, was even bigger for the data center business: Nvidia generated a massive $47.5 billion in revenue from this segment, more than triple the previous year's total. The company now gets 78% of its revenue from selling data center chips. What's more, its non-GAAP gross margin came in at 73.8% in fiscal 2024, a nice improvement from the roughly 67% it posted in fiscal 2022, before the H100 arrived.
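Those revenue-mix percentages follow directly from the segment and total revenue figures. Here is a minimal Python sketch using the rounded numbers cited above; fiscal 2024's total of roughly $61 billion is not quoted in the text, but it is implied by the 78% share:

    # Data center revenue as a share of total revenue, in billions of dollars.
    # Figures are rounded, so the computed shares land within about a point
    # of the percentages quoted in the article.
    revenue = {
        "fiscal 2022": {"data_center": 10.6, "total": 26.9},
        "fiscal 2023": {"data_center": 15.0, "total": 27.0},
        "fiscal 2024": {"data_center": 47.5, "total": 60.9},
    }

    for year, figures in revenue.items():
        share = figures["data_center"] / figures["total"]
        print(f"{year}: data center was {share:.0%} of total revenue")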

The H100 deserves the bulk of the credit for this remarkable surge in Nvidia's revenue and margins. The company sold an estimated $38 billion worth of these GPUs last year as companies lined up to buy the H100 for training large language models, catapulting Nvidia to the pinnacle of the AI chip market with a share of more than 90%. Demand for the chip was so hot that the waiting period stretched to as long as a year.

Also, reports that Nvidia makes a profit of almost 1,000% over its cost on each H100 GPU help explain why its margins and earnings have simply taken off.
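To see how a markup of that size feeds through to the gross margin line, here is a minimal illustrative sketch; the unit cost and selling price below are hypothetical assumptions for the example, not Nvidia's disclosed figures:

    # Hypothetical unit economics, purely for illustration -- these are not
    # Nvidia's actual costs or prices. A price of roughly 10x cost implies a
    # gross margin of about 90% on each unit sold.
    unit_cost = 3_000        # assumed cost to produce one GPU, in dollars
    selling_price = 30_000   # assumed selling price, in dollars

    markup_over_cost = (selling_price - unit_cost) / unit_cost
    gross_margin = (selling_price - unit_cost) / selling_price

    print(f"Markup over cost: {markup_over_cost:.0%}")  # 900%
    print(f"Gross margin: {gross_margin:.0%}")          # 90%

In other words, a chip that sells for around 10 times what it costs to make carries a gross margin in the neighborhood of 90%, which is consistent with the margin expansion shown in the chart below.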

NVDA profit margin chart (data by YCharts).

What's next after the H100?

Nvidia's H100 will be a couple of years old in a few months, so it is not surprising that the company is lining up two upgrades to the chip. In November last year, Nvidia announced the H200 GPU.

Equipped with the latest generation of high-bandwidth memory (HBM), the H200 offers a memory bandwidth of 4.8 terabytes per second, compared to the H100's 3.35 terabytes per second -- roughly 43% more. It also carries 141 gigabytes of HBM3e, versus 80 gigabytes of previous-generation HBM3 on the H100, or about 76% more memory. Nvidia plans to start initial shipments of the new chip in the second quarter of this year, and on its latest earnings conference call it described demand for the H200 as "strong."

However, the bigger upgrade will arrive when Nvidia launches its Blackwell architecture-based GPUs later this year. These new chips combine a more advanced process node with an upgraded architecture, which is why the upcoming B200 processor is expected to deliver a 4x jump in AI training performance and a whopping 30x jump in inference.

Multiple Nvidia customers have already announced plans to deploy the B200, which makes sense: The processor will allow them to train even bigger AI models and run generative AI models in real time. All of this indicates that Nvidia is making the right moves to maintain its dominant position in the AI chip market, so it is no surprise that its data center revenue is expected to keep growing at a nice pace.

Market research firm Omdia is forecasting $87 billion in data center revenue for Nvidia this year. The company could witness even stronger data center revenue in the long run thanks to AI, with Japanese investment bank Mizuho forecasting $280 billion in revenue from this segment in 2027.

So if you didn't buy Nvidia when the H100 launched, consider buying it now, as the arrival of its next-generation chips could help sustain its impressive stock market rally. Another reason to do so: Nvidia currently trades at 36 times forward earnings, a discount to its five-year average forward earnings multiple of 39. Investors are getting a good deal on this AI stock, one they may not want to miss considering that the company seems built for more upside.