Nvidia's (NVDA) artificial intelligence (AI)-driven surge shows no signs of stopping. The company's results for the second quarter of fiscal 2024 (ended July 30, 2023), which were released last month, suggest that it's just getting started in this lucrative niche.

Nvidia's revenue doubled year over year during the quarter, and earnings shot up an astonishing 429%. The company expects even faster growth in the current quarter, forecasting revenue of $16 billion -- up 171% year over year -- as it scrambles to meet heavy demand for its H100 data center graphics processing units (GPUs).

It's worth noting that the waiting period for Nvidia's H100 GPUs reportedly stretches to more than six months. That points to a solid revenue pipeline for the company, given that each chip is priced at $40,000.

The good part is that the chipmaker has made a smart move to alleviate some of the pressure on its supply chain by introducing a new platform. This will not only help its customers tackle AI workloads, but also ensure that Nvidia enters a new market from which it could mint billions of dollars.

Nvidia's new chip: Unlocking a massive opportunity

Nvidia revealed its GH200 Grace Hopper Superchip last month for accelerated computing and generative AI workloads. Third-party reviews indicate that this chip is far more powerful than competing chips from the likes of AMD and Intel. That's not surprising, as the superchip is equipped with a massive 282GB of the latest generation of high-bandwidth memory (HBM), known as HBM3e.

Nvidia points out that HBM3e memory is 50% faster than current-generation HBM3 memory, delivering a bandwidth of 10 terabytes (TB) per second. That extra memory bandwidth will allow customers to run AI models that are 3.5 times larger, making Nvidia's offering much more powerful than AMD's and Intel's AI accelerators.

AMD's upcoming MI300X accelerator will be powered by 192GB of HBM3 memory and provide 5.2TB/second of bandwidth. Meanwhile, Intel's Gaudi2 has 96GB of HBM and can deliver only 2.45TB/second of bandwidth.

Investors should note that Nvidia is offering this latest version of HBM only on the Grace Hopper Superchip platform, which combines the Hopper GPU with a Grace server CPU. This is a smart strategy from Nvidia as it will enable the company to tap the huge end market for AI server CPUs, especially because the GH200 Superchip could help reduce the number of GPUs needed for AI inferencing purposes by as much as 60%.

The Grace CPU and the faster HBM3e memory will help customers reduce their reliance on H100 Hopper GPUs -- which are already in short supply -- while still generating enough computing power to tackle their AI workloads. Customers, in turn, will need to spend less on $40,000 H100 GPUs. Consequently, it won't be surprising to see adoption of the Grace CPU increase, which should expand Nvidia's addressable opportunity in the AI chip market.

Intel estimates that CPUs could account for 60% of the AI chip market in the long run, with GPUs expected to account for the remaining 40%. AMD and Intel are currently the two major players in the server CPU market, providing chips based on the x86 architecture. DigiTimes estimates that they accounted for almost 93% of the server CPU market in 2022, with chips based on the Arm architecture controlling just under 7% of the market.

Nvidia's Grace CPU is based on the Arm architecture, and its adoption is increasing rapidly. Arm server processors accounted for 3.5% of the server processor market in 2021, suggesting that their adoption doubled last year. According to another estimate, the Arm-based server processor market was worth an estimated $12 billion in 2022. That number is expected to jump to $82 billion in 2030, clocking a compound annual growth rate of 27%.
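As a quick sanity check, the growth rate implied by those market estimates can be reproduced with a few lines of Python (the dollar figures are the estimates cited above; `cagr` is just an illustrative helper):

```python
# Sanity check on the implied growth rate for the Arm-based server CPU
# market: an estimated $12 billion in 2022 growing to a projected
# $82 billion in 2030.
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate, returned as a fraction."""
    return (end / start) ** (1 / years) - 1

rate = cagr(12e9, 82e9, 2030 - 2022)
print(f"{rate:.0%}")  # prints "27%"
```

Growing $12 billion at roughly 27% a year for eight years does indeed land at about $82 billion, so the cited figures are internally consistent.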

Nvidia's push to sell its Arm-based Grace server CPUs could turn out to be a smart long-term move, considering the multibillion-dollar opportunity available in this market.

Why investors shouldn't worry about valuation

Analysts expect Nvidia to clock annual earnings growth of 79% over the next five years. That would be more than double the 37% growth it clocked over the last five.

The company's outstanding pricing power in AI GPUs and its entry into the server processor market should help it achieve such outstanding earnings growth in the future. This is why investors on the hunt for a growth stock should consider looking beyond Nvidia's valuation.

[Chart: Nvidia's price-to-earnings ratio (P/E). Data by YCharts.]

As the above chart indicates, Nvidia's trailing price-to-earnings multiple is expensive. But its forward earnings multiple is way lower, pointing toward explosive bottom-line growth. Assuming Nvidia does clock 79% annual earnings growth as analysts project for the next five years, its earnings could jump to almost $198 per share at the end of fiscal 2029 (using fiscal 2024's projected earnings of $10.76 per share as the base).

Multiplying the projected earnings by Nvidia's five-year average forward earnings multiple of 41 would translate into a stock price of more than $8,000 after five years. That's way higher than the company's current stock price of $485, indicating that this AI stock could keep delivering phenomenal gains in the long run.
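That back-of-the-envelope math can be reproduced directly from the figures above (projected fiscal 2024 earnings of $10.76 per share, 79% annual growth, and a forward multiple of 41). This is a rough sketch of the article's arithmetic, not a forecast:

```python
# Reproduce the article's rough valuation math.
base_eps = 10.76     # projected fiscal 2024 earnings per share
growth = 0.79        # analysts' projected annual earnings growth rate
years = 5            # through the end of fiscal 2029
forward_pe = 41      # five-year average forward earnings multiple

future_eps = base_eps * (1 + growth) ** years
target_price = future_eps * forward_pe

print(f"Projected fiscal 2029 EPS: ${future_eps:.2f}")  # ~$197.73
print(f"Implied stock price: ${target_price:,.0f}")     # ~$8,107
```

Small changes in the assumed growth rate compound heavily over five years, which is why the implied price should be read as illustrative rather than a price target.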