The most consequential company of the past two years has no doubt been Nvidia (NVDA 6.18%). With a multi-year lead in developing artificial intelligence GPUs, there's little doubt Nvidia leads the AI revolution today, and it will likely continue to do so for the next five years.

Of course, with its forward P/E ratio now at 36, implying roughly 100% earnings growth on top of the 288% earnings growth the company just posted in its past fiscal year, a lot of optimism is already embedded in Nvidia's price.

To decide if it's worth adding Nvidia today, investors have to judge if all that optimism is warranted... or perhaps if there's not even enough.

A future of AI factories

Nvidia CEO Jensen Huang has outlined his vision for a future in which companies no longer just have regular data centers for storing data and running computing tasks. Rather, he sees all companies deploying "AI factories" both in the cloud and on premises, as generative AI becomes embedded in a wide range of applications and a core aspect of every part of business.

Not only that, but he sees the AI chip market not merely replacing the current CPU-based computing architecture but expanding on it. That's because, in addition to generative AI requiring Nvidia GPUs, Huang sees traditional computing needing GPU acceleration as well:

You can tell by the CSPs extending and many data centers, including our own for general-purpose computing, extending the depreciation from four to six years. There's just no reason to update with more CPUs when you can't fundamentally and dramatically enhance its throughput like you used to. And so you have to accelerate everything.

Nvidia lent some credence to that assertion by telling investors that about 40% of its data center revenue went to inference last year. Inference is when a trained model responds to a prompt -- in other words, when generative AI models are used in the real world. Some had thought that while training required high-powered GPUs, inference could be handled by lower-power CPUs from competitors. However, Nvidia's disclosure shows its GPUs aren't just needed for training; they may be the most efficient choice for inference as well.

And not only will enterprises be building out AI factories, but countries and regions will be as well. Huang calls this "sovereign AI." This comes as governments around the world work with their own sovereign data in their own language reflecting their own culture as AI becomes more human-like and embedded in society.

Add it all up, and numbers get big

What does this all add up to for Nvidia's financials? Key industry participants have put forward their cases for the AI market outlook. For his part, Jensen Huang estimates the roughly $1 trillion worth of data centers across the world will all be replaced by accelerated computers, and that the AI transition plus the broadening of customers (i.e. sovereign AI) will push that total market to $2 trillion.

If one anticipates a five-year lifespan for accelerated computing servers, that equates to about $400 billion per year in AI data center revenue.
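That back-of-the-envelope math can be sketched as follows, using Huang's projected $2 trillion market and the assumed five-year replacement cycle (both figures from the article's scenario, not forecasts of mine):

```python
# Back-of-the-envelope: annualize the projected accelerated-computing
# installed base over an assumed five-year server replacement cycle.
installed_base_usd = 2_000_000_000_000  # Huang's projected $2T total market
server_lifespan_years = 5               # assumed server lifespan

annual_revenue_usd = installed_base_usd / server_lifespan_years
print(f"${annual_revenue_usd / 1e9:.0f} billion per year")  # $400 billion per year
```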

[Image: Nvidia-labeled servers in a data center.]

Image source: Nvidia.

This is roughly in line with rival Advanced Micro Devices (AMD 2.37%) CEO Lisa Su's outlook. Back in December, she said she expects the AI accelerator market to reach $400 billion by 2027, up from her previous forecast of just $150 billion. And while that figure includes more edge devices such as AI PC chips (Nvidia makes GPUs for PCs, too, with its RTX line), data center chips will likely make up the vast majority of that $400 billion.

Nvidia clearly dominates the AI accelerator market right now, with a first-mover advantage in both technology innovation and its CUDA software ecosystem. Estimates of its current share of the AI chip market run between 80% and 95%. Of course, rival AMD just unveiled its MI300 accelerators, its first chips aimed squarely at AI workloads, last quarter, and cloud giants and start-ups alike will no doubt be gunning for a piece of this exciting market.

Still, I'd lean toward Nvidia hanging onto a large portion of the market. Maybe not 95%, but certainly a majority. And a lot of it has to do with its software.

A $1 billion run rate just the start?

While Nvidia is known for its chips, its software chops could be what sets it apart and bolsters its economic moat. Nvidia invented the CUDA software library almost 20 years ago to program its graphics chips for data processing, and it has recently come out with other compelling software offerings as well. These include Nvidia Omniverse for digital-twin modeling of physical objects, and Nvidia DRIVE for autonomous driving.

But the most consequential software feature is likely something called Nvidia AI Enterprise. Discussed on the last earnings call, Nvidia AI Enterprise is essentially a new operating system that allows cloud software to run on accelerated GPU chips.

Huang explained that in the cloud era, the largest enterprise software companies had large engineering teams that deployed complex software on traditional CPU-based cloud infrastructure, or they could get help from the large cloud service providers (CSPs). But many top software companies don't have the engineers to do the same for accelerated GPU infrastructure, nor, for the most part, do the CSPs.

Huang described Nvidia AI Enterprise as a containerized operating system that will help software companies run their AI-infused applications on Nvidia chips in a seamless fashion. Nvidia charges $4,500 per GPU per year for the service, and it's already at a $1 billion run rate. One can imagine that if every software company embeds generative AI within its offerings, as one should probably expect, this new software could become a big business.

Outlook 2028

If the AI chip market achieves the $400 billion level and Nvidia retains, say, 60% to 80% of the market, that equates to somewhere between $240 billion and $320 billion in revenue. Those levels are multiples higher than last year's $60 billion in total revenue, and that's even after revenue surged a staggering 126% last year.
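A quick sketch of that revenue scenario, using the article's $400 billion market size and 60% to 80% share assumptions, shows the implied multiple over last year's $60 billion in total revenue:

```python
# Hypothetical 2028 revenue range: the article's assumed $400B AI chip
# market at 60%-80% Nvidia share, compared with last year's ~$60B revenue.
market_usd = 400e9
prior_year_revenue = 60e9

for share in (0.60, 0.80):
    revenue = market_usd * share
    print(f"{share:.0%} share -> ${revenue / 1e9:.0f}B, "
          f"{revenue / prior_year_revenue:.1f}x last year's revenue")
# 60% share -> $240B, 4.0x last year's revenue
# 80% share -> $320B, 5.3x last year's revenue
```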

Of course, Nvidia has also impressed in expanding its product set over time, especially into software, which tends to carry high margins and garner higher multiples in the market. And Nvidia still has incremental revenue and profits from its gaming, visualization, and auto chip segments, too.

A big question is Nvidia's achievable margins. While some expect net income margins to come down from the remarkable 58% posted last quarter, I wouldn't rule out the company maintaining something close to that figure, given Nvidia's massive global scale and pricing power.

Even at, say, a 50% net margin, it wouldn't be a surprise to see Nvidia earning $150 billion in net income five years out, as outlandish as that number may seem today. And today's $2.18 trillion market cap amounts to just 14.5 times those theoretical future profits.
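The valuation math works out as follows, assuming $300 billion in revenue (within the $240 billion to $320 billion range above) at the article's hypothetical 50% net margin:

```python
# Hypothetical five-year-out valuation math using the article's scenario
# inputs (not forecasts): ~$300B revenue at a 50% net margin, measured
# against today's $2.18T market cap.
revenue_usd = 300e9        # assumed revenue within the $240B-$320B range
net_margin = 0.50          # hypothetical net margin near recent levels
market_cap_usd = 2.18e12   # market cap at the time of writing

net_income = revenue_usd * net_margin  # $150 billion
forward_pe = market_cap_usd / net_income
print(f"{forward_pe:.1f}x theoretical future earnings")  # 14.5x
```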

That's actually quite a reasonable valuation should Nvidia continue to dominate the AI market, and if that market evolves as major AI CEOs have projected.