If you didn't think the demand for artificial intelligence (AI) could get any stronger, think again.

Four of the biggest tech companies in the world reported very strong demand for AI applications, and that's great news for Nvidia (NVDA -1.21%). The chipmaker designs the leading GPUs for training the large language models that power new generative AI applications. It could see a big increase in business based on recent management commentary from Tesla (TSLA 1.46%), Meta Platforms (META -0.33%), Microsoft (MSFT -0.34%), and Alphabet (GOOG 1.04%) (GOOGL 1.04%).

Here's what they had to say and what it all means for Nvidia and its investors.

Nvidia HQ. Image source: Nvidia.

Tesla plans to increase its Nvidia chip count by about 140%

Tesla plans to unveil its robotaxi, the Cybercab, in August. The vehicle relies on Tesla's full self-driving (FSD) capabilities, which use artificial intelligence to make real-time driving decisions.

To train that AI, Tesla has rapidly expanded its AI training infrastructure. CEO Elon Musk says the company doubled its training compute capacity sequentially in the first quarter. What's more, he expects Tesla's cluster of 35,000 Nvidia H100 GPUs to grow to about 85,000 by year's end -- an increase of roughly 140% -- just for training its AI.

That's a huge investment coming Nvidia's way. It's also a major endorsement for Nvidia from Musk, who had previously considered other chipmakers' GPUs.

Meta needs to spend more to become the leader in AI

Meta has been spending heavily to build out data centers to train its artificial intelligence, but now, it sees a good reason to spend even more.

Alongside its first-quarter earnings release, Meta said it now expects 2024 capital expenditures of $35 billion to $40 billion, up from its original forecast of $30 billion to $37 billion. And the company expects to spend even more than that in 2025.

That's because CEO Mark Zuckerberg sees an opportunity for Meta to "build leading AI models and be the leading AI company in the world."

Meta has made excellent progress in just a few years developing its Llama large language model. It recently released Llama 3 and rolled out its Meta AI chatbot alongside it. Zuckerberg thinks there are a lot of business opportunities for Meta if it can build an AI model used by hundreds of millions of people around the world.

But to get there, it has to spend heavily, and a lot of that spending will go toward Nvidia's chips, at least for the time being. Meta's also building its own custom silicon to train its AI.

Microsoft is seeing more demand for AI than it can serve

Meta's not the only company that's planning to increase its spending on AI infrastructure. Microsoft also expects a big step up in spending thanks to the growing demand for AI.

"Currently, near-term AI demand is a bit higher than our available capacity," CFO Amy Hood told analysts during Microsoft's fiscal third-quarter earnings call.

Microsoft set itself up as the leading public cloud service provider for AI application developers when it announced a $10 billion investment in OpenAI at the start of 2023. That move has paid off handsomely as the growing demand for AI has led Microsoft to take share from other public cloud competitors. But even as it invests in adding capacity, it seemingly hasn't been able to keep up.

Microsoft now plans to increase spending in the fiscal 2024 fourth quarter and in fiscal 2025 to meet the growing demand. It also builds its own AI products on OpenAI's GPT models, which power its Copilot software.

Again, a lot of that step-up in spending will go toward buying more Nvidia chips to serve its cloud customers. Microsoft, however, does design some of its own chips as well.

Alphabet is already spending big to stay on the cutting edge

There were many market-pleasing surprises in Alphabet's first-quarter earnings report, but one should stand out to Nvidia's investors.

Alphabet poured $12 billion into capital expenditures last quarter. That's nearly double what it spent during the year-ago period.

Alphabet's investments will primarily go toward adding capacity to Google Cloud and building and training more advanced AI models. Alphabet is also integrating AI into search -- it recently rolled out AI-powered answers at the top of results for certain search queries.

It's worth noting that Alphabet trains its Gemini models on its own chip designs (its Tensor Processing Units), but it will still direct some spending toward Nvidia because its cloud customers demand those GPUs.

What it all means for Nvidia and its investors

We're in the midst of an AI arms race. As Microsoft pointed out, demand for AI compute currently exceeds supply.

That's been an incredible boon for Nvidia, which can't produce chips fast enough to supply everyone who wants them. As a result, its pricing power has gone through the roof. Its gross margin climbed from 56.9% in fiscal 2023 to 72.7% in fiscal 2024. The comments from its fellow Magnificent Seven companies suggest demand continues to outpace supply, which bodes well for Nvidia going forward.

Even after an incredible run over the last couple of years, Nvidia could still benefit from the increased spending from big tech over the next two years. However, a lot of that spending is going toward developing and deploying those companies' own chip designs. Microsoft, Alphabet, Amazon, and Meta all design their own AI training and inference chips. Apple is reportedly working on its own AI data center chips as well.

As these big companies shift more of their AI workloads to their own chip designs, it'll take a big bite out of demand for Nvidia's products. And even strong demand from smaller enterprises without in-house chip designs may not be enough to offset that loss.

To be sure, the next few years will likely be very good for Nvidia, but its position at the top of the AI food chain is rather precarious. With shares trading at a forward price-earnings ratio of 35 as of this writing, investors are expecting very high sustained earnings growth for years to come. I'm not sold that Nvidia has the competitive moat to make that happen, even if the near-term demand appears extremely promising.