Nvidia (NVDA 3.34%) has been a massive beneficiary of the surge in demand for artificial intelligence (AI) applications that started at the end of 2022. Cloud computing giants have been lining up to get the company's data center graphics processing units (GPUs) to train and power their large language models (LLMs).

For Meta Platforms (META 0.57%), Microsoft, Amazon, and many others, Nvidia has been the go-to provider of AI chips. Notably, these tech giants have been willing to wait as long as a year between order and delivery to procure Nvidia's chips, and they have been paying top dollar for them. Other chipmakers such as Intel and Advanced Micro Devices (AMD 1.31%) have been left in the dust in this niche: By some estimates, Nvidia controls a whopping 95% of the AI chip market.

As a result, Nvidia's earnings and revenue multiplied rapidly. However, some of its customers are making concerted efforts to reduce their reliance on its chips.

Making AI chips in-house

Nvidia's success in the AI GPU market can be credited to its A100 processor, which it launched in 2020. The graphics chip specialist built this GPU for high-performance computing applications, and it was manufactured using a 7-nanometer (nm) process node. OpenAI reportedly deployed thousands of A100 chips to train ChatGPT.

Interestingly, near the end of 2021, rival AMD began offering a competing data center accelerator built on a 6nm process node -- the MI250X. However, the A100 reportedly outperformed the newer AMD chip in LLM training tasks, per third-party estimates.

Then in 2022, Nvidia upped its game with the H100 processor, which is built on a custom 5nm process. The company packed 80 billion transistors into the chip, versus 54 billion on the A100, and the H100 turned out to be significantly more powerful than its predecessor. AMD, on the other hand, didn't deliver its next competing chip, the MI300, until the end of 2023.

This explains why Nvidia's H100 was in terrific demand last year, driving $47.5 billion in data center revenue for the company in its fiscal 2024, up from $15 billion the previous year. Meta alone shelled out billions of dollars to Nvidia for H100s, and it wasn't the only big buyer to do so.

However, the lack of a potent alternative to the H100, its high price, and its limited availability explain why some of Nvidia's top customers started in-house AI chip development efforts to reduce their reliance on the chipmaker. Meta Platforms, for instance, recently announced the second generation of its own AI chip, which is built on a 5nm process node.

According to Meta, the new chip "more than doubles the compute and memory bandwidth of our previous solution while maintaining our close tie-in to our workloads. It is designed to efficiently serve the ranking and recommendation models that provide high-quality recommendations to users."  

Moreover, Meta plans to continue its in-house chip development program as it looks to reduce the operating and development costs of its AI servers.

Something similar is happening at Microsoft. The tech giant revealed two custom chips toward the end of 2023, one of which is a 5nm AI accelerator called Maia 100. This AI chip reportedly packs 105 billion transistors and has been built for running AI workloads in the cloud, including LLM training and inference.

Amazon, too, has gone down the in-house AI chip development route. In November, it revealed its latest offering, Trainium2, which it claims is four times more powerful than its predecessor. Amazon Web Services customers have the option of using these chips to train AI models. Meanwhile, Alphabet has jumped on the bandwagon with its newly revealed Axion, a custom Arm-based processor.

Given that Meta, Microsoft, Google, and Amazon were among the top buyers of H100 processors last year, their focus on in-house chip development is no doubt a threat to the semiconductor giant's bottom line.

Investors, however, should focus on the bigger picture

While it is true that Nvidia's customers are looking to reduce their reliance on it, the fact remains that they are expected to continue buying its powerful GPUs. For instance, when Nvidia announced the launch of its next-generation Blackwell AI GPUs last month, all the companies mentioned above said they would deploy the new chips once they were available.

That's not surprising. Nvidia's upcoming GPUs are expected to be significantly more powerful, enabling customers to train even bigger LLMs. The chipmaker claims that the Blackwell GPU can run LLMs "at up to 25x less cost and energy consumption than its predecessor." Given that it is likely to price these new GPUs competitively relative to the H100, Nvidia's customers could see stronger returns on their AI hardware investments with Blackwell processors.

As a result, demand for Nvidia's AI chips could remain robust. Another reason Nvidia could remain the dominant player in the AI chip market is its control over the supply chain. Nvidia's customers and rivals are turning to foundry giant TSMC to manufacture their own AI chips, but Nvidia reportedly consumes 60% of TSMC's advanced chip packaging capacity.

Of course, TSMC is looking to increase its capacity to meet the demand from Nvidia and other customers, but the GPU specialist is likely to lock up the biggest share of the foundry's added output, considering the massive lead it already enjoys in the AI chip market.

So, even if other big tech players continue their chip development efforts, Nvidia is likely to remain the top AI chip player for quite some time. Japanese investment bank Mizuho estimates that Nvidia could sell $280 billion worth of AI chips in 2027 in an overall market it projects will hit $400 billion. That works out to roughly a 70% share -- down from the estimated 95% Nvidia commands today, but still a dominant position in a much larger market. So while Mizuho is forecasting that Nvidia's share of the AI chip market will come down over the next three years, it expects the company's data center revenue to rise significantly.

As such, Nvidia's data center revenue is likely to keep growing at a healthy pace thanks to the secular growth opportunity in AI chips, even if the company loses some market share. That's why investors shouldn't worry much about the chip development moves of Nvidia's customers. Instead, considering the impressive catalysts the company is sitting on, they should view its recent pullback as an opportunity to buy more shares.