For the past three decades, Wall Street and investors have been on a never-ending quest to locate the next big investment trend. Some of these trends have completely changed the growth trajectory of the U.S. economy, such as the advent of the internet. For the time being, no innovation is captivating the attention of professional and everyday investors quite like artificial intelligence (AI).

Broadly speaking, AI involves the use of software and systems to handle tasks that would normally fall to humans. What makes AI so intriguing is machine learning, which allows software and systems to grow smarter over time, translating into higher proficiency at assigned tasks, or even the ability to learn new ones.

AI stocks have skyrocketed because the technology (in theory) has applications across virtually every sector and industry. That potential is what compelled the analysts at PwC to predict that AI will add close to $16 trillion to global gross domestic product by 2030.

A human face emerging from a sea of pixels, which is representative of artificial intelligence.

Image source: Getty Images.

No company has directly benefited from the AI revolution more than Nvidia

Make no mistake about it: semiconductor company Nvidia (NVDA 6.18%) is the undeniable face of the artificial intelligence revolution.

Over the course of a little more than a year, Nvidia has established itself as the infrastructure foundation of AI-accelerated data centers. The company's high-powered A100 and H100 graphics processing units (GPUs) are staples of enterprise-operated high-compute data centers. According to lofty expectations issued last year by analysts at Citigroup, Nvidia could account for a 90% share of GPUs in use in AI-accelerated data centers in 2024.

Another factor helping Nvidia is that enterprise demand has overwhelmed the supply of its AI GPUs. The law of supply and demand states that when the supply of a good is constrained and demand is high, the price of that good will rise. The simple fact that data center sales have grown many times faster than Nvidia's cost of revenue demonstrates that the company enjoys exceptional pricing power on its top-notch GPUs.

The potential good news in the current fiscal year -- Nvidia's fiscal 2025 began on Jan. 29, 2024 -- is that GPU production should meaningfully ramp up. Leading chip fabrication company Taiwan Semiconductor Manufacturing is sizably increasing its chip-on-wafer-on-substrate (CoWoS) packaging capacity. High-bandwidth memory is a practical necessity for AI-accelerated data centers, and it's packaged with GPUs via CoWoS. With this supply chain constraint easing, Nvidia should be able to meet the demand of more enterprise customers this year.

But in spite of this seemingly perfect scenario, Wall Street's AI darling may be headed for disaster.

This is the Nvidia statistic that should have optimists doing a double-take

There is no shortage of reasons why Nvidia could struggle to sustain a market cap in excess of $2 trillion. Just this week I pointed to its historically lofty valuation relative to trailing-12-month sales, as well as the tendency of next-big-thing trends to navigate early-stage bubbles, as viable reasons to be skeptical of Nvidia's surging stock.

But arguably nothing is scarier than Nvidia's customer concentration.

Prior to Nvidia reporting its fiscal fourth-quarter results on Feb. 24, Bloomberg and Barclays Research revealed Nvidia's top customers as a percentage of total sales. The company's four largest customers account for 40% of total revenue and include:

  • Microsoft: 15%
  • Meta Platforms (META 0.43%): 13%
  • Amazon: 6.2%
  • Alphabet: 5.8%
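Taken at face value, the 40% figure is simply the sum of the four reported shares. A quick back-of-the-envelope check, using only the Bloomberg/Barclays percentages listed above:

```python
# Reported top-customer shares of Nvidia's total revenue
# (per Bloomberg and Barclays Research, as cited above)
shares = {
    "Microsoft": 15.0,
    "Meta Platforms": 13.0,
    "Amazon": 6.2,
    "Alphabet": 5.8,
}

# Sum the individual shares to confirm the headline concentration figure
total = sum(shares.values())
print(f"Top four customers: {total:.1f}% of total revenue")  # → 40.0%
```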

Customer concentration isn't, by itself, an issue. In fact, you can make a solid argument that having some of the world's most influential innovators as your top customers is an advantage for Nvidia.

A neat row of graphics processing units.

Image source: Getty Images.

The problem for the infrastructure backbone of the AI movement is that its four largest customers by revenue are actively developing AI chips of their own to complement the Nvidia GPUs they've been purchasing for their high-compute data centers. It's a pretty clear indication that these four companies aim to lessen their reliance on Nvidia's GPUs in subsequent years, or would prefer to ditch Nvidia's infrastructure entirely in favor of internally developed technology.

The other issue is that even if Microsoft, Meta, Amazon, and Alphabet continue to purchase Nvidia's GPUs, we're more than likely witnessing a peak in orders in 2024.

For example, Meta spent roughly $27.3 billion on property and equipment last year. The 350,000 H100 GPUs the company is purchasing from Nvidia will cost up to $10.5 billion. That's a sizable percentage of Meta's annual capital expenditures tied to a single purchase, one that almost certainly won't be duplicated in future years. CEO Mark Zuckerberg has noted that his company will have "around 600,000 H100 equivalents of compute if you include other GPUs" by the end of 2024. Presumably, Meta's chief is intimating that the company will pair its in-house-developed AI chips with Nvidia's H100 GPUs.
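The scale of that single line item is easy to sanity-check. A rough sketch using the figures above (the $10.5 billion GPU outlay, 350,000 units, and roughly $27.3 billion in capital expenditures come from the article; the implied per-unit price and capex share are just arithmetic):

```python
# Figures cited in the article
h100_units = 350_000      # H100 GPUs Meta is purchasing from Nvidia
gpu_spend = 10.5e9        # up to $10.5 billion on those GPUs
meta_capex = 27.3e9       # Meta's property-and-equipment spend last year

# Implied price per GPU and the share of annual capex this order represents
price_per_gpu = gpu_spend / h100_units
share_of_capex = gpu_spend / meta_capex

print(f"Implied price per H100: ${price_per_gpu:,.0f}")  # → $30,000
print(f"Share of annual capex:  {share_of_capex:.0%}")   # → 38%
```

In other words, at the upper end, this single order implies roughly $30,000 per GPU and consumes close to two-fifths of a year's capital budget, which underscores why a repeat at this scale looks unlikely.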

The point is that Nvidia's top customers are either moving away from its GPU technology or are highly unlikely to sustain their existing order activity beyond the current year. That's a potentially scary scenario for Wall Street's hottest AI stock, and one that's receiving very little attention.