In 2023, Nvidia's (NVDA -1.55%) H100 graphics processing unit (GPU) for the data center was the world's most popular artificial intelligence (AI) chip, granting the company a staggering 98% market share. Nvidia still has an edge over every other chip maker, but some of its biggest customers are now also buying AI chips from Advanced Micro Devices and Broadcom, which are quickly catching up from a technology perspective.
However, growing competition might not be Nvidia's greatest threat. Potentially more concerning is the fact that, during the fiscal 2026 second quarter (which ended on July 27), the company's top two customers accounted for a whopping 39% of its total revenue, up significantly from the year-ago period. This degree of revenue concentration could be Nvidia's greatest long-term risk, because its incredible run of growth could come to a very abrupt end if those customers decide to cut back on their AI data center spending.
Here's what investors need to know.

Image source: Nvidia.
Nvidia's chips remain the gold standard in AI development
Demand for data center GPUs probably won't slow down for the foreseeable future, because each new generation of AI models requires significantly more computing power than the last in order to process larger volumes of data at higher speeds.
According to Nvidia CEO Jensen Huang, the latest reasoning models (like OpenAI's GPT-5 and Anthropic's Claude 4) consume up to a thousand times more tokens (words and symbols) than older one-shot large language models (LLMs), because they spend more time "thinking" in the background to weed out errors before generating outputs.
Nvidia's latest Blackwell Ultra GB300 GPU is the gold standard for these new models right now, delivering up to 50 times more performance than the old H100 in certain configurations. These chips only recently started shipping to customers, and demand from some of the world's biggest tech companies could fuel significant growth in Nvidia's business over the next few quarters.
Nvidia faces significant concentration risk
Nvidia generated $46.7 billion in total revenue during the fiscal 2026 second quarter, which was an increase of 56% from the year-ago period. The data center segment accounted for 88% of that revenue, so AI GPUs are the company's most important product by a country mile.
Nvidia doesn't disclose who its customers are, but it does report some data on the concentration of its revenue base. During the second quarter, just two mystery customers represented a combined 39% of its total $46.7 billion in revenue:
| Customer | Proportion of Nvidia's Q2 Revenue |
| --- | --- |
| Customer A | 23% |
| Customer B | 16% |
Data source: Nvidia.
That means Customer A and Customer B spent a combined $18.2 billion with Nvidia during Q2. Only a small number of companies in the entire world have enough financial resources to keep that up. As I mentioned earlier, if these customers decide to cut back on their AI infrastructure spending, Nvidia would be highly exposed because it would be very difficult to replace such a large chunk of revenue.
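For readers who want to check the math, here's a minimal sketch in Python that works only from the figures Nvidia disclosed (total quarterly revenue and the two customer percentages); the customer labels are kept generic since Nvidia doesn't name them:

```python
# Illustrative arithmetic only, based on Nvidia's disclosed fiscal Q2 2026 figures.
total_revenue_billion = 46.7  # reported total revenue for the quarter
customer_shares = {"Customer A": 0.23, "Customer B": 0.16}  # disclosed revenue shares

# Dollar amount attributable to each mystery customer
for name, share in customer_shares.items():
    print(f"{name}: ~${total_revenue_billion * share:.1f} billion")

# Combined exposure to the two largest customers
combined_share = sum(customer_shares.values())
combined_dollars = total_revenue_billion * combined_share
print(f"Combined: ~${combined_dollars:.1f} billion ({combined_share:.0%} of revenue)")
```

Running it yields roughly $10.7 billion for Customer A, $7.5 billion for Customer B, and the combined $18.2 billion (39%) cited above.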
But here's the more concerning part: Customer A and Customer B together accounted for 25% of Nvidia's total revenue in the year-ago quarter, so the company's concentration risk is actually growing.
Customers A and B could be any of these tech giants
Although we can't definitively identify Nvidia's top customers, we can make some reasonable assumptions based on public disclosures by some of the world's top tech companies:
- Alphabet (GOOG -0.09%) (GOOGL -0.13%) will allocate up to $85 billion to capital expenditures (capex) during calendar year 2025, most of which will go toward AI data centers and chips.
- Meta Platforms (META 1.93%) has an AI capex budget of between $66 billion and $72 billion this year.
- Amazon (AMZN 1.23%) is on track to spend up to $118 billion on AI infrastructure this year based on the company's most recent guidance, which would be a record amount.
- Microsoft's (MSFT -1.13%) capex came in at $88 billion during its fiscal year 2025 (which ended on June 30), and it plans to spend even more in fiscal 2026.
Therefore, Customer A and Customer B could be any of the above companies. OpenAI, Oracle, and even Tesla could also be among Nvidia's top customers, but they have put forward much smaller capex budgets.
Despite their incredible scale, none of the above tech companies can spend at the current pace forever, so Nvidia's AI GPU sales are likely to shrink eventually. Fortunately, that probably won't happen any time soon: Jensen Huang thinks AI data center spending will total $4 trillion over the next five years, so there is still a long runway for growth.
Nvidia stock is attractively valued right now, so there could be upside on the table for investors who are willing to hold it for the next few years. However, it's important to stay vigilant by keeping a close eye on capex forecasts from the tech sector, because any weakness there could be an early sign that Nvidia's revenue growth is about to stall.