Nvidia (NASDAQ: NVDA) has built an artificial intelligence (AI) chip empire over the past several years. Originally known for its video game graphics chips, Nvidia pivoted to focus on AI, and that has become its core business. The move has proven smart, fueling enormous revenue growth -- annual revenue has increased by 2,500% over the past decade.
The tech company's early entry into the market secured its leadership, and ongoing innovation has kept it there. Still, some investors worry about the competition Nvidia will face in the coming years, both from other chip designers and from its own customers -- some, such as Amazon, have developed chips of their own that they use and offer alongside those of other chipmakers.
Should you worry about Nvidia's AI market leadership moving forward? The following 21 words from Nvidia CEO Jensen Huang offer a strikingly clear answer.
Image source: Getty Images.
An early bet on AI
Before diving in, though, let's take a quick look at the Nvidia success story and the growth of competition. As mentioned, Nvidia placed an early bet on the AI chip market by developing graphics processing units (GPUs) suited to the technology's needs, and it has since broadened its offerings into a full range of AI products and services. The result: Revenue has soared to record levels, topping $130 billion in the latest fiscal year, and the company has remained highly profitable on those sales, generally maintaining gross margins above 70% in recent quarters.
But, as mentioned, Nvidia isn't the only game in town when it comes to AI chips. It faces competition from chip designers such as Advanced Micro Devices and Broadcom, as well as from its own customers, such as Amazon. Amazon's cloud unit, Amazon Web Services (AWS), has developed its own line of chips, offering an option to cost-conscious customers. And Alphabet, owner of Google Cloud, is another Nvidia customer that has developed its own chips. In recent earnings reports, each of these players has spoken of high demand from AI customers for these products.
So competition is clearly something investors might worry about as the AI story develops. Before we get to Jensen Huang's comment, here's one important note about how the company's AI chips are used.

Training LLMs
Initially, Nvidia's GPUs were used mainly to train large language models (LLMs) -- the stage in which a model absorbs the information it needs to answer questions or solve problems. That's still the case, but today and going forward, the company's high-powered GPUs also supply the compute LLMs need to "think" and "reason" as they respond to users. This stage is called inference, and Nvidia is betting it will be a major growth area for AI chips.
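For readers who want to see that distinction in concrete terms, here's a minimal sketch in PyTorch using a hypothetical toy model (not Nvidia software or any real LLM): training repeatedly updates a model's weights from data, while inference freezes those weights and simply runs the model forward to produce answers.

```python
import torch
import torch.nn as nn

# Hypothetical toy "language model": maps a context vector to next-token scores.
vocab_size, hidden = 100, 32
model = nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(), nn.Linear(hidden, vocab_size))

# --- Training: compute-heavy, runs for many steps, updates the model's weights ---
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for step in range(100):                        # real training runs vastly longer
    contexts = torch.randn(64, hidden)         # stand-in for batches of text data
    next_tokens = torch.randint(0, vocab_size, (64,))
    loss = loss_fn(model(contexts), next_tokens)
    optimizer.zero_grad()
    loss.backward()                            # gradients flow backward through the model
    optimizer.step()                           # weights get updated -- this is "learning"

# --- Inference: weights are frozen; the model only runs forward to answer ---
model.eval()
with torch.no_grad():                          # no gradients, no weight updates
    prompt = torch.randn(1, hidden)            # stand-in for a user's question
    predicted_token = model(prompt).argmax(dim=-1)
print("Predicted next token id:", predicted_token.item())
```

The practical point: training compute is spent building the model, while inference compute is spent every time the model answers a question.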
In recent benchmark tests designed by MLCommons, a consortium of AI experts, Nvidia's Blackwell GPUs delivered the strongest inference performance and lowest total cost of ownership across models and use cases. Running the DeepSeek R1 model, Blackwell delivered 10 times higher performance per watt and 10 times lower cost per token than Nvidia's previous-generation Hopper chips. (Tokens are the bits of data handled during training and inference.)
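To see why a metric like cost per token matters, here's a back-of-the-envelope calculation in Python. The throughput and hourly cost figures are made-up assumptions for illustration only, not benchmark results from Nvidia or MLCommons.

```python
# Back-of-the-envelope "cost per token" math. All numbers are hypothetical
# assumptions for illustration -- they are not Nvidia or MLCommons figures.
tokens_per_second = 1_000        # assumed throughput of one GPU server
hourly_cost_dollars = 10.0       # assumed all-in hourly cost to run that server

tokens_per_hour = tokens_per_second * 3_600
cost_per_million_tokens = hourly_cost_dollars / tokens_per_hour * 1_000_000
print(f"Baseline: ${cost_per_million_tokens:.2f} per million tokens")

# A chip with 10x the throughput at the same hourly cost serves tokens at 1/10th the cost.
cost_with_10x_throughput = hourly_cost_dollars / (tokens_per_hour * 10) * 1_000_000
print(f"With 10x throughput: ${cost_with_10x_throughput:.2f} per million tokens")
```

At data center scale, where trillions of tokens are served, small differences in cost per token compound into large differences in operating expense, which is why buyers weigh these benchmarks heavily.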
Words from Jensen Huang
Jensen Huang's message regarding that performance is clear:
"It's gonna take a long time before somebody is able to take that on," Huang said during the company's earnings call last month. "And our leadership there is surely multiyear."
So Nvidia continues to offer AI customers the most powerful compute, and that matters to many players aiming to reach their goals fast -- and to gain cost savings and efficiency over time. Of course, there is plenty of room in the market for Nvidia's competitors to carve out share: Certain customers may not require the most powerful GPU, and even major customers might prefer other chips, such as Broadcom's custom AI chips, for certain jobs.
But Nvidia's chips continue to stand out as the most powerful option, and the company's commitment to innovation should keep it that way. It's also worth noting that inference could become a big market, since it powers LLMs every time they carry out their tasks -- not just once during training.
Let's get back to our question: Should you worry about Nvidia's market dominance? Huang's recent comment leads us to a strikingly clear answer -- "no." The tech giant remains on track to dominate the market and to win in the promising area of inference over the long term.





