Right now, Nvidia (NVDA -0.01%) is very nearly the only game in town for those wanting to run advanced artificial-intelligence (AI) workloads. The company's H100 data center GPUs are the gold standard, and nearly two decades of work building up the software ecosystem around its chips have given it an enormous advantage.

Demand for AI chips is so intense that Nvidia can charge essentially whatever it wants. The company's data center segment generated more than $10 billion of revenue in the fiscal second quarter, which ended on July 30, more than doubling from the first quarter. Overall gross margin was 70%, up from 44% one year prior, and adjusted earnings per share climbed 148% from the first quarter.

Despite these incredible results, which beat analyst estimates by a mile and came alongside guidance for even stronger revenue in the third quarter, the stock actually declined following the company's earnings report. One reason could be that investors don't believe Nvidia's AI chip business will remain so lucrative as competition heats up.

Intel, AMD, and the rest

Any market where one company generates outsize profits and where demand is exploding will eventually be flooded with competition. Nvidia's software ecosystem buys it time, but not a permanent monopoly.

Intel (INTC 1.46%) already has a viable competitor on the market. The company acquired Habana Labs in 2019, and that company's line of Gaudi AI chips is now garnering intense interest from prospective customers. The latest Gaudi2 chip isn't nearly as powerful as Nvidia's top-tier H100 GPU, but it's competitive when you factor in performance-per-dollar and total cost of ownership.

Intel disclosed with its latest quarterly report that its AI chip pipeline expanded sixfold in a single quarter to surpass $1 billion. While Intel also sells data center GPUs capable of handling some AI workloads, it's the Gaudi chips that are leading the way. The company plans to launch its next-generation Gaudi3 chips in 2024 to capitalize on this soaring demand.

Advanced Micro Devices (AMD 2.26%) is also gunning for Nvidia. AMD is set to launch a family of MI300 data center GPUs by the end of the year, with the ultra-powerful MI300X meant to go toe-to-toe with Nvidia's H100 for AI training workloads. The company is also doubling down on its open-source ROCm software platform, which rivals Nvidia's proprietary CUDA ecosystem.

Outside of chip companies, any major buyer of AI chips has a strong incentive to design its own accelerators. Some of the major cloud platforms have been doing this for years. Amazon has its AWS Inferentia accelerators, which are available through Amazon Web Services; Alphabet's Google is already on the fourth generation of its Tensor Processing Unit, which it claims beats Nvidia's previous-generation A100 GPU in both performance and power efficiency; and Microsoft, a major investor in OpenAI, is reportedly working on its own AI training chip.

Sky-high expectations

Nvidia is valued at around $1.1 trillion. Annualize the company's third-quarter revenue guidance (multiply the quarterly figure by four) and the stock trades for about 17 times sales. Do the same for its adjusted earnings per share and you get a price-to-earnings ratio of about 43.
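The annualization math above can be sketched in a few lines. The $1.1 trillion market cap comes from the article; the quarterly revenue guidance, adjusted earnings per share, and share price below are illustrative assumptions chosen to be consistent with the roughly 17x sales and 43x earnings multiples cited, not figures stated in the article.

```python
# Annualized valuation multiples: take one quarter's figure, multiply by
# four, and compare against market cap or share price.

market_cap = 1.1e12            # ~$1.1 trillion market cap (from the article)
q3_revenue_guidance = 16e9     # assumed ~$16B quarterly revenue guidance
quarterly_adjusted_eps = 2.70  # assumed quarterly adjusted EPS
share_price = 465.0            # assumed share price (illustrative)

# Price-to-sales on annualized guided revenue
annualized_revenue = q3_revenue_guidance * 4
price_to_sales = market_cap / annualized_revenue

# Price-to-earnings on annualized adjusted EPS
annualized_eps = quarterly_adjusted_eps * 4
price_to_earnings = share_price / annualized_eps

print(f"P/S: {price_to_sales:.1f}x")
print(f"P/E: {price_to_earnings:.1f}x")
```

With these assumed inputs, the script lands near the 17x sales and 43x earnings multiples the article describes; plugging in different guidance or EPS figures shifts the multiples proportionally.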

These valuation metrics don't look that bad given the soaring demand for AI chips, but you must assume that Nvidia's profit will hold up as competition intensifies. The AI market is in its gold rush phase right now. Money is being hurled at building out GPU clusters no matter the cost, and no matter whether there's a clear return on investment. This bonanza won't last forever.

Demand for AI chips seems likely to grow over the long run, and Nvidia may very well remain the dominant player. But competition will ensure that its profits are kept in check.