Semiconductor stocks have outperformed the broader market in 2023 thanks to booming demand for chips deployed in artificial intelligence (AI) applications. That demand explains why the PHLX Semiconductor Sector index is up 45% this year, well ahead of the 17% gain in the S&P 500 index.

Not surprisingly, PHLX Semiconductor Sector index constituents Nvidia (NVDA) and Intel (INTC) have also clocked solid gains. While Nvidia stock is up a massive 232% in 2023, Intel has delivered a relatively smaller gain of 38%.

Nvidia's terrific surge is a result of the pioneering role the company is playing in AI. Intel, meanwhile, is struggling on account of weak personal computer (PC) and data center sales, though it is trying to gain a foothold in the AI chip market. But can Intel really make a dent in Nvidia's AI supremacy and give investors a cheaper -- and better -- way to play the AI revolution? Let's find out.

Intel is just getting started in the AI chip race

Intel management spent a considerable amount of time discussing the company's AI strategy on its July earnings conference call. Chipzilla sees AI as one of the five key growth drivers for the broader semiconductor market through 2030. That's not surprising, as demand for AI chips is expected to grow at an annual pace of 29% through 2030, generating annual revenue of $304 billion by the end of the forecast period.

Next Move Strategy Consulting estimates that this market was worth $29 billion in 2022. So, the AI chip market is still in the early phases of its growth, and the good news is that Intel has already started gaining traction in this space. The company pointed out on the earnings call that its revenue pipeline from AI chips stands at more than $1 billion through 2024.

More importantly, Intel says that the pipeline is expanding thanks to growing interest in its AI chips. The company estimates that a quarter of its Xeon data center processors are now being used to tackle AI workloads, which is notable given that Intel dominated the server processor market with an estimated 82% share in the first quarter of 2023. Intel is also looking to capitalize on AI demand by adding dedicated AI capabilities to its next-generation Meteor Lake client processors.

With shipments of AI servers expected to increase at an annual pace of 22% through 2026, Intel's revenue from AI chips could rise on the back of this secular growth opportunity and its robust market share. At the same time, it is worth noting that Intel's AI revenue pipeline is quite small compared to its overall business.

Chipzilla generated $54 billion in revenue over the trailing 12 months, so a pipeline of just over $1 billion amounts to barely 2% of its business right now. Of course, that share could keep growing and exert a bigger influence on Intel's results, but investors should not forget that the company has other challenges to overcome as well.

Intel's second-quarter revenue fell 15% year over year to $12.9 billion, driven mainly by a 12% drop in revenue from its client computing group (CCG). The CCG is Intel's biggest source of revenue, accounting for 52% of the top line, and it is struggling because of weak PC sales. So while Intel may be gaining traction in AI, it will be some time before the company becomes a force to be reckoned with in this market compared to Nvidia. Here's why.

Nvidia is running away with the AI chip market

Intel's AI revenue pipeline pales in comparison to the sales Nvidia is already generating from this technology. The chipmaker generated $13.5 billion in revenue in the second quarter of fiscal 2024 (the three months ended July 30, 2023), and its data center business produced a record $10.3 billion of that total thanks to AI.

Nvidia's data center revenue surged 171% from the year-ago period, and the guidance for the current quarter suggests that the impressive growth is here to stay. In the words of Nvidia CFO Colette Kress on the August earnings conference call:

Demand for our data center platform for AI is tremendous and broad-based across industries and customers. Our demand visibility extends into next year. Our supply over the next several quarters will continue to ramp as we lower cycle times and work with our supply partners to add capacity.

In simpler terms, Nvidia is focused on increasing the supply of its AI-focused graphics cards to meet tremendous end-market demand. That's something the company needs to do aggressively, as its H100 data center graphics processing unit (GPU) is in massive demand, with wait times reportedly stretching to six months. This roughly $40,000 chip is being sought not only by established cloud players such as Meta Platforms, Amazon, Alphabet, and Microsoft, but also by start-ups and governments looking to get into the generative AI race.

Nvidia is expected to sell around 550,000 H100 GPUs this year, according to a report in the Financial Times. With the price of each H100 reportedly starting at $30,000, the company is on track to generate at least $16 billion in revenue from this piece of hardware (550,000 units multiplied by $30,000 works out to roughly $16.5 billion). That number could be significantly higher, as $30,000 is the entry-level price of the H100 and more powerful versions of the chip command higher prices.

Additionally, Nvidia has started shipping its Grace server processors aimed at accelerated computing and AI applications. This move could help the company make a dent in the AI server processor market at Intel's expense, as the Grace processors are expected to give customers looking to jump into the AI race a cost-effective alternative.

All this explains why Nvidia's top line is set to keep growing rapidly following an anticipated jump of 102% in the current fiscal year to $54 billion.

[Chart: NVDA revenue estimates for the current fiscal year. Data by YCharts.]

The verdict

Nvidia is way ahead of Intel in the AI chip race, as the financial performance of both companies discussed above makes clear. While Nvidia's revenue is expected to double this year, Intel's top line is expected to drop almost 17% to $52 billion. Nvidia, therefore, is on track to surpass Intel in revenue, and it is also well placed to extend its lead in the future.

As far as valuation is concerned, Nvidia looks expensive at 117 times trailing earnings. However, its forward earnings multiple of 55 is a sizable discount to Intel's forward multiple of 116. All this makes it clear that Nvidia is the better AI stock to buy right now: it is already winning big in the AI race and looks set to maintain its dominance in this market with new products.