The booming demand for artificial intelligence (AI) chips has sent shares of Advanced Micro Devices (AMD) flying in the past year, even though the chipmaker's AI-related business had not yet gained traction in 2023. That's probably because investors didn't want to miss out on a chance to capitalize on this fast-growing market currently being dominated by Nvidia.

AMD was late to the AI chip market in 2023, with tech giants lining up to buy Nvidia's graphics processing units (GPUs) to train large language models such as the ones powering ChatGPT. However, AMD did manage to squeeze into this lucrative market, which is expected to generate a massive $384 billion in annual revenue in 2032, clocking a compound annual growth rate of 38%. AMD management said last year that its efforts in this segment of the business should start to pay off in 2024.

So far, Wall Street isn't satisfied with AMD's AI-related revenue guidance for the year, which fell short of expectations. However, there is more here than meets the eye, and a closer look at the company's AI business suggests it could ramp up significantly this year.

AMD's AI guidance isn't all that bad

In October last year, AMD forecast that it would generate $400 million in revenue from sales of its MI300 series of AI GPUs in the fourth quarter of 2023. That quarter marked the beginning of AMD's AI-related revenue ramp, as the MI300 family of processors launched in December.

However, AMD CEO Lisa Su said on the company's fourth-quarter 2023 earnings conference call that AMD's "Data Center GPU business accelerated significantly in the quarter, with revenue exceeding our $400 million expectations, driven by a faster ramp for MI300X with AI customers."

AMD says that it is working with the likes of Microsoft, Meta Platforms, Oracle, and others to deploy its Instinct MI300 family of data center GPUs. This explains why Su now expects AMD's data center GPU revenue to exceed $3.5 billion in 2024. That's a big bump from the prior expectation of $2 billion, clearly indicating that customers are warming up to its new chips.

In other words, AMD's quarterly revenue run rate from the data center GPU business could stand at almost $900 million in 2024, based on the $3.5 billion annual revenue forecast for this segment. That's more than double the revenue AMD generated from data center GPU sales last quarter.
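As a quick sanity check, here is the back-of-the-envelope math behind that figure, using only the numbers cited above (the $3.5 billion full-year forecast and the roughly $400 million booked last quarter):

```python
# Back-of-the-envelope check of AMD's implied data center GPU run rate,
# using only the figures cited in this article.
full_year_forecast = 3.5e9    # 2024 data center GPU revenue guidance ($3.5 billion)
last_quarter_revenue = 0.4e9  # roughly $400 million in Q4 2023

quarterly_run_rate = full_year_forecast / 4
print(f"Implied quarterly run rate: ${quarterly_run_rate / 1e6:.0f} million")           # ~$875 million
print(f"Multiple of last quarter:   {quarterly_run_rate / last_quarter_revenue:.1f}x")  # ~2.2x
```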

Analysts, however, were expecting AMD to guide for $4 billion to $8 billion in AI revenue for 2024, so the company's forecast fell short of that mark. Savvy investors should note that AMD significantly increased its AI chip revenue guidance in the space of just one quarter. It won't be surprising to see the company raise its AI revenue guidance further as the year progresses, given that it has "made significant progress with our supply chain partners and have secured additional capacity to support upside demand."

Why the company could continue to raise its AI guidance

Nvidia is the leading player in the AI chip market with a market share of more than 80%. AMD, however, is expected to make its presence felt in this market in 2024 and corner between 15% and 25% of the AI chip market.

However, there are a couple of factors that could help AMD corner an even bigger share of the AI chip space.

First, according to a DigiTimes report, AMD could corner a significant portion of foundry partner Taiwan Semiconductor Manufacturing's advanced packaging capacity so that it can make more AI chips. More specifically, the Taiwanese publication expects that AMD's share of TSMC's advanced packaging capacity could be half of Nvidia's in 2024. If that's indeed the case, and assuming the two companies account for essentially all of that capacity, AMD could end up with roughly a third of TSMC's advanced packaging capacity for making AI chips.
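To make that back-of-the-envelope step explicit, the split below assumes (as the argument implicitly does) that Nvidia and AMD together take up essentially all of TSMC's advanced packaging capacity earmarked for AI chips; the numbers are illustrative ratios, not disclosed allocations:

```python
# Rough capacity split implied by the DigiTimes report, assuming Nvidia and AMD
# together account for essentially all of TSMC's AI-related advanced packaging capacity.
nvidia_allocation = 2  # arbitrary units
amd_allocation = 1     # reportedly half of Nvidia's allocation

total = nvidia_allocation + amd_allocation
print(f"AMD's share:    {amd_allocation / total:.0%}")     # ~33%, i.e., about a third
print(f"Nvidia's share: {nvidia_allocation / total:.0%}")  # ~67%
```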

Second, if AMD does manage to manufacture a sizable number of AI chips this year, it could fill the supply gap in this market. That's because Nvidia's AI processors reportedly command a waiting period of nine to 12 months. Given the robust specs of AMD's flagship MI300X processor and its ability to provide stiff competition to its Nvidia counterpart, customers looking to get their hands on fast AI hardware could turn to AMD.

Raymond James analyst Srini Pajjuri estimates that AMD could ship somewhere between 250,000 and 500,000 units of its MI300 Instinct data center chips this year. AMD hasn't revealed the price point of its latest AI chips, but there is a good chance that it has priced them competitively to win share from Nvidia, whose flagship H100 processor is reportedly priced between $25,000 and $40,000 depending on the configuration.

Assuming AMD prices its AI processors at $25,000 -- the lower end of the H100's price range -- and sells 250,000 units this year (based on Raymond James' estimate), it could generate $6.25 billion in revenue from this market. The higher end of Raymond James' shipment estimate indicates that AMD's data center GPU revenue could land at $12.5 billion.
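For clarity, here is that scenario math as a short sketch; the $25,000 average selling price is an assumption borrowed from the low end of the H100's reported range, since AMD has not disclosed MI300X pricing:

```python
# Hypothetical revenue scenarios built on Raymond James' shipment estimates and an
# assumed $25,000 average selling price; AMD has not disclosed actual MI300X pricing.
assumed_price = 25_000  # assumption, not a disclosed figure

for units in (250_000, 500_000):
    revenue = units * assumed_price
    print(f"{units:,} units x ${assumed_price:,} = ${revenue / 1e9:.2f} billion")
# 250,000 units -> $6.25 billion; 500,000 units -> $12.50 billion
```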

So, there is a good chance that AMD may have played it safe while issuing its AI-related revenue guidance for the year. As the company gets its hands on more supply and receives greater customer commitments to deploy its AI processors, it could continue raising its AI revenue guidance as the year progresses.

So, investors would do well to look past AMD's latest results, since its fast-growing AI chip business is anticipated to drive robust growth in 2024 and beyond. This is evident from the following chart:

[Chart: AMD revenue estimates for the current fiscal year; data by YCharts.]

It is worth noting that AMD's annual revenue was down 4% in 2023 to $22.7 billion, and only around $400 million of that came from sales of AI chips. So, there is a good chance that AMD grows at a faster pace than Wall Street expects this year considering the significant jump in AI revenue, and that could help this tech stock sustain its rally in the long run.
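As a rough illustration, and assuming the rest of the business simply holds flat (a simplification, not a forecast), here is how much the guided AI ramp alone could add to that $22.7 billion base:

```python
# Rough estimate of the growth contribution from the guided AI ramp alone,
# holding the rest of the business flat (a simplifying assumption).
revenue_2023 = 22.7e9    # total 2023 revenue
ai_revenue_2023 = 0.4e9  # roughly $400 million from AI chips in 2023
ai_revenue_2024 = 3.5e9  # 2024 data center GPU guidance, which could prove conservative

incremental_ai = ai_revenue_2024 - ai_revenue_2023
print(f"Incremental AI revenue: ${incremental_ai / 1e9:.1f} billion")                   # ~$3.1 billion
print(f"Growth contribution:    {incremental_ai / revenue_2023:.0%} of 2023 revenue")   # ~14%
```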