On Tuesday afternoon, Advanced Micro Devices (AMD) released its fourth-quarter 2023 report. The chipmaker's quarterly results didn't contain any notable surprises, but its first-quarter 2024 revenue guidance was lighter than Wall Street had been expecting. Investors drove shares down 2.5% on Wednesday in response.

That stock drop probably would have been steeper had management not provided good news on the earnings call about its recently launched Instinct MI300-series graphics processing units (GPUs) for data centers. These chips are optimized to accelerate artificial intelligence (AI) workloads, a fast-growing market that Nvidia dominates.

In Q4, AMD's revenue grew 10% year over year to $6.17 billion, slightly surpassing the analyst consensus estimate of $6.13 billion. Adjusted earnings per share (EPS) increased 12% to $0.77, which was in line with Wall Street's projection. For the first quarter of 2024, management guided for revenue growth of about 1% year over year, missing the 7% growth analysts had expected.

Earnings releases tell only part of the story. Below are two key things from AMD's Q4 earnings call that you should know.

1. Data center AI accelerator chips will have a total addressable market (TAM) of $400 billion by 2027

From CEO Lisa Su's remarks:

In the data center, we see 2024 as a start of a multiyear AI adoption cycle with the market for data center AI accelerators growing to approximately $400 billion in 2027. ... [As to what's included in that TAM,] it is accelerator chips. It is not systems. ... [I]t includes memory and other things that are packaged together with the GPUs.

Before we explore this quote, here is some context: In early December 2023, AMD launched two Instinct MI300 data center AI chips: the MI300X GPU, aimed at generative AI applications, and the MI300A hybrid GPU-CPU, focused on high-performance computing on supercomputers. (Generative AI is a type of AI where computer systems rapidly create new content from a variety of inputs. It's been getting a ton of buzz since OpenAI released its ChatGPT chatbot in late 2022.)

At the time AMD launched these chips, Su pegged the 2023 TAM for data center AI chips at about $45 billion. Growing from $45 billion in 2023 to her estimate of $400 billion in 2027 would equate to an eye-popping compound annual growth rate (CAGR) of 73%.
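
For readers who want to verify that figure, here is a quick back-of-the-envelope check. It is only a sketch using the two endpoints cited above (AMD's roughly $45 billion 2023 TAM estimate and Su's $400 billion 2027 projection); the variable names are mine.

```python
# Quick check of the implied CAGR for data center AI accelerator chips,
# using the TAM figures cited above: ~$45B in 2023 growing to ~$400B in 2027.
tam_2023 = 45.0   # billions of dollars (AMD's 2023 TAM estimate)
tam_2027 = 400.0  # billions of dollars (Su's 2027 TAM projection)
years = 2027 - 2023

# CAGR = (end / start) ** (1 / years) - 1
cagr = (tam_2027 / tam_2023) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 72.8%, i.e. about 73%
```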

Granted, Su's 2027 TAM projection might prove too high (or too low, for that matter), so don't get too hung up on the exact percentage. The main point, which I agree with, is that data center AI chip sales will grow at a torrid pace for at least the next several years.

Market leader Nvidia is poised to continue to benefit tremendously from such rapid market growth. But such a quickly ballooning market means there should be plenty of room for AMD to also handily benefit, assuming its MI300 chips at least meet customers' expectations. Early indications are that they do, according to management.

Su said she believes GPUs will remain the "compute element of choice when you're talking about [AI] training and inferencing on the largest language models."

2. 2024 guidance for data center GPU sales raised by 75%

From Su's remarks:

Looking ahead, our prior guidance was for data center GPU revenue to be flattish from Q4 to Q1 and exceed $2 billion for 2024. Based on the strong customer pull and expanded engagements, we now expect data center GPU revenue to grow sequentially in the first quarter and exceed $3.5 billion in 2024.

AMD management issued the initial $2 billion guidance in early December, when it launched the MI300 chips. It has now raised that outlook by 75% because early market uptake of these chips has been notably stronger than it initially projected.
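
As a quick sanity check on the 75% figure in the heading above, here is a small sketch that uses only the $2 billion and $3.5 billion numbers from the call; the variable names are mine.

```python
# Check the size of AMD's guidance raise for 2024 data center GPU revenue.
prior_guidance = 2.0  # billions of dollars (initial December outlook)
new_guidance = 3.5    # billions of dollars (updated outlook on the Q4 call)

raise_pct = (new_guidance - prior_guidance) / prior_guidance
print(f"Guidance raised by {raise_pct:.0%}")  # prints 75%
```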