Artificial intelligence (AI) has turned out to be a key growth driver for semiconductor stocks in 2023, thanks to the critical role chips play in the proliferation of this technology.

After all, AI servers need more computing power, faster memory, and more storage to train models and run AI applications. The popular chatbot ChatGPT is reportedly powered by at least 30,000 graphics processing units (GPUs) from Nvidia. A GPU consists of several components, including a processor, a cooling device, connectors, memory, and a memory interface.

Because GPUs can process huge amounts of data in parallel, they're ideal for training AI applications. As a result, there has been massive demand for Nvidia's GPUs of late, with the waiting period for its flagship H100 AI graphics card reportedly stretching to six months. Nvidia, however, sources the components of its GPUs from different vendors. For instance, the company has its processors manufactured by Taiwan Semiconductor Manufacturing.

It has other vendors for other components, and Micron Technology (MU) is one of them. Nvidia has been a customer for Micron's specialty dynamic random access memory (DRAM) chips in the past, and Micron has lately been hinting that it is already partnering with Nvidia to supply memory for Nvidia's AI-focused processors.

On its fiscal 2023 third-quarter earnings call, Micron management pointed out that the memory in Nvidia's Grace Hopper GH200 chip was jointly developed by the two chipmakers. And on the company's fiscal 2023 fourth-quarter earnings call, Micron said that its high-bandwidth memory (HBM) offering is "currently in qualification for Nvidia compute products, which will drive HBM3E-powered AI solutions."

This could be a big deal for Micron in the long run and a significant driver of the company's growth.

HBM demand is expected to grow rapidly

AI servers rely on HBM to rapidly feed processors the huge amounts of data needed to train models or run inference applications. That's because this memory type offers much higher bandwidth, up to 384 GB per second, compared with standard DRAM chips that top out at 136 GB per second.
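As a rough back-of-the-envelope comparison using only the bandwidth figures cited above (actual throughput varies by HBM and DRAM generation), the gap works out to nearly a threefold advantage:

```python
# Rough bandwidth comparison using the figures cited above (illustrative only).
hbm_bandwidth_gbps = 384   # GB/s, high-bandwidth memory (as cited)
dram_bandwidth_gbps = 136  # GB/s, standard DRAM (as cited)

speedup = hbm_bandwidth_gbps / dram_bandwidth_gbps
print(f"HBM feeds the processor ~{speedup:.1f}x more data per second")  # ~2.8x
```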

Not surprisingly, memory specialist SK Hynix estimates that the HBM market could clock an annual growth rate of 82% through 2027. This could unlock a healthy growth opportunity for Micron given its relationship with Nvidia. As already mentioned, Micron's third-generation HBM product, known as HBM3E, could be deployed by Nvidia in its GH200 processors if Nvidia selects it after the qualification process.

Micron management says it expects to "begin the production ramp of HBM3E in early calendar 2024 and to achieve meaningful revenues in fiscal 2024," suggesting that it may be close to landing Nvidia's business. Another indication that Micron may eventually become a supplier of HBM for Nvidia's processors came from Nvidia's August announcement that it will start delivering HBM3E-based systems in the second quarter of 2024.

Given that Micron's fiscal 2024 has just begun and will end in August next year, management's timeline for the HBM3E production ramp suggests that its memory chips are likely to find a place inside Nvidia's GH200. Also, Micron's DRAM market share of almost 26% indicates that it is in a healthy position to make the most of the end-market opportunity in HBM.

Meanwhile, Micron should also see an increase in demand for storage chips thanks to AI. The company estimates that an AI server needs three times the NAND flash memory of a regular server, and six to eight times the DRAM content of a traditional server.
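To put those multiples in perspective, here is a minimal sketch; the baseline DRAM and NAND capacities per conventional server are illustrative assumptions, not figures from Micron, and only the multiples come from the company's estimate:

```python
# Illustrative only: baseline capacities are assumptions; the multiples are from the article.
baseline_dram_gb = 512   # assumed DRAM content of a conventional server
baseline_nand_tb = 4     # assumed NAND flash content of a conventional server

dram_multiple = (6, 8)   # AI server carries 6x-8x the DRAM content
nand_multiple = 3        # AI server carries 3x the NAND content

ai_dram_gb = (baseline_dram_gb * dram_multiple[0], baseline_dram_gb * dram_multiple[1])
ai_nand_tb = baseline_nand_tb * nand_multiple

print(f"AI server DRAM: {ai_dram_gb[0]}-{ai_dram_gb[1]} GB vs. {baseline_dram_gb} GB")
print(f"AI server NAND: {ai_nand_tb} TB vs. {baseline_nand_tb} TB")
```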

All this indicates that AI is going to be a key growth driver for Micron Technology and should give the company's performance a big boost starting in the current fiscal year.

Micron Technology stock is cheap right now

Fiscal 2023 was a forgettable year for Micron. The company's revenue roughly halved to $15.5 billion, and it swung to a non-GAAP net loss of $4.45 per share from a profit of $8.35 per share in fiscal 2022. The company was hit hard by an oversupply in the memory industry triggered by weak demand from the personal computer (PC) and smartphone markets.

However, the company's guidance of $4.4 billion in revenue for the first quarter of fiscal 2024 points toward an improvement over the prior-year period's top line of $4.09 billion. The company hasn't issued a full-year forecast, but analysts expect its top line to jump nearly 35% in fiscal 2024, momentum it is expected to sustain over the next couple of years.

[Chart: MU revenue estimates for the current fiscal year. Data by YCharts.]
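For context, a quick sketch of what those figures imply, using only the revenue numbers cited in this article:

```python
# Implied growth rates from the figures cited above (illustrative arithmetic only).
q1_fy24_guidance_b = 4.4   # $ billions, fiscal Q1 2024 revenue guidance
q1_fy23_revenue_b = 4.09   # $ billions, prior-year quarter

fy23_revenue_b = 15.5      # $ billions, fiscal 2023 revenue
analyst_growth = 0.35      # ~35% top-line growth expected for fiscal 2024

q1_growth = q1_fy24_guidance_b / q1_fy23_revenue_b - 1
implied_fy24_revenue_b = fy23_revenue_b * (1 + analyst_growth)

print(f"Guided fiscal Q1 growth: ~{q1_growth:.0%}")                     # ~8%
print(f"Implied fiscal 2024 revenue: ~${implied_fy24_revenue_b:.1f}B")  # ~$20.9B
```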

A potential recovery in the smartphone and PC markets, along with the AI tailwind, is likely to give the memory market a shot in the arm in 2024. Gartner expects a 70% spike in memory revenue in 2024 after this year's estimated decline of 35%. With Micron currently trading at 4.8 times sales, investors are getting a good deal on this AI stock, given that other companies benefiting from this technology tend to trade at higher multiples.
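The sales multiple itself is simple arithmetic; here is a minimal sketch that backs out the market cap implied by the article's 4.8-times-sales figure and fiscal 2023 revenue (an approximation, since the multiple is typically quoted against trailing-twelve-month sales):

```python
# Back out the implied market cap from the price-to-sales multiple (approximation).
price_to_sales = 4.8        # multiple cited above
trailing_revenue_b = 15.5   # $ billions, fiscal 2023 revenue from the article

implied_market_cap_b = price_to_sales * trailing_revenue_b
print(f"Implied market cap: ~${implied_market_cap_b:.0f}B")  # ~$74B
```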

Investors should consider buying Micron stock before it flies higher; the shares have already jumped 36% so far in 2023.