Micron Technology (MU 2.92%) is one of only three companies that make both DRAM and NAND flash memory chips, which are key components of any artificial intelligence system.

With strong AI tailwinds, one would think Micron's results and stock would be surging. While the stock is up about 26% on the year, it has lagged other AI plays and sold off after its third-quarter report on June 28.

Unfortunately for Micron, AI chip demand is not yet enough to offset the historic declines in the mature PC and smartphone markets, which still make up a large portion of all memory demand. Moreover, Micron appears to be behind one of its major competitors in high-bandwidth memory (HBM), a segment of the memory industry set to grow by leaps and bounds thanks to the advent of generative AI.

Still, on the post-earnings call, Micron pointed to several other AI-related products, as well as its own HBM3 "plus" product coming next year.

Investors were quick to sell on the dismal near-term results, but these new products could fuel a big recovery in 2024 and beyond.

How Micron fell behind in HBM

It may be surprising to some shareholders that Micron is a tad behind its large competitors, specifically South Korea's SK Hynix, in the production of HBM. That's because one of Micron's biggest selling points is that it has leapt ahead of competitors in traditional memory technology over the past few years.

Last year, the company was the first to mass-produce 1-beta DRAM and 232-layer NAND flash. Micron had been a technology laggard when Sanjay Mehrotra took over as CEO in 2017, so its vaulting ahead of competitors over the past five years has been one of the bullish arguments for the stock.

Except, it appears, in HBM. Those who have been following the burgeoning AI chip wars will know that AI accelerators tend to use HBM, a special kind of stacked DRAM that can feed enormous amounts of data to the processor quickly.

Because of its cost and complexity, HBM had been only a small niche market until this year; however, with the unveiling of ChatGPT last November and surging demand for generative AI servers, demand is taking off. Research firm TrendForce forecasts HBM demand will soar by 60% this year, with at least another 30% growth projected for 2024. TrendForce also notes that Micron holds only about 10% of this high-growth market, versus an estimated 50% share for SK Hynix and 40% for Samsung.

Samsung's HBM market share is more or less in line with its overall leading DRAM share, but it's really SK Hynix that is the standout in HBM. That's likely because SK Hynix got an early start in HBM development, going all the way back to 2013. That early focus allowed SK Hynix to introduce the first HBM3 product one year ago, and the company is now set to introduce its next HBM3 product with a higher 24GB capacity, 50% more than the prior generation, in the latter half of 2023.

In contrast, Micron had initially chosen to develop a different high-performance memory technology called the Hybrid Memory Cube (HMC). While HMC had certain technical advantages, the developer ecosystem coalesced around HBM. So in 2018, Micron shifted its strategy from HMC to HBM, and it is now playing catch-up.

High-bandwidth memory for AI is seeing surging demand. Image source: Getty Images.

Micron promises it will catch up -- and more

While Micron is currently behind, it brought up not one but several AI products and initiatives on the recent Q3 conference call with analysts. On that call, CEO Sanjay Mehrotra noted that AI servers don't just contain HBM, but also high-density DDR5 memory, a type of low-power memory specially developed by Micron, and a bit of graphics memory.

Micron has leading products in all of those non-HBM categories, and also has a new HBM3 product that it believes will be a generational leap over current offerings.

First, Micron leads in high-density DDR5 (D5) memory modules built on its 1-beta technology, and Mehrotra noted that 75% of the DRAM in today's AI servers is high-density DDR5 memory, not HBM. Micron believes its new 1-beta D5 memory built on a 32Gb die will offer a significantly cheaper alternative to other types of high-density memory on the market, and even its current D5 built on today's 24Gb die offers cost advantages over today's 128GB modules built with through-silicon-via (TSV) packaging technology. Of note, Micron's D5 memory shipments doubled quarter over quarter, so it appears the company is hitting the mark on D5.

Micron has also developed a new kind of low-power DRAM (LP DRAM), a type of memory normally used in mobile handsets but adapted in this case for data center AI applications. Micron jointly developed this modified LP DRAM with Nvidia (NVDA 6.18%), and it is part of Nvidia's new DGX GH200 supercluster. In fact, of the massive 144TB of memory in this Nvidia AI system, 122TB is made up of this new LP DRAM from Micron.

This could be an interesting development, as artificial intelligence servers and systems require massive amounts of electricity to run, which negates some of AI's benefits. Therefore, any innovation that lowers power consumption within these supercomputing systems should be in high demand. The product also shows Micron's close relationship with AI leader Nvidia, as well as its innovation chops.

Finally, Micron claims that it will not only catch up in HBM3, but also surpass the current leaders. Management noted customers are now sampling its new HBM3 product, saying the product has "significantly higher bandwidth than competing solutions and establishes the new benchmark in performance and power consumption, supported by our 1-beta technology, TSV, and other innovations enabling a differentiated advanced packaging solution." Mehrotra later went so far as to say that "as a product, it is close to a generational leap ahead of anything else that is in the market."

In fact, Mehrotra predicts a very steep ramp for this product in early 2024 and expects to generate meaningful revenue from HBM3 next year. Even more surprising, he predicted the new product would eventually allow Micron to achieve a higher market share in HBM than its overall DRAM market share of around 23%. That would certainly be a change from today's lagging position.

Promising AI hopes on the other side of the trough

Micron is currently posting outsize losses in one of the worst-ever downturns for the memory industry, but its results did improve quarter over quarter, and management forecast sequential growth in Q4.

Moreover, these new AI products give investors reason for optimism. While some may be skeptical, Micron has shown the ability to catch up to and surpass its South Korean counterparts from a technology standpoint, so I would expect Micron to eventually be a stronger player in HBM and AI generally than it is today.

With all three major memory players now slashing production to balance the market, demand for generative AI taking off, and the economy appearing resilient, Micron could turn around in a big way -- although that may not happen until 2024.