Leading memory producer Micron Technology (MU) and its peers are in one of the worst memory downturns in history, but the hope is that memory-hungry artificial intelligence (AI) applications will help the industry dig itself out.

Micron had impressively caught up to and surpassed Samsung and SK Hynix in the most widely used forms of DRAM and NAND flash, becoming the first company to develop 1-beta DRAM and 232-layer NAND last year. But Micron had also fallen behind those competitors in the crucial high-bandwidth memory (HBM) market for AI.

HBM is high-capacity stacked DRAM that is crucial for training AI models and running inference quickly, and it's considered one of the main bottlenecks to unlocking more powerful AI. HBM makes up a tiny 1% of the DRAM market today, but it's expected to grow at an average rate of 45% or more per year for several years, driven by the needs of AI processing.

HBM is one of the few growth drivers in the DRAM market today, and investors didn't appreciate that Micron had fallen behind. SK Hynix is thought to be the leader in HBM, having begun development back in 2013.

However, on July 26, Micron announced its newest HBM product, which appears to blow the competition out of the water.

Micron's new HBM3 Gen2

In the July 26 release, Micron noted that its second-generation HBM3 is sampling with customers, and it disclosed some impressive specs.

Micron seems to have cracked the code on cramming more memory capacity into a smaller "cube" of stacked memory modules. The new HBM3 chip fits 24GB of memory within eight layers of DRAM, with bandwidth over 1.2 TB/s and a pin speed over 9.2 Gb/s.

That trounces the 16GB HBM3 cubes on the market today, which top out at a pin speed of just 6.4 Gb/s. But perhaps even more importantly, Micron's new chip also outdoes SK Hynix's newest HBM3. Announced back in April, that model fits 24GB of capacity into a 12-layer cube.

Since Micron's new memory cube promises that much capacity with just eight layers, it packs 50% more capacity into each layer (3GB per layer, versus 2GB per layer for a 12-layer, 24GB cube), lowering power consumption and letting training and inference workloads run more efficiently. Not only that, but Micron said it would also be sampling a 12-layer, 36GB HBM cube in early 2024. Given that SK Hynix is thought to have had a multiyear lead and outsize market share in HBM for AI, Micron's leapfrogging here is impressive.
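For readers who like to check the math, here's a quick back-of-the-envelope sketch of the two headline claims. It assumes HBM's standard 1,024-bit interface per cube, which Micron's announcement doesn't spell out; every other figure comes straight from the numbers above.

```python
# Back-of-the-envelope math behind Micron's HBM3 Gen2 claims.
# Assumption: a standard HBM 1,024-bit bus per cube (not stated in the
# article); all other figures come from the announcement itself.

PINS_PER_CUBE = 1024          # assumed HBM bus width: 1,024 data pins per cube
PIN_SPEED_GBPS = 9.2          # gigabits per second per pin (Micron's spec)

# Cube bandwidth = pin speed x bus width, converted from gigabits to terabytes.
bandwidth_tb_s = PIN_SPEED_GBPS * PINS_PER_CUBE / 8 / 1000
print(f"Bandwidth per cube: ~{bandwidth_tb_s:.2f} TB/s")  # ~1.18 TB/s, in line
                                                          # with "over 1.2 TB/s"

# Capacity per layer: Micron's 8-layer, 24GB cube vs. a 12-layer, 24GB cube.
micron_gb_per_layer = 24 / 8   # 3 GB per DRAM layer
rival_gb_per_layer = 24 / 12   # 2 GB per DRAM layer
improvement = micron_gb_per_layer / rival_gb_per_layer - 1
print(f"Capacity per layer: {improvement:.0%} more")      # 50% more
```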

In terms of what this means for AI applications, Micron said the new memory will let developers train models 30% faster. On the inference side, higher speeds mean more queries per day per model, letting AI companies get more out of their expensive models. And because of the improved energy efficiency, Micron believes the new memory will save data center operators $550 million over five years for every 10 million GPUs deployed.


Does Micron have a technology process advantage?

It's incredibly difficult to manufacture smaller transistors in more complex designs these days. However, it appears Micron has something going for it that's allowing it to move to the next technology node faster than its peers.

For this HBM product specifically, Micron said its edge comes from its 1-beta DRAM process, which it was the first to put into commercial production last year. Micron noted other technological features playing roles as well. These include doubling the number of through-silicon vias (TSVs), the vertical channels through which interconnects reach each layer of the HBM cube. The new HBM also offers lower thermal impedance, meaning it sheds heat more effectively, thanks to Micron's increasing metal density five times over within the chip and implementing more efficient data pathways.

This is all very impressive, and it speaks to some sort of technological unlock Micron has found under CEO Sanjay Mehrotra, who took over in 2017. While his tenure hasn't been without hiccups in navigating several harsh memory cycles, it is notable for Micron's progression from technology laggard to maker of the most advanced products on the market. And that ability to push through successive nodes seems to be showing up across Micron's portfolio: first in reaching 232-layer NAND ahead of peers last July, then 1-beta DRAM last November, and now apparently in this best-in-class HBM3 product.

But is it enough?

Again, the memory industry is in a terrible hole, and HBM accounts for only about 1% of the DRAM market. So this news, in and of itself, probably won't lift Micron out of its current losses.

However, the DRAM industry has only three main players, and all have now cut back severely on production and capital expenditures amid the downturn. With all three reducing supply, and demand seemingly at a trough, the seeds of the next upturn have probably been sown.

Given that Micron is displaying what appears to be a technological process advantage in this consolidated industry, shareholders should be increasingly optimistic about its future.