Artificial intelligence (AI) is, no doubt, the hottest trend in the technology space over the past three years. That's not surprising, as this technology can substantially boost productivity across multiple industries and applications.
Global consulting giant PwC estimates that AI could contribute a whopping $15.7 trillion to the global economy by the end of the decade. Productivity gains will account for $6.6 trillion of that contribution, with another $9.1 trillion coming from consumer-related applications. This explains why there has been a rush to bring AI infrastructure, such as data centers, online to make AI applications more accessible to consumers, enterprises, and governments.
Nvidia has been the biggest beneficiary of the AI revolution. Over the past three years, AI data centers have been powered primarily by graphics processing units (GPUs) designed by Nvidia. The company has taken the lead in this market, providing GPUs for training popular large language models (LLMs) such as ChatGPT, Llama, and many others around the globe.
However, Nvidia's AI dominance is now being threatened by a different type of chip that could eat into its lead in 2026.
Image source: Micron Technology.
Why GPUs may not be the hottest AI trend for 2026
Nvidia's early move into the AI chip market has paid off big time for the company. It has exercised terrific control over this market, occupying a share of more than 90%, per Wall Street analysts. That's not surprising, as GPUs are capable of carrying out vast numbers of calculations simultaneously at incredible speeds. The parallel processing ability of GPUs has made them ideal for training AI models, as they offer a significant performance advantage over other chips, such as central processing units (CPUs).
However, GPUs could be overshadowed by application-specific integrated circuits (ASICs) in 2026. ASICs are custom processors designed to perform specific tasks. Hyperscalers such as Alphabet, Meta Platforms, and others are placing orders for custom AI processors designed by Broadcom and Marvell Technology, as these chips are more powerful and power-efficient while carrying out the tasks they are designed for.
Not surprisingly, Broadcom anticipates that its AI revenue will double in the current quarter to $8.2 billion. The company could sustain such impressive growth in its AI semiconductor business throughout 2026, especially considering it has landed massive contracts from the likes of OpenAI, Meta, and Alphabet's Google. So Broadcom could take share away from Nvidia's GPUs in the AI chip market next year.
Market research firm TrendForce also points out that shipments of the custom AI processors Broadcom makes could increase by 44% in 2026, compared with a 16% increase in GPU shipments. GPUs, therefore, may not be the hottest trend in AI in the new year.
You may be thinking that, as ASICs displace GPUs as the next big play in the AI chip market, those custom chips will become the best way to capitalize on the booming AI infrastructure space. However, the hottest AI chip trend in the new year won't be either GPUs or ASICs.
Here's the best way to play the AI infrastructure boom in the new year
Both the GPUs and the custom AI processors designed by Nvidia and Broadcom require substantial amounts of powerful memory. These AI accelerators use high-bandwidth memory (HBM) for its faster data transfer speeds, higher bandwidth, better power efficiency, and lower latency compared to traditional memory chips.
HBM removes the bottlenecks that would otherwise reduce the efficiency of GPUs and ASICs when tackling AI workloads in data centers. This explains why this type of memory is witnessing incredible demand. Micron Technology (NASDAQ: MU), one of the leading players in the global memory market, estimates that the HBM market's revenue will jump from $35 billion in 2025 to $100 billion in 2028.
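That forecast implies a very steep compound annual growth rate. Here is a minimal sketch of the arithmetic in Python; the $35 billion and $100 billion figures come from Micron's forecast above, and the three-year horizon (2025 to 2028) is the assumption:

```python
# Implied compound annual growth rate (CAGR) for the HBM market,
# based on Micron's forecast: $35 billion in 2025 growing to
# $100 billion in 2028.
start_revenue = 35.0   # $ billions, 2025
end_revenue = 100.0    # $ billions, 2028
years = 2028 - 2025    # three-year horizon

cagr = (end_revenue / start_revenue) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 42% per year
```

In other words, the forecast assumes the HBM market compounds at roughly 42% annually for three straight years.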

Leading chip designers such as Nvidia, Broadcom, AMD, Intel, and others have been packing large amounts of HBM into their AI accelerators. The demand for these memory chips is exceeding supply, leading to a sharp increase in server memory prices.
That's why Micron's revenue shot up 57% year over year to $13.6 billion in the first quarter of fiscal 2026 (which ended Nov. 27). Its non-GAAP (generally accepted accounting principles) earnings jumped almost 2.7x from the year-ago period to $4.78 per share.
Micron's management says it has already "completed agreements on price and volume for our entire calendar 2026 HBM supply," indicating the company has sold out its HBM production capacity for next year. The combination of higher volumes and higher pricing is why analysts forecast a whopping 288% increase in Micron's earnings this fiscal year, to $32.14 per share.
As such, now would be a good time to buy this AI stock going into 2026. It gives investors a way to benefit from the next big trend in AI chips -- HBM -- and it trades at less than 10 times forward earnings right now.
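As a quick sanity check on those figures, here is a hedged sketch of the underlying arithmetic: backing out the prior-year earnings base implied by a 288% jump to $32.14 per share, and the share-price ceiling implied by a sub-10 forward earnings multiple. Only the 288% and $32.14 figures come from the article; the derived values are illustrative.

```python
# Back out the earnings base implied by the analyst forecast:
# a 288% increase landing at $32.14 per share.
forecast_eps = 32.14
growth = 2.88                       # 288% increase
base_eps = forecast_eps / (1 + growth)
print(f"Implied prior-year EPS: ${base_eps:.2f}")  # about $8.28

# A forward P/E below 10 on $32.14 of forecast EPS caps the
# share price at roughly 10 times that figure.
max_price_at_10x = 10 * forecast_eps
print(f"A sub-10 forward P/E implies a price below ${max_price_at_10x:.2f}")
```

So the forecast assumes earnings nearly quadruple from a base of roughly $8.28 per share, and the sub-10 multiple holds as long as the stock trades below about $321.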





