There's no denying that Nvidia's (NVDA +1.15%) graphics processing units (GPUs) are tops when it comes to artificial intelligence (AI) processing. Unfortunately, being the king of the hill means there's always someone trying to take your crown.
Microsoft (MSFT +2.23%) just announced the debut of a powerful new AI chip, the latest move in the company's bid to become a greater force in the AI landscape.
A chip off the old block
In a blog post released on Monday, Scott Guthrie, Microsoft's executive vice president of Cloud + AI, introduced Maia 200, the company's latest chip designed specifically for AI inference. He calls Maia "a breakthrough inference accelerator engineered to dramatically improve the economics of AI token generation."
The Maia 200 carries more high-bandwidth memory, offering three times the performance of Amazon's (AMZN +2.63%) third-generation Trainium chip and exceeding that of Alphabet's (GOOGL +0.39%) (GOOG +0.40%) seventh-generation Ironwood Tensor Processing Unit (TPU). Guthrie called Maia "the most performant, first-party silicon from any hyperscaler." The processor promises both performance and bang for the buck, being "tailored for large-scale AI workloads while also delivering efficient performance per dollar."
Maia 200 also includes a redesigned memory system intended to prevent bottlenecks when feeding data into AI models. Microsoft says it's the company's most efficient inference chip "ever deployed, with 30% better performance per dollar" than similarly priced alternatives.
One of the most significant benefits for Microsoft is that the Maia 200 has been designed to provide peak efficiency when powering Copilot and Azure OpenAI. It is also being deployed to data centers running Microsoft 365 Copilot and Foundry, the company's cloud-based AI offerings. By using its homegrown AI chips, Microsoft is working to reduce the cost of running AI workloads amid pressure to contain rising energy outlays.
Microsoft said there would be "wider customer availability in the future" for the Maia 200, unlike its predecessor, which was never offered to outside customers. To spur adoption, the company is making its software development kit (SDK) available to developers, AI start-ups, and academics, hoping to give customers a reason to switch.
Will Maia "chip" away at Nvidia's lead?
Maia is the latest in a string of chips released by Nvidia's rivals to decrease their dependence on its GPUs. Despite rising competition, Nvidia still maintains a dominant 92% share of the data center GPU market, according to IoT Analytics. While Maia may offer benefits for running Microsoft's inference workloads, Nvidia's GPUs still provide the greatest degree of computational horsepower and the flexibility needed to run both inference and AI training.
That said, if Microsoft can deliver more affordable AI options to its cloud customers while trimming its own power consumption, it stands to lower expenses and boost profits. Furthermore, at 34 times earnings, Microsoft is attractively priced compared with Nvidia's multiple of 47.
Don't get me wrong. I think both Microsoft and Nvidia are frontrunners in the AI revolution -- which is why I own both stocks.