On Wednesday, cloud and enterprise software giant Microsoft (MSFT -0.13%) held its Ignite conference, which included a slate of notable announcements. Unsurprisingly, many centered on artificial intelligence (AI), where Microsoft is perceived to have an early lead thanks to its investment in OpenAI, the company behind ChatGPT.

However, one thing Microsoft has lacked is an in-house, custom-designed AI chip. That changed Wednesday, when the company unveiled its first custom AI accelerator and CPU. Given Microsoft's clout as a top AI cloud provider, custom silicon has the potential to shake up the AI market and the fortunes of several key players.

Meet Maia

At the event, Microsoft unveiled Maia, its new AI accelerator. Like other cloud providers with custom chips, Microsoft will likely market Maia to customers who want to save money on AI training and inference workloads and don't wish to pay up for expensive Nvidia (NVDA 0.02%) graphics processing units (GPUs).

Nvidia's revenue exploded this year as its early lead in AI hardware translated into dominant market share -- some estimate Nvidia holds 80% to 95% of the AI GPU market today. As the clear leader, at least for now, Nvidia has been able to charge $30,000 or more per chip. That has made AI an expensive endeavor for Microsoft and its customers.

Other cloud providers, like Amazon (AMZN 0.68%) and Alphabet (GOOG 0.26%) (GOOGL 0.20%), are multiple generations into their own custom chips: Amazon's Trainium and Inferentia chips are proprietary to Amazon Web Services (AWS), and Google's Tensor Processing Units (TPUs) serve as lower-cost alternatives to Nvidia's GPUs.

Cloud customers using these chips save money in two important ways. First, cloud providers don't have to pay Nvidia's markup -- Nvidia did post a 50% net margin last quarter, after all. Second, providers that design chips in-house can tailor them to their own hardware, infrastructure, software, and algorithms, running workloads more efficiently.
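To put rough numbers on that first point (a back-of-the-envelope illustration, not per-chip accounting): at a 50% net margin, roughly half of a $30,000 GPU's price -- about $15,000 -- falls through to Nvidia's bottom line. That's the kind of markup an in-house chip lets a cloud provider sidestep.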

A related cost saver: Maia and other non-Nvidia chips can run on lower-cost Ethernet networking equipment rather than the expensive, specialized InfiniBand networking infrastructure Nvidia favors. For Maia, this was a key point of discussion at Ignite. Brian Harry, a hardware executive at Microsoft, noted, "Azure Maia was specifically designed for AI and for achieving the absolute maximum utilization of the hardware."

Microsoft has had a lot of practice optimizing its tech stack, from its software down through its cloud platform, servers, and related hardware. In fact, Microsoft disclosed it had developed a unique server rack with built-in liquid cooling for Maia and future AI accelerators. Now, with an AI chip designed in-house, Microsoft can optimize down to the chip level so that every layer of its cloud AI stack works efficiently together.

OpenAI also played a role in helping Microsoft design the chips, so that OpenAI's leading AI models and algorithms run efficiently on them. In the press release, OpenAI CEO Sam Altman noted:

We were excited when Microsoft first shared their designs for the Maia chip, and we've worked together to refine and test it with our models. Azure's end-to-end AI architecture, now optimized down to the silicon with Maia, paves the way for training more capable models and making those models cheaper for our customers.

Microsoft's Maia chip up close.

Image source: Microsoft.

Meet Cobalt

In addition to Maia, Microsoft unveiled its own Arm-based data center CPU, called Cobalt. Cobalt was designed for general-purpose processing and, perhaps, some AI-related tasks, such as running inference on smaller models, akin to generalized data center CPUs from Intel and Advanced Micro Devices (AMD 0.08%).

Cobalt is really Microsoft's answer to Amazon's in-house Graviton CPU, which has seen wide adoption among cloud customers looking to save money by choosing Graviton over Intel- or AMD-based instances. An Amazon spokesperson recently disclosed that AWS had 50,000 cloud customers running workloads on Graviton chips as they look to optimize spending and cut costs amid tough economic conditions.

So Cobalt was developed to lower costs for all the same reasons as Maia: cutting out third-party vendor margins while optimizing the chip for Microsoft's own infrastructure.

How this affects other AI giants

Obviously, if Microsoft's new chips prove successful, that would be a competitive blow to others in the AI ecosystem. It would boost Microsoft's competitiveness versus AWS and Google, which until now have been able to sell cloud customers on the cost savings of their in-house accelerators and processors.

There will also likely be some impact on AI GPU market share, affecting Nvidia and AMD. Now, that might not be so big a deal if the AI processor market grows as large as some think it will. If the AI accelerator market compounds at roughly 50% annually over the next four years to reach $150 billion by 2027, as some forecast, that rising tide could lift all boats.
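A quick sanity check on that forecast (assuming a base of roughly $30 billion in 2023, the figure behind the most widely cited projection): $30 billion x 1.5^4 works out to about $152 billion, so four years of 50% annualized growth does land near the $150 billion mark by 2027.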

Moreover, even as Microsoft announced Maia on Wednesday, it also said it would deploy new Nvidia H100 and AMD MI300 systems, and it pre-announced Nvidia H200 system availability for sometime next year. So there will still be plenty of room for generalized AI accelerators in Microsoft's cloud.

Nvidia has touted its chips' versatility and developer ecosystem as advantages over in-house designs that may be optimized for only a single cloud environment. Given Nvidia's outsize market share, that claim seems to hold true, at least so far. Still, if Microsoft shifted even a couple of points of GPU workloads to Maia, Microsoft would benefit, while Nvidia likely wouldn't be affected much.

However, Maia may be more dangerous for AMD. AMD is trying to establish itself as a challenger to Nvidia's dominant general-purpose AI chips and is starting nearly from scratch. So, if Microsoft and other cloud companies can execute, that may crowd out demand for AMD's alternative. AMD will undoubtedly gain some share, but another in-house chip from another major cloud provider isn't doing it any favors.

In sum, these new Microsoft chips weren't unexpected, but they're still a blow to competitors Amazon and Google, and they also might limit AMD's ability to get a bigger foothold in the AI chip market. While Nvidia may also be affected, it will probably only be around the edges.

So, while Nvidia sold off slightly on Wednesday in the wake of this news, Microsoft's Maia probably isn't too much for the AI chip leader to worry about -- at least not yet.