Now it appears that Qualcomm's data-center silicon ambitions didn't die with Centriq. The company just announced that it will be going after the market for chips designed to handle artificial intelligence (AI) inference workloads with a chip known as the Qualcomm Cloud AI 100.
Let's take a closer look at the details of the chip and what this could mean for the company.
Designed specifically for inference
The company claims the Qualcomm Cloud AI 100 chip offers "more than 10x performance per watt over the industry's most advanced AI inference solutions deployed today," although it didn't provide additional details in the press release.
In addition to touting the capabilities of the upcoming chip hardware, Qualcomm also said it has "support for industry-leading software stacks, including PyTorch, Glow, TensorFlow, Keras, and ONNX." The idea is that it's not enough to simply offer customers a chip -- those customers need to be able to take advantage of those chips through robust software support.
The chip, Qualcomm says, will be manufactured using a 7-nanometer (nm) technology, which the company claims will bring "further performance and power advantages" -- although, to be fair, Qualcomm's competitors should have access to the same 7nm technology; it isn't something unique to Qualcomm.
Two companies could potentially manufacture this chip: Taiwan Semiconductor (NYSE:TSM), better known as TSMC, which has been mass-producing 7nm chips for about a year now, and Samsung (NASDAQOTH:SSNLF), which is expected to begin volume production on its own 7nm technology later this year.
Qualcomm hasn't named the manufacturer of this chip, but it's worth noting that TSMC is expected to be the sole manufacturer of Qualcomm's Snapdragon 855 mobile applications processor and that Samsung is expected to build future 5G-capable chips for Qualcomm.
As for availability, Qualcomm says it will begin sampling the chip to customers in the second half of 2019, which suggests final production availability at some point in 2020; the company hasn't disclosed the exact timing of the commercial launch.
What does this mean for Qualcomm?
The good news is that the market for AI hardware is booming. Qualcomm, citing market research company Tractica, sees the opportunity for AI inferencing hardware to grow to $17 billion by 2025.
The bad news is that this is shaping up to be an extremely crowded market. Many other companies, both large and small, are attempting to build their own dedicated hardware for AI inferencing, while still others are targeting the market with more established computing architectures, including CPUs, graphics processing units (GPUs), and field-programmable gate arrays (FPGAs).
Another thing working against Qualcomm is that after years of talking up its data-center server CPUs and the large total addressable market it would be pursuing there, the company simply exited that market. That move could lead some customers to question Qualcomm's commitment to the AI inferencing market.
It will probably be at least a year before the company begins commercial shipments of the Qualcomm Cloud AI 100, and it's unlikely that these first products will generate enough revenue to be material to the company's financial performance in the immediate aftermath of the launch.
Put simply, this is going to be a marathon, not a sprint, and it's not clear that when the dust settles, Qualcomm will emerge as one of the leading suppliers of AI silicon into the data center.