At a recent investor conference, Intel (NASDAQ: INTC) CEO Brian Krzanich offered significant insight into the company's view of -- and strategy to win in -- the market for chips that handle artificial intelligence/machine learning workloads.
Krzanich started by noting that artificial intelligence and machine learning (I'll refer to both simply as "machine learning" from here on out) represent a relatively small part of the overall data center market today, but he acknowledged that machine learning is the quickest-growing data center workload. He even said that, over the long term, Intel expects it to become the largest workload in the data center.
"Where will it end [up]? Will it be 25% of the workload? 50% of the workload? There's debate, there's debates out there; it's going to be large, so we get it," Krzanich said.
With that background in mind, let's dig into what Krzanich had to say about the company's approach to this market.
Casting a wide net
"We don't think one solution is going to solve this problem," Krzanich said. He then went on to point out Intel's strategy vis-a-vis machine learning based on the different markets/applications.
For the automotive market, Krzanich indicated that the power envelopes that chips targeting this market must operate within are tight, and that power efficiency is critical.
"In a car, you need to be three, four, five watts, and you need to have -- you know, the holy grail is a 1 teraflop-per-watt type of application," Krzanich explained.
TFLOPS is short for "trillion floating-point operations per second." Generally speaking, the more of this performance a chip delivers for a given level of power consumption, the more efficient (and thus better) the chip is.
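To make that metric concrete, here's a minimal back-of-the-envelope sketch in Python. The chip figures are hypothetical and chosen purely for illustration; they are not specs for any actual Intel product.

```python
# Hypothetical chips, purely for illustration -- not actual product specs.
chips = {
    "Chip A": {"tflops": 4.0, "watts": 5.0},   # 4 trillion ops/sec at 5 W
    "Chip B": {"tflops": 2.5, "watts": 2.0},   # 2.5 trillion ops/sec at 2 W
}

for name, spec in chips.items():
    efficiency = spec["tflops"] / spec["watts"]  # TFLOPS per watt
    hits_goal = "yes" if efficiency >= 1.0 else "no"
    print(f"{name}: {efficiency:.2f} TFLOPS/watt (hits the 1 TFLOPS/watt goal: {hits_goal})")
```

By this measure, a chip that delivers 4 TFLOPS while drawing 5 watts falls short of the "holy grail" Krzanich described, while one delivering 2.5 TFLOPS at 2 watts clears it.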
For the automotive market, the company is betting on its recent acquisition of Mobileye (MBLY) to produce products capable of exceeding 1 TFLOPS per watt "faster than [its] competition."
Continuing the one-size-doesn't-fit-all theme, Krzanich pointed to Intel's acquisition of Movidius to handle machine learning/computer vision in applications where power consumption is measured in the "milliwatt range."
"So, if you're a robot or a drone or something like that, we have those solutions," he said.
Although Intel's machine learning strategy for the automotive, drone, and robot markets is certainly interesting, investors will likely be most interested in the company's data center-oriented machine learning strategy.
Another wide net
With respect to data center-oriented machine learning solutions, Intel is -- as it is doing for machine learning more broadly -- casting another wide net.
Krzanich highlighted the broad portfolio of solutions that the company hopes to bring to bear in this market, including its standard general-purpose Xeon processors, its FPGA technology, and the machine learning-specific technology that it got with its purchase of Nervana.
While Intel is clearly interested in building stand-alone specialized chips for these workloads, Krzanich made it clear that the company also plans to integrate specialized silicon designed to accelerate machine learning workloads into its bread-and-butter general-purpose Xeon processors.
"What you're seeing is because machine learning and artificial intelligence is just linear algebra, you're seeing things like [tensor processing unit] come into play -- those are accelerators," Krzanich said. "They're basically little [application specific integrated circuit] engines that allow you to do this math in a very fast way."
Krzanich then said that there are "many accelerators" and noted that investors will see Intel "start adding those [accelerators] into [its] basic Xeon processor[s]."
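For readers curious about the "machine learning is just linear algebra" comment, here's a minimal sketch (shapes are arbitrary, chosen only for illustration) of what that math looks like: a single fully connected neural-network layer boils down to a large matrix multiplication, which is exactly the kind of operation the accelerators Krzanich mentioned are built to do quickly.

```python
import numpy as np

# One fully connected layer: a matrix multiply plus a bias, then an activation.
batch_inputs = np.random.rand(32, 128)   # 32 samples, 128 features each
weights = np.random.rand(128, 64)        # layer with 64 output neurons
bias = np.random.rand(64)

# The bulk of the work is this matrix multiplication -- the kind of dense
# linear algebra that TPU-style accelerators and ASIC engines speed up.
outputs = np.maximum(batch_inputs @ weights + bias, 0.0)  # ReLU activation

print(outputs.shape)  # (32, 64)
```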
Foolish takeaway
Intel is clearly taking the market for machine learning applications seriously, as evidenced by the company's rush to build a broad portfolio of technologies to properly address it.
We'll see in the coming years whether Intel's strategy of casting a wide net allows it to maintain its strong position within the machine learning market and capture a large share of the growth to come.