When Apple (NASDAQ:AAPL) first acquired Siri in April 2010 for an estimated $200 million, it was among the first tech titans to venture into the world of voice-activated digital assistants. At the time, the app was a novelty with huge potential. Fast forward to 2018, and Siri has quietly languished, while Amazon's Alexa and Alphabet's (NASDAQ:GOOGL) (NASDAQ:GOOG) Google Assistant are widely regarded as smarter and more useful.
While Apple's artificial intelligence (AI) program is obviously much more than just Siri, the assistant's lack of progress runs parallel to the company's inability to gain momentum against its large competitors in the field of AI. Apple's reluctance to allow its experts to publish scientific papers, for example, has made the company a less appealing employer for AI researchers.
Apple seems determined to change, however, and has hired one of Google's best and brightest to do just that.
There's a new sheriff in town
Earlier this year, Apple hired John Giannandrea, who previously headed Google's search and AI units during his eight-year tenure with the search behemoth. Apple in July confirmed that Giannandrea will lead the company's recently reorganized AI and machine learning (ML) teams.
Giannandrea's appointment and the resulting reorganization places the machine learning segment, the Siri team, and the company's Core ML team under his charge. (Core ML is the platform Apple launched in 2017 to help partners develop AI-related apps.) Having the bulk of its AI activity under one leader should help unify the company's efforts into one cohesive strategy, and having a leader with this pedigree could help the company attract serious AI talent.
How Apple fell behind
A little background will help explain why Apple has fallen behind its tech brethren in the field of AI. The nature of many machine-learning and deep-learning AI systems requires the processing of large amounts of data as the system learns to recognize patterns, make associations, and distinguish differences. These models can then be applied to a growing number of areas such as voice recognition, natural language translation, and image recognition.
Most voice requests made to digital assistants are processed in similar fashion. The request is transmitted to the cloud so the system can analyze the query, sift through potential answers, and draw the correct conclusion from the myriad possibilities before returning a response. Use of these systems is well-suited to cloud computing, which can continue to aggregate data and improve over time.
Apple, however, does things a little differently.
A unique conundrum
With its much-publicized stance on privacy protection and data security, Apple faces a unique challenge compared to its tech colleagues. While Amazon and Google are content to aggregate their data in the cloud, Apple is determined to store personal user information and data on its devices locally and not transmit it to the cloud. This requires AI systems to be developed for each user on the device itself -- a much trickier prospect.
One of the techniques Apple has employed is adding digital "noise" to individual data and transmitting only identifiable trends, thereby protecting personal information. Additionally, the company created a processor -- the A11 Bionic chip -- that's specifically designed to handle AI applications. The chip began shipping in phones with the iPhone 8.
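The "digital noise" approach described above is broadly known as differential privacy. As a rough illustration only -- the function name, parameter values, and use of Laplace-distributed noise are assumptions for this sketch, not Apple's actual implementation -- each device could perturb its own value before reporting it, so no individual report is trustworthy on its own, yet the average across many users still reveals the trend:

```python
import random

def add_laplace_noise(value, sensitivity=1.0, epsilon=0.5):
    """Perturb a value with Laplace noise scaled to sensitivity/epsilon.

    A Laplace(0, b) sample can be drawn as the difference of two
    exponential samples with mean b.
    """
    scale = sensitivity / epsilon
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return value + noise

# Hypothetical example: each user reports a noisy version of a 0/1 stat
# (say, whether a feature was used today). Any single report is masked,
# but the mean over many users converges to the true rate.
true_count = 1
reports = [add_laplace_noise(true_count) for _ in range(100_000)]
estimate = sum(reports) / len(reports)
```

With these parameters an individual report is frequently off by more than the true value itself, while the aggregate estimate lands close to it -- which is the bargain differential privacy strikes between personal privacy and useful trends.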
A path forward
Giannandrea was a key component of Google's efforts to integrate its AI technology across its products and services, including search, Google Assistant, and Gmail. When Giannandrea was hired, Apple told employees the executive shares its "commitment to privacy," according to The New York Times.
Apple is no doubt hoping that his experience at Google and his views on privacy will help the company gain momentum while sticking to its core values. It will be interesting to see how this unfolds.