Late last year, Apple (NASDAQ:AAPL) began selling the iPhone X, an ultrapremium smartphone with a full-screen organic light-emitting diode (OLED) display and a depth-sensing front-facing camera system marketed as a TrueDepth camera.
The TrueDepth camera powers two key features: Face ID, the facial recognition technology that supplanted the Touch ID fingerprint recognition found on older iPhone models, and Apple's animated emojis, marketed as Animoji.
While some smartphone vendors are trying to follow Apple's lead in 3D sensing, others are pursuing alternative biometric authentication approaches for full-screen smartphones, such as in-display fingerprint sensors. There have been reports that 3D-sensing solutions are simply too expensive for other Android vendors to implement. Meanwhile, vendors of in-display fingerprint sensors argue that their solutions are superior to Apple's 3D-sensing technology.
There have also been some sketchy rumors suggesting that Apple will add an in-display fingerprint sensor to future iPhone models, which would be a tacit admission that Face ID wasn't a superior successor to Touch ID.
Based on Apple's current job listings, however, it seems that Apple isn't shying away from 3D-sensing technology -- it's doubling down on it.
Apple's recruiting 3D sensing engineers
A quick peek at Apple's job boards shows two openings for 3D-sensing-related positions. The first is a "3D Perception/Computer Vision Algorithm Engineer."
The job description here is a little light, in keeping with Apple's tradition of secrecy, but the company is seeking candidates with a "solid foundation in computer vision" and interest in the areas of multiple view geometry, 3D computer vision, activity recognition, and object detection, tracking, and recognition.
It definitely sounds like the engineer that Apple hires to fill that position will be working on 3D-sensing-related technologies.
There's another listing, too, for a "Senior 3D Reconstruction Computer Vision Algorithm Engineer."
This listing is quite a bit more detailed than the previous one. The job summary says that "the 3D Perception team in Video Engineering -- the team who developed the visual inertial odometry for ARKit -- is working on the future of 3D-enabled Apple products."
It's clear that Apple isn't hedging its bets on 3D sensing -- it's doubling down, with the expectation that the technology will become a core part of the iOS product experience sooner rather than later.
Enabling the ecosystem
Of course, for 3D sensing to really take off (and for Apple's efforts in the area to prove a real competitive advantage), Apple will need to enable use cases beyond facial recognition and (arguably gimmicky) animated emojis.
Whether Apple will be successful there remains to be seen, but the more the company innovates on the development tool side of things (e.g., making ARKit more powerful and easier for developers to use), the higher the odds that a so-called "killer app" emerges for 3D sensing.
While Apple seems to be doing a good job on the software side, another thing that could help spur 3D-sensing adoption within Apple's ecosystem is broader device accessibility. Today, only Apple's highest-end, ultra-expensive iPhone has 3D-sensing capability, but all three of Apple's upcoming iPhones, spanning a wide range of prices, are expected to come with the technology.
A much broader 3D-sensing-enabled iPhone installed base could certainly help incentivize developers to invest in building interesting apps that utilize 3D-sensing technology, as those developers would have a much larger potential customer base to sell those apps to.