The headline feature of Apple's (NASDAQ: AAPL) new iPhone X is its 3D-sensing capability, which enables improved augmented reality (AR) experiences as well as the Face ID facial recognition feature. You might think that Apple borrowed the idea from Alphabet subsidiary Google, since Google unveiled a similar concept with Project Tango back in 2014. Tango devices (there are currently only two on the market) use the same types of vertical-cavity surface-emitting lasers (VCSELs) that are found in the iPhone X.

But in fact, Apple's 3D-sensing ambitions predate Tango. The Mac maker likely dreamed up putting 3D sensing into the iPhone half a decade ago.

Face ID creates a depth map of a user's face. Image source: Apple.

Prime time for PrimeSense

Back in November 2013, Apple confirmed that it had acquired Israel-based 3D-sensing start-up PrimeSense. The reported price tag was $360 million, though Apple never disclosed an official figure in any subsequent regulatory filings. PrimeSense was working on various types of 3D-sensing technologies, and while Apple promptly removed PrimeSense's official presence from the web (including its site and YouTube channel), some remnants of its older concept videos are still available through third-party channels.

PrimeSense is probably best known for helping Microsoft develop Kinect, the 3D-sensing accessory for Xbox gaming consoles. Kinect was built on PrimeSense's reference designs and used a combination of color cameras and infrared projectors (sound familiar?) to sense depth. The iPhone X's new TrueDepth camera system leverages the technology that Apple acquired via PrimeSense.

Original Kinect for Xbox 360. Image source: Microsoft.
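For readers curious about how that depth sensing works at a basic level: a structured-light system like Kinect (or the TrueDepth camera) projects a known infrared dot pattern onto the scene, and an offset infrared camera measures how far each dot shifts from where it was expected, which reveals distance through simple triangulation. The Swift sketch below illustrates that core relationship only; the function name and every number in it are illustrative assumptions, not anything Apple or Microsoft has published.

```swift
import Foundation

// A minimal sketch of the triangulation math behind structured-light depth
// sensing. All names and values here are made up for illustration; this is
// not Apple's or Microsoft's actual pipeline.

/// Estimates depth (in meters) for one projected infrared dot.
/// - Parameters:
///   - disparityPixels: horizontal shift of the dot between where the
///     projector "expects" it and where the IR camera actually sees it.
///   - focalLengthPixels: the IR camera's focal length, expressed in pixels.
///   - baselineMeters: distance between the IR projector and the IR camera.
func depthFromDisparity(disparityPixels: Double,
                        focalLengthPixels: Double,
                        baselineMeters: Double) -> Double? {
    guard disparityPixels > 0 else { return nil } // dot unmatched, or effectively at infinity
    // Classic stereo triangulation: nearer surfaces shift the dot more.
    return focalLengthPixels * baselineMeters / disparityPixels
}

// Example with made-up numbers: a 580 px focal length and a 7.5 cm baseline.
if let z = depthFromDisparity(disparityPixels: 29.0,
                              focalLengthPixels: 580.0,
                              baselineMeters: 0.075) {
    print(String(format: "Estimated depth: %.2f m", z)) // ~1.50 m
}
```

The key point of the geometry is that nearby surfaces shift the dots more than distant ones, which is why a wider spacing between projector and camera improves depth precision.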

For example, the structured-light projector patent that Apple was granted in July 2016 was originally filed by PrimeSense engineers in Israel in June 2014, less than a year after the acquisition. About a month ago, Apple was granted another patent related to depth mapping; that one was originally filed by PrimeSense in February 2013, prior to the acquisition.

Patent drawing of 3D sensing of a hand. Image source: U.S. Patent and Trademark Office.

Here's the thing: if the acquisition closed in November 2013, Apple must have been thinking about 3D sensing for quite some time before then. Acquisitions typically take a while to consummate. Apple has to conceive of the idea first, then evaluate the space and look for the most promising acquisition targets with the best technology. Then it has to approach the target, negotiate terms, perform due diligence, and more. It's probably fair to say that Apple thought of adding 3D sensing to the iPhone by 2012, if not earlier.

Compare that timeline with the 2012 acquisition of AuthenTec, which carried a comparable price tag of $356 million and made it abundantly clear that fingerprint recognition was in the pipeline. Touch ID debuted the following year in the iPhone 5s. But fingerprint recognition is largely a hardware-based feature, while 3D facial recognition relies predominantly on software, which can take much longer to develop.

That's why the introduction of Face ID is such an important milestone. Apple has probably been working on 3D sensing for over five years, and we're just now seeing the fruits of all that labor.