Google's (GOOGL -1.28%) Advanced Technology and Projects Group (ATAP) just announced the availability of a milestone technology that promises to do for indoor navigation what GPS did for highway motorists.

The development platform brings multiple 3D vision sensors and a new power-efficient chip to the mobile device, allowing it to create sophisticated 3D maps that until now were only possible on the desktop. With this breakthrough, Google and independent software developers can build a whole new application development platform around "Smart Vision," ushering in the age of mobile computer vision much as the smartphone enabled the first wave of mobile apps.

Google's Project Tango combines 3D vision sensors and co-processor chip technology with new applications built on the Android mobile platform. The hardware is centered on a prototype 5-inch Android smartphone that ships from the company with a software development kit (SDK). The Tango device takes a quarter-million (250,000) 3D measurements per second, then combines those measurements in software through what the company calls "advanced sensor fusion" to augment computer vision.

What does it do?

The technology opens up new user experiences in 3D scanning, indoor (and outdoor) navigation, and immersive gaming, all considered part of the next wave in smartphone intelligence. Tango has the potential to break new ground in intelligent services that could be particularly useful on mobile devices, including Google Glass.

On the company's blog, Google's Johnny Lee, who runs the program at ATAP, said, "Project Tango strives to give mobile devices a human-like understanding of space and motion through advanced sensor fusion and computer vision, enabling new and enhanced types of user experiences."

One way this technology is enabled is through a new chip design (not just a software library added to standard ASICs) developed by Movidius.

It's called the "Myriad 1," and much like the M7 motion coprocessor Apple (AAPL -1.22%) uses in the iPhone 5s, the Myriad 1 offloads all of Tango's 3D sensor activity onto what Movidius calls a "vision processor." It handles the complex computations behind 3D vision applications that were previously too power-hungry to go mobile.

This development platform creates augmented services: the Myriad 1 co-processor ties together Tango's 4-megapixel camera, integrated depth sensor, and motion-tracking camera to create Smart Vision for the smartphone. The result is 3D perception in a mobile device that can "see" like never before, a "digital awareness" of where the device is in space and exactly what is around it.
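
To make the sensor-fusion idea concrete, here is a minimal sketch, written in Python with NumPy rather than the actual Tango SDK, of how depth measurements taken in the device's own frame could be combined with the motion-tracking pose to place them in a shared world frame. The function name, data shapes, and numbers are assumptions for illustration only.

    # Illustrative sketch only -- not the real Tango SDK. Assumes we already have
    # a device pose (rotation matrix plus translation from motion tracking) and a
    # batch of depth points (x, y, z in meters, expressed in the device frame).
    import numpy as np

    def fuse_depth_with_pose(depth_points, rotation, translation):
        """Transform device-frame depth points into a shared world frame."""
        return depth_points @ rotation.T + translation

    # Toy example: the device sits 1 m above the floor, rotated 90 degrees about
    # the vertical axis, and its depth sensor sees a point 2 m straight ahead.
    theta = np.pi / 2
    rotation = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                         [np.sin(theta),  np.cos(theta), 0.0],
                         [0.0,            0.0,           1.0]])
    translation = np.array([0.0, 0.0, 1.0])
    device_points = np.array([[2.0, 0.0, 0.0]])

    print(np.round(fuse_depth_with_pose(device_points, rotation, translation), 3))
    # -> [[0. 2. 1.]]  (the same point, now in world coordinates)

A Tango app would pull the pose and depth data from the device's sensors rather than hard-coding them, but the fusion step is the same idea, repeated hundreds of thousands of times per second.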

One example offered by the company is the ability to capture the exact dimensions of your home simply by walking around inside, from room to room. After the sensors and co-processor do the work, you have a complete 3D map of the interior that you can use the next time you're in the Ikea store wondering whether that new bedroom set will fit in the kids' room. Taking that further, Ikea could build its own Android app on the platform to instantly match its products to your specific space.
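
As a back-of-the-envelope illustration of that idea, and a sketch rather than anything from Ikea or Google, the fit check itself can be as simple as comparing a furniture footprint against room dimensions pulled out of the scan. The fake floor points, function names, and measurements below are all assumed for the example.

    # Illustrative sketch only -- the "scan" is a handful of made-up floor-level
    # points standing in for the 3D map a Tango device would capture. It reduces
    # the room to an axis-aligned footprint and checks whether a piece of
    # furniture fits, allowing a simple 90-degree rotation.
    import numpy as np

    def room_footprint(floor_points):
        """Return (width, depth) in meters from scanned floor points."""
        mins = floor_points[:, :2].min(axis=0)
        maxs = floor_points[:, :2].max(axis=0)
        return tuple(maxs - mins)

    def fits(item_w, item_d, room_w, room_d):
        """True if the item's footprint fits as-is or rotated 90 degrees."""
        return ((item_w <= room_w and item_d <= room_d) or
                (item_d <= room_w and item_w <= room_d))

    # Pretend scan of a 3.0 m x 4.0 m kids' room (just the floor corners).
    scan = np.array([[0.0, 0.0, 0.0], [3.0, 0.0, 0.0],
                     [3.0, 4.0, 0.0], [0.0, 4.0, 0.0]])
    room_w, room_d = room_footprint(scan)

    print(fits(2.1, 1.6, room_w, room_d))  # a bed frame: True
    print(fits(4.5, 2.0, room_w, room_d))  # a 4.5 m wardrobe wall: False

A real app would of course work from the walls and doorways in the full 3D map rather than a bounding box, but the point stands: once the scan exists, the comparison is trivial.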

The possibilities are endless, really. Want a new TV, but not sure it will fit in your family room? You can bet that Best Buy will have an app developed in much the same way. 

Google said that as of Feb. 20 it is letting developers sign up for access to the Tango prototype phones, with the first batch going to 200 lucky developers who will get a jump start on the 3D vision platform. Devices are shipping to independent software developers now, the company said.

So with a goal of nothing less than a "human-like" understanding of space, the next wave in mobile development is upon us, and for the moment it's powered by Google and Project Tango.