Welp. It happened. Google’s Project Tango smartphone has been unveiled with Intel’s RealSense camera built in. And now, whatever term Google comes up with to represent the emerging 3D ecosystem will be the phrase that everyone uses. Today, at the Intel Developer Forum, the chipmaker announced the news of the Project Tango smartphone.
Until now, the two companies had been working separately on depth-sensing technology and on integrating that tech into smartphones. Google’s Project Tango had previously made its way into a tablet, which, after being made available to developers, has been making appearances at tech conferences here and there, showcasing its 3D scanning abilities. Meanwhile, Intel had developed its RealSense 3D camera, which is now being embedded into numerous computers, tablets, and notebooks, but had yet to be shrunk down enough to fit into a smartphone. Now, the two tech powerhouses have united around a common goal and found a way to fit the RealSense into the Project Tango Android phone.
Engadget gives a complete rundown of the phone’s specifications (as you’ll notice, all of the photos in this article are from Engadget). The device is 8.2mm thick and weighs 165 grams, featuring a 6-inch QHD display. The phone runs Android 5.0.1 Lollipop and is powered by an Intel Atom x5 processor. While it’s not the first phone capable of capturing depth, as the HTC Evo has a stereoscopic camera, this is, I believe, the first truly depth-sensing smartphone. Much more than a stereoscopic camera, the device includes the RealSense R200 camera module, with a fish-eye lens and 8-megapixel sensor, two RGB cameras, IR cameras for calculating distance, and a laser to capture texture for 3D scanning. Other devices feature the RealSense camera, and there are peripherals for transforming smartphones into 3D sensing devices, like the Structure Sensor, but, if they get it to market quickly enough, the Project Tango smartphone will open up a Pandora’s box of possibilities for consumers and industrial users alike.
Until now, our smartphones have translated the vibrant 3D world into something flat: we share 2D photos on Facebook and Instagram, our videos are in 2D, and the ads shoved down our throats lie lifeless against our screens. Project Tango, and similar products that will follow suit, could bring 3D sensing abilities to ordinary individuals. No longer will 3D movies be limited to Hollywood; anyone with a smartphone will be able to capture 3D data. So, not only will YouTube users be able to vlog in 3D and share their 3D music videos, but they’ll also be able to take 3D scans and share them on Sketchfab, using an app like itSeez3D.
And, though Google’s partnership with Paracosm will open up 3D scanning capabilities, Intel’s Perceptual Computing SDK adds an even greater number of capabilities, including gesture recognition, facial recognition, augmented reality, speech recognition, and more. On top of holding your phone up and using AR to virtually decorate your 3D-scanned living room before ordering a piece of furniture from IKEA (or 3D printing one?), you will be able to wave objects around with your fingertips and smile to tell the software that you approve of the item.
Then, with AR/VR headsets, like Microsoft HoloLens or the Oculus Rift, this data will be viewable in 3D as well, truly opening the doors to what Autodesk calls Reality Computing, HP calls blended reality, and Microsoft calls mixed reality, in which objects modeled in 3D can be brought into the physical world with 3D printing, or physical objects can be made digital with 3D scanning. Only, it will be called whatever Google wants, if Google gets its way. That is, unless Apple unveils something more than the iPhone 6S at its upcoming event.