Making Smartphones See

How Google’s Tango Augmented Reality platform is changing the way we interact with the world.

For the past six years, Google’s Advanced Technology and Projects (ATAP) team has been hard at work on an augmented reality computing platform that they say “could one day be as essential to your phone as GPS”. That platform is Tango, it’s finally here, and it’s about to change the way you and your smartphone interact with the world.

What Is Tango Augmented Reality?

Google’s version of Augmented Reality is a huge leap forward from Pokémon Go or any other AR app you may already be familiar with. Announced in early 2014, Tango is Google’s attempt to get mobile phones and tablets to see the way the human eye sees. This is no simple task. It involves an extensive camera array that uses computer vision, depth sensing and motion tracking sensors to grant the device full spatial awareness; in other words, the ability to understand your environment and your relation to it. Tango doesn’t require GPS or any other external signal, which means it can do indoor navigation - something that’s never been done on a mobile device before. Current VR devices like the HTC Vive require carefully calibrated external sensors to know where you are, but with Tango’s “inside-out tracking”, everything you need is inside your smartphone. And if that’s not impressive enough, Tango doesn’t just know where you are, it also maps and tracks every single 3D object in the same room as you.

Google’s first Tango device was a large, bulky prototype tablet made available only to developers back in 2014. Since then, Google has formed a partnership with Lenovo to bring a Tango-enabled device to consumers, packing all of the sensors and processing required to run Tango into a portable, consumer-friendly form factor. That device is finally here: the 6.4-inch Lenovo Phab 2 Pro phablet.

How does it work?

The Phab 2 Pro utilizes three technologies that work together to make Tango possible:

1. Motion Tracking

Motion Tracking lets the Phab 2 Pro track its own movement and orientation through 3D space. Walk around with the Phab 2 Pro and move it forward, backward, up, or down, or tilt it in any direction, and it can tell you where it is and which way it’s facing. This is accomplished on the Phab 2 Pro with a wide-angle fisheye camera, an accelerometer, and a gyroscope. The image from the fisheye camera is used to identify key visual features such as edges and corners. The device then tracks how much these features move between frames to determine the distance traveled. The data from the accelerometer and gyroscope determine how fast the device is moving and in which direction it is turning. All this information is fused together to track where the device is in 3D space.
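
To make that fusion a little more concrete, here is a minimal, illustrative Java sketch of the idea - every name here is hypothetical, not Tango’s actual API. The gyroscope updates a fast heading estimate, and the camera’s frame-to-frame feature displacement is rotated into world coordinates and accumulated into a position.

// Illustrative sketch of visual-inertial fusion; NOT Tango's real API.
public class PoseTracker {
    private final double[] position = new double[3]; // x, y, z in meters
    private double heading = 0.0;                    // yaw in radians

    // Called once per camera frame with the latest (hypothetical) readings.
    public void update(double[] visualDelta, // displacement from feature tracking
                       double gyroYawRate,   // rad/s from the gyroscope
                       double dt) {          // seconds since the last frame
        // Integrating the gyro gives a low-latency heading estimate.
        heading += gyroYawRate * dt;

        // Rotate the camera-derived displacement into world coordinates
        // and accumulate it into the running position estimate.
        position[0] += visualDelta[0] * Math.cos(heading) - visualDelta[1] * Math.sin(heading);
        position[1] += visualDelta[0] * Math.sin(heading) + visualDelta[1] * Math.cos(heading);
        position[2] += visualDelta[2];
    }
}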

2. Area Learning

Area Learning means the device remembers what it sees and can recall that information later on. With Motion Tracking alone, the device “sees” the visual features of the area it is moving through but doesn’t “remember” them. Area Learning improves the accuracy of Motion Tracking by aligning this real-time data with previously saved data, and it is handled by Google’s Tango core software as it processes all the spatial information gathered by the sensors on the Phab 2 Pro.
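
For developers, turning this on is a matter of configuration. The sketch below is based on the Tango Java SDK (com.google.atap.tangoservice) as documented at the time; exact key and method names may vary by release.

// Sketch based on the Tango Java SDK; names reflect the docs of the time.
import com.google.atap.tangoservice.Tango;
import com.google.atap.tangoservice.TangoConfig;
import java.util.ArrayList;

public class AreaLearningSetup {
    public static TangoConfig configure(Tango tango) {
        TangoConfig config = tango.getConfig(TangoConfig.CONFIG_TYPE_DEFAULT);
        // Enable learning mode so the device builds and saves an
        // Area Description File (ADF) of the space it moves through.
        config.putBoolean(TangoConfig.KEY_BOOLEAN_LEARNINGMODE, true);

        // If this space was learned before, load a saved ADF so Motion
        // Tracking can be corrected against the remembered features.
        ArrayList<String> savedAreas = tango.listAreaDescriptions();
        if (!savedAreas.isEmpty()) {
            config.putString(TangoConfig.KEY_STRING_AREADESCRIPTION,
                             savedAreas.get(savedAreas.size() - 1));
        }
        return config;
    }
}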

3. Depth Perception

Depth perception gives the device the ability to measure distances to objects in the real world. This lets virtual objects not only appear to be part of your actual environment, but actually interact with it. The Phab 2 Pro accomplishes depth perception by combining information from a “time of flight” IR (infrared) emitter with its standard RGB camera. The IR emitter sends out infrared light that bounces back and is measured within a few nanoseconds, while the RGB camera captures the scene, and the two streams are combined to generate depth information.
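
The arithmetic behind “time of flight” is straightforward: distance is half the round-trip travel time multiplied by the speed of light. A short worked example in Java shows why the sensor has to resolve intervals of mere nanoseconds:

// Worked example of the time-of-flight principle.
public class TimeOfFlight {
    static final double SPEED_OF_LIGHT = 299_792_458.0; // meters per second

    // Round-trip time of the IR pulse in seconds -> distance in meters.
    static double distanceMeters(double roundTripSeconds) {
        return SPEED_OF_LIGHT * roundTripSeconds / 2.0;
    }

    public static void main(String[] args) {
        // An object one meter away returns the pulse in roughly 6.7 ns.
        System.out.println(distanceMeters(6.7e-9)); // ~1.0 m
    }
}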

What can Tango do?

The applications for Tango are endless, but to start with, Lenovo and Google have targeted three key areas of interest. The first is indoor navigation. Imagine having a Google Maps-style layout of every mall, museum and supermarket, telling you exactly where you are and how to get where you want to go. Extra information could be overlaid on top of the navigation elements; for example, in a supermarket, the AR display could draw your attention to shelves with special deals as you walk around.

The second area is gaming. Imagine if Pokémon Go had Tango augmented reality. Instead of merely seeing a Pokémon appear on screen, it could interact with objects and your surrounding environment. Pokémon could literally hide in the tall grass near your house. For other AR games, levels could be created using the layout of your house, school, or office, making the game far more personal and familiar.

Finally, Google is looking at AR-enhanced utilities. These are apps you could use, for example, to map your living room and then redecorate it by moving, adding or removing virtual furniture. We’ve already seen AR apps like this, but with Tango you can actually walk through that space to get a real-time 3D view of what your room would look like from every angle.

A whole new world

Tango AR has also drawn a lot of interest for its educational potential. Google and Lenovo have started to collaborate with museums around the world to enhance the visitor experience through the use of Tango, something they’re calling Kinesthetic Learning. One of the first examples of this is the WWF’s Into the Wild exhibit, which you can experience first-hand at Singapore’s ArtScience Museum.

Stop by the exhibition and you’ll be loaned a Lenovo Phab 2 Pro handset. Enter the exhibition space and instead of a static presentation on illegal hunting in Southeast Asian rainforests, Tango creates an immersive experience on your Phab 2 Pro that transforms the museum into a rainforest. You’ll meet various animals as you explore your new surroundings, and get to see firsthand how Man’s activities affect the environment.

Thanks to a Tango device’s motion tracking capabilities, when you walk five steps forward in the real world, you take five steps forward in the virtual rainforest. Turn around, and the visuals update in real time to show you what’s virtually surrounding you.
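
Conceptually, each new motion-tracking pose is simply copied onto the virtual camera every frame. A minimal illustrative sketch follows (all names hypothetical, not MediaMonks’ actual code):

// Illustrative only: mirror the device's physical pose onto the virtual
// camera so every real step becomes a step through the rainforest.
public class VirtualCameraSync {
    // Hypothetical pose type: position in meters, yaw in radians.
    static class Pose { double x, y, z, yaw; }

    // Hypothetical render camera with a settable transform.
    static class Camera {
        void setTransform(double x, double y, double z, double yaw) {
            /* update the renderer's view matrix here */
        }
    }

    // Called once per frame with the latest motion-tracking pose.
    static void onPoseAvailable(Pose devicePose, Camera rainforestCamera) {
        // A 1:1 mapping: five real steps forward move the virtual
        // camera exactly five steps through the rainforest.
        rainforestCamera.setTransform(devicePose.x, devicePose.y,
                                      devicePose.z, devicePose.yaw);
    }
}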

This works because MediaMonks – the team responsible for creating the Into the Wild experience – first created a complete virtual model of the museum, then designed the exhibition taking into account all the interactions with the physical interior of the space. Ten real-world landmark coordinates in the ArtScience Museum were physically measured and marked at their corresponding locations in the virtual model, allowing the two worlds to be precisely aligned.
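
Given matched pairs of measured and modeled landmark coordinates, a best-fit rotation and translation can be computed to register the two worlds. Here is an illustrative Java sketch of such an alignment on the floor plane (a standard 2D Procrustes fit, not the team’s actual code):

// Illustrative: align a virtual model to measured real-world landmarks.
public class LandmarkAlignment {
    // real[i] and model[i] are matched (x, y) floor coordinates of the
    // same landmark, e.g. the ten points measured inside the museum.
    static double[] align(double[][] real, double[][] model) {
        int n = real.length;
        double rcx = 0, rcy = 0, mcx = 0, mcy = 0;
        for (int i = 0; i < n; i++) {
            rcx += real[i][0] / n;   rcy += real[i][1] / n;
            mcx += model[i][0] / n;  mcy += model[i][1] / n;
        }
        // Best-fit rotation angle from the centered point pairs.
        double num = 0, den = 0;
        for (int i = 0; i < n; i++) {
            double rx = real[i][0] - rcx, ry = real[i][1] - rcy;
            double mx = model[i][0] - mcx, my = model[i][1] - mcy;
            num += rx * my - ry * mx;
            den += rx * mx + ry * my;
        }
        double theta = Math.atan2(num, den);
        // Translation carrying the rotated real-world centroid onto the model's.
        double tx = mcx - (rcx * Math.cos(theta) - rcy * Math.sin(theta));
        double ty = mcy - (rcx * Math.sin(theta) + rcy * Math.cos(theta));
        return new double[] { theta, tx, ty }; // rotation + offset
    }
}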

A different kind of storytelling

With this virtual model of the museum in place, MediaMonks’ artists had a canvas on which to design their 3D environments. What they were building wasn’t simply a rainforest, but a narrative that brought you in contact with a number of animals that WWF highlighted to them.

Rather than using arrows to guide the way, butterflies and tufts of grass on your smartphone display lead you through the story. MediaMonks took the natural habitats and behaviors of the various animals appearing in the exhibit into account to determine where and when you would get to see them.

For example, the mouse deer is found in many habitats, including areas near water. This makes it the perfect lead to draw you towards the tapir, which, when living around rivers, spends a good deal of time in and under water. Because the mouse deer is a small animal, following it also makes you look down, which encourages you to notice what else is below you. Likewise, a pangolin scampering up a tree draws your line of sight upwards, encouraging you to look up into the tall museum corridor.

Changing the real world

While much of the exhibition is experienced through the display of a Tango-enabled smartphone, pains were taken to ensure that changes in the museum’s natural lighting are also reflected in the virtual world. Hours were spent on user testing to discover which storylines worked and which didn’t, and three months went into crafting the story, modeling the environment, building the animations and continuously testing at the museum.

Changes to the exhibitions held at the ArtScience Museum will also mean physical changes to the space itself. When that happens, new area files will be needed, so the team will have to scan the area again. And because the exhibition is essentially an app, the experience can be updated remotely, much like the apps on our phones. Animals and objects can be moved or even added to the virtual world, changing the story appropriately and resulting in an ever-changing world with infinite potential - something only possible with Tango Augmented Reality.

Words by James Lu & Marcus Wong

Illustration by Skullbase.blogspot.sg

Art Direction by Ken Koh