Augmented reality: A battle between Apple and Google

August 30, 2017 - 9:02 AM
A 3D-printed Android mascot, Bugdroid, is seen in front of a Google logo in this illustration taken July 9, 2017. Dado Ruvic/Reuters Illustration

SAN FRANCISCO, CALIFORNIA — Alphabet Inc’s Google on Wednesday unveiled tools to make augmented reality apps for mobile devices using the Android operating system, setting up its latest showdown with Apple Inc’s iPhone over next-generation smartphone features.

Phone-based augmented reality (AR), in which digital objects are superimposed onto the real world on screen, got a huge boost from the popularity of the Pokémon Go game. The game, launched in the United States in July last year, sent players into city streets, offices, parks and restaurants to search for colorful animated characters.

Analysts expected the game to generate $3 billion for Apple over two years as gamers bought “PokéCoins” through its App Store.

Google’s take on the technology will first be available on the Samsung Galaxy S8 and Google’s own Pixel phone. The company said in a blog post that it hoped to make the system, called ARCore, available to at least 100 million users, but did not set a date for a broad release.

Apple in June announced a similar system called ARKit that it plans to release this fall on “hundreds of millions” of devices.

Google and Apple will jockey for the attention of customers and software developers who will build the games, walking guides and other applications that would make AR a compelling feature.

Many tech industry leaders envision a future in which eyeglasses, car windshields and other surfaces can overlay digital information on the real world. Google and Microsoft Corp have already experimented with AR glasses.

“AR is big and profound,” Apple Chief Executive Tim Cook told investors earlier in August. “And this is one of those huge things that we’ll look back at and marvel on the start of it.”

Apple and Google have had to make compromises to bring the technology to market.

In Apple’s case, the Cupertino, California-based company decided to make its AR system work with devices capable of running iOS 11, its next-generation operating system due out this fall.

This means it will work on phones going back to the iPhone 6s, which have a single rear camera and standard motion sensors, rather than the dual-camera system found on newer models such as the iPhone 7 Plus or the special depth-sensing chips in competing phones. That limits the range of images that can be displayed.

Google initially aimed to solve this problem with an AR system called Tango that uses a special depth sensor, but so far only two phone makers support it. With ARCore, Google changed course to work on phones without depth sensors.
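Conceptually, an ARCore app opens a tracking session that fuses the phone’s ordinary camera feed with its motion sensors, then asks the session each frame which surfaces it has found. The Kotlin sketch below is illustrative only, not code from Google’s announcement: it uses class and method names from the ARCore SDK as it was later published (which may differ from the 2017 developer preview), and it omits rendering, camera permissions and device-availability checks.

```kotlin
// Minimal sketch of camera-plus-motion-sensor AR with the ARCore SDK.
// API names reflect the SDK as later released; details may differ from
// the developer preview described in this article.
import android.app.Activity
import android.os.Bundle
import com.google.ar.core.Config
import com.google.ar.core.Plane
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

class ArDemoActivity : Activity() {
    private lateinit var session: Session

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // The session tracks device motion with the ordinary rear camera
        // and the phone's motion sensors; no Tango-style depth sensor is used.
        session = Session(this)
        val config = Config(session).apply {
            planeFindingMode = Config.PlaneFindingMode.HORIZONTAL
            lightEstimationMode = Config.LightEstimationMode.AMBIENT_INTENSITY
        }
        session.configure(config)
    }

    override fun onResume() {
        super.onResume()
        session.resume()
    }

    // Called once per rendered frame by the app's GL renderer (not shown).
    // Camera texture setup via setCameraTextureName is omitted for brevity.
    fun onDrawFrame() {
        val frame = session.update()
        if (frame.camera.trackingState != TrackingState.TRACKING) return
        // Horizontal surfaces detected purely from camera images and motion data.
        val planes = session.getAllTrackables(Plane::class.java)
        // A real app would place anchors on these planes and draw 3D content.
    }
}
```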

But the fragmentation of the Android ecosystem presents challenges. To spread its AR system beyond the Galaxy S8 and Pixel phone, Google will have to figure out how to account for the wide variety of Android phone cameras or require phone makers to use specific parts.

Apple, however, is able to make its system work well because it knows exactly which hardware and software are on the iPhone and calibrates them tightly.

Michael Valdsgaard, a developer with the furniture chain IKEA, called the system “rock solid,” noting that it could estimate the size of virtual furniture placed in a room with 98 percent accuracy, despite lacking special sensors.

“This is a classic example of where Apple’s ownership of the whole widget including both hardware and software is a huge advantage over device vendors dependent on Android and the broader value chain of component vendors,” said Jan Dawson, founder and chief analyst of Jackdaw Research.