
Within the first two minutes of Sundar Pichai’s keynote at Google I/O, he was already talking about local. “Every year at I/O,” Pichai said, “we learn and try to make things a little bit better … This year, we wanted to make it easier for you to get around. So we are using AR to help. To get started, open your I/O app and choose Explore I/O. And then you can just point your phone where you want to go.”

On the screen behind Pichai, a demo showed AR functionality embedded in the I/O conference app, with a live image of meeting areas at the Shoreline Amphitheatre overlaid with icons and information about sessions and food kiosks.

Sundar Pichai demos the Explore I/O app at Google I/O

“This is a pretty compelling use case,” he went on, “and we actually want to generalize this approach, so that you can explore and navigate the whole world that way.”

Of course, Google has long been in the business of organizing information and making it useful, but Pichai emphasized a shift in orientation: “We are moving from a company that helps you find answers to a company that helps you get things done.”

This broad statement means many things, evoking for instance Google’s aggressive foray in recent years into the consumer electronics space. But I also hear Pichai alluding to the shift from “strings to things” that I’ve written about recently at Street Fight. Applied to the real world, Google’s interest in entities can be summarized by saying that the company wants its maps to capture and link up with reality as closely as possible.

Google Lens Reads Restaurant Menus

One of the next demonstrations during the keynote illustrated this point. Pichai’s co-presenter Aparna Chennapragada introduced her discussion of Google Lens, Google’s visual search product, by saying, “People have already used Lens more than a billion times so far … One way we’ve been thinking about it is, with Lens, we’re indexing the physical world, billions of places and products and so on, much like search indexes the billions of pages on the web.”

Visual imagery is, to be sure, just another kind of representation of the real world, as text is. Still, Google’s foray into visual search represents a much more immediate connection to the external world. Chennapragada illustrated this by showing how users can now point their phone camera at a restaurant menu and have Lens highlight popular dishes, linking them to reviews and photos from Google My Business.

Google's Aparna Chennapragada showcases new Google Lens menu capabilities.
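
Under the hood, one can imagine the menu feature as a matching problem: pair each OCR’d line of the menu with the dishes known to be popular at that place. Here is a minimal sketch in Python using only the standard library; the menu lines and dish data are invented for illustration, and Google has not published its actual approach.

import difflib

# Invented example data: OCR'd lines from a paper menu, and dishes known
# to be popular at the same restaurant (say, mined from reviews and photos).
ocr_menu_lines = ["Margherita Pizza  14", "Caesar Salad  9", "Tiramisu  7"]
popular_dishes = ["margherita pizza", "tiramisu"]

def highlight_popular(lines, popular):
    """Return the menu lines that fuzzily match a known popular dish."""
    hits = []
    for line in lines:
        # Strip prices and normalize case before matching.
        name = "".join(ch for ch in line if not ch.isdigit()).strip().lower()
        if difflib.get_close_matches(name, popular, n=1, cutoff=0.8):
            hits.append(line)
    return hits

print(highlight_popular(ocr_menu_lines, popular_dishes))
# ['Margherita Pizza  14', 'Tiramisu  7']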

At the end of the meal, users can also point Lens at the restaurant receipt and have it calculate the tip and split the bill.
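
The receipt feature, at least, involves arithmetic simple enough to sketch outright. Something like the following, where the bill amount, tip rate, and party size are hypothetical inputs:

def split_bill(subtotal, tip_rate, party_size):
    """Compute the tip and an even per-person share of the total."""
    if party_size < 1:
        raise ValueError("party_size must be at least 1")
    tip = round(subtotal * tip_rate, 2)
    per_person = round((subtotal + tip) / party_size, 2)
    return tip, per_person

# Example: a $64.00 dinner with an 18% tip, split four ways.
tip, share = split_bill(64.00, 0.18, 4)
print(f"Tip: ${tip:.2f}; each person pays ${share:.2f}")
# Tip: $11.52; each person pays $18.88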

Connecting a real-world menu with information from Google Maps represents a granular linkage between real-world entities and data that we haven’t seen before, in this case done in the service of easing friction in local commerce. In fact, throughout the keynote, Google emphasized the local commerce use case over and over. Clearly, Google’s technological innovations are being applied in an ambitious way to its already robust local data layer.

Google Duplex Expands to Car Rental, Movie Tickets

Another point of focus was Duplex, Google’s automated voice assistant that is already helping customers book tables at restaurants. “For us,” said Pichai, “Duplex is the approach by which we train AI on simple but familiar tasks to accomplish them and save you time.” The statement suggests that self-contained commercial transactions such as restaurant booking are only a starting place, and that Google is using what it learns from such use cases to work toward more complex types of interaction.

And in fact, Pichai announced that other “narrow use cases,” in particular rental car bookings and movie ticketing, would be handled by an expanded version of Duplex called Duplex on the Web, designed to handle tedious workflows such as filling out the complex forms required to rent a car. Pichai noted that car rental sites wouldn’t need to do anything to participate in the program. Google Assistant basically works as an intelligent form-fill tool that knows enough about your personal information and preferences to be able to perform routine tasks on your behalf.

Demo of Google Duplex for car rental.
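
In spirit, the form-fill behavior amounts to mapping a stored user profile onto whatever named fields a rental site’s form happens to expose. A hypothetical sketch follows; the profile structure and field names below are my own assumptions, not anything Google has documented.

# Hypothetical sketch of intelligent form-fill: answer each named field on a
# car rental form from a stored user profile, leaving unknown fields blank.
USER_PROFILE = {
    "full_name": "Jane Doe",
    "email": "jane@example.com",
    "pickup_airport": "SFO",   # Duplex could infer this from a trip itinerary
    "car_class": "compact",    # a saved preference
}

# Field names as one site's rental form might expose them.
FORM_FIELDS = ["full_name", "email", "pickup_airport", "car_class", "promo_code"]

def fill_form(fields, profile):
    """Return a value for every field the profile can answer; leave the rest empty."""
    return {field: profile.get(field, "") for field in fields}

print(fill_form(FORM_FIELDS, USER_PROFILE))
# {'full_name': 'Jane Doe', 'email': 'jane@example.com',
#  'pickup_airport': 'SFO', 'car_class': 'compact', 'promo_code': ''}

The hard part, of course, is recognizing that a site’s arbitrarily named fields correspond to profile attributes at all, which is presumably where the AI comes in.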

Miniaturizing Voice Technology Leads to a More Conversational Interface

A more technical-sounding announcement during the keynote turns out to have massive implications. Pichai announced that Google has achieved the milestone of shrinking its voice recognition models from 100GB down to 0.5GB, small enough to be embedded on every phone. Because Google’s voice technology no longer requires a connection to a remote server, its processing speed can be greatly improved. As Google’s Scott Huffman explained, the speed at which Google Assistant can now process information will make tapping on your phone seem comparatively slow.
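
The reasoning is easy to make concrete with rough numbers. Assuming, purely for illustration, a 100ms network round trip plus 50ms of server-side recognition versus 60ms of recognition on the device:

# Back-of-envelope latency comparison. All figures are illustrative
# assumptions, not numbers Google disclosed.
NETWORK_RTT_MS = 100          # round trip to a remote speech server
SERVER_INFERENCE_MS = 50      # recognition running in the cloud
ON_DEVICE_INFERENCE_MS = 60   # recognition running locally on the phone

cloud_total = NETWORK_RTT_MS + SERVER_INFERENCE_MS  # 150 ms per utterance
on_device_total = ON_DEVICE_INFERENCE_MS            # 60 ms per utterance

print(f"Cloud: {cloud_total} ms; on-device: {on_device_total} ms")
# Even if local inference were somewhat slower per step, eliminating the
# network hop on every interaction is what makes voice feel instantaneous.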

The efficiency of onboard processing will enable users to open apps and perform tasks at the speed of conversation. During the keynote, Huffman and another Googler named Maggie demonstrated examples like sending a photo from a recent trip to a friend via text with voice commands like “Show me my photos from Yellowstone. The ones with animals.”

Demonstration of the next generation Google Assistant.

For local search use cases, this “next generation Assistant” underscores the closer connection Google is establishing between the user experience on the phone and the real world around the user. One imagines users will be able to ask the Assistant to show various options for local shops and services based on their preferences, with Duplex eventually helping them transact business with a chosen store.

“By moving these powerful AI models right onto your phone, we’re envisioning a paradigm shift,” Huffman said. “This next generation Assistant will let you instantly operate your phone with your voice, multitask across apps, and complete complex actions, all with almost zero latency.” The technology is slated to be added to new Pixel phones later this year.

Adding Incognito Mode to Google Maps

Pichai spent a significant portion of the I/O keynote addressing privacy and emphasizing Google’s improved controls over the management of personal data. On the local level, he announced that the same Incognito Mode that has been available in Chrome for a decade is coming to Maps. Using Maps in Incognito Mode, users will be able to search and navigate to local points of interest privately, without leaving a record of their activity for Google to track and analyze.

In addition, Google announced that the new release of Android’s operating system, Android Q, will make it easier to stay on top of the apps that monitor your location, with a more prominent user settings page that makes it easy to review and modify location tracking permissions across all of the apps on your phone.

Finally, More Helpful Walking Directions

Last but not least, Sabrina Ellis, during a section of the keynote on new Pixel features, unveiled an AR integration that is sure to be popular among those who use Google Maps for walking directions. The new feature adds an AR overlay to the camera view of the street in front of you, with arrows showing which direction you should walk. No more guessing which way the little blue dot is pointing! The feature is rolling out now to all Pixel phones.

Demonstration of walking directions on Pixel phones.

Though better walking directions may not be especially relevant to local businesses, clearly Google is finding more and more ways to integrate AR into local search. As Ellis said, “We’re just beginning our journey with AR and Maps.”

Summing It Up

With developments in AR, AI, machine learning, image processing, voice recognition, and other technologies, Google made it clear at I/O that restless innovation in multiple directions would continue to be its modus operandi. It was notable, though, just how many of those innovations had a local angle.

True, local search and discovery represent convenient use cases for deploying new technologies, with relatively contained sets of variables and a ready-made testbed of millions of active users. At the same time, a strong through-line in Google’s presentation was creating meaningful connections between technology and the world to help users in their daily lives. Local search fits naturally into that paradigm, and we can probably expect it to be one of the more active areas of technical innovation moving forward.

If you’d like, you can watch the full keynote yourself on YouTube.

Damian Rollison

VP of Product Strategy