Mobile Takes Center Stage at Google Search Event
In addition to “real-time search,” discussed in the previous post, Google today announced new mobile products. These include improved voice search technology, product search, and a mobile visual search product called “Goggles.”
Google VP of Engineering Vic Gundotra prefaced the announcement with the assertion that the mobile device lets us do things we couldn’t do on the PC. The software being created to run on these devices is only starting to utilize the capabilities of the hardware.
“These devices have built-in sensors, cameras, GPS, speakers and accelerometers,” he said. “Alone, these aren’t that special but when connected with the cloud, the camera becomes an eye and the phone mic becomes an ear.”
In Sight
On this note, Google exhibited better speech recognition, which has improved in the year since the company launched voice search on the iPhone. More importantly, voice search now supports Mandarin Chinese and Japanese, which matters when you consider the size of those markets and the complexity of typing their written characters.
From voice to visual, the company then announced “Goggles,” a new visual search feature for mobile devices that lets users search via imagery. In other words, a photo taken with the phone becomes the query: it is matched against a database, and the results identify the pictured object or return related information about it.
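Google hasn’t said how Goggles works under the hood, but conceptually this kind of visual search boils down to turning the photo into a feature vector and finding its nearest neighbors in an index of reference images. Here’s a deliberately tiny sketch of that idea in Python; the feature vectors, labels, and scores are all invented for illustration and have nothing to do with Google’s actual system.

```python
import math

# Toy in-memory "index": image feature vectors mapped to labels.
# In a real system these vectors would come from a feature extractor
# run over millions of reference images; these values are made up.
INDEX = {
    "Golden Gate Bridge": [0.9, 0.1, 0.3],
    "Eiffel Tower": [0.2, 0.8, 0.5],
    "Book cover": [0.4, 0.4, 0.9],
}

def cosine_similarity(a, b):
    """Standard cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def visual_search(photo_features, top_k=1):
    """Return the best-matching labels for a query photo's features."""
    ranked = sorted(INDEX.items(),
                    key=lambda item: cosine_similarity(photo_features, item[1]),
                    reverse=True)
    return [label for label, _ in ranked[:top_k]]

# A photo of a bridge, reduced (hypothetically) to a feature vector:
print(visual_search([0.85, 0.15, 0.25]))  # -> ['Golden Gate Bridge']
```

At Google’s scale the interesting problems are in the feature extraction and in searching billions of vectors quickly, but the query-by-image flow is the same shape as this sketch.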
The company also announced a new product search feature that will pull in retail inventory feeds to reveal which products are available locally right now. Similar data sets and actual products have already been developed by the likes of Krillion and NearbyNow (and by an earlier Google effort via StepUp Commerce).
Now that Google seems to be going after the space in earnest, though, local inventory search could get much more recognition. The company will deal directly with retailers such as Sears and Best Buy.
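Google hasn’t published the feed format it will use with retailers, but the mechanics are easy to picture: ingest each store’s inventory feed, then filter by product and by distance from the shopper. A minimal sketch, assuming a hypothetical CSV-style feed with store, coordinates, product, and stock columns:

```python
import csv
import io
import math

# Hypothetical inventory feed a retailer might publish (format assumed;
# stores and stock levels are invented for illustration).
FEED = """store,lat,lon,product,in_stock
Best Buy SF,37.784,-122.407,Flip MinoHD,true
Sears Boston,42.355,-71.060,Flip MinoHD,false
Best Buy Boston,42.349,-71.082,Flip MinoHD,true
"""

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance via the haversine formula."""
    rlat1, rlon1, rlat2, rlon2 = map(math.radians, (lat1, lon1, lat2, lon2))
    a = (math.sin((rlat2 - rlat1) / 2) ** 2
         + math.cos(rlat1) * math.cos(rlat2) * math.sin((rlon2 - rlon1) / 2) ** 2)
    return 6371 * 2 * math.asin(math.sqrt(a))

def available_nearby(product, user_lat, user_lon, radius_km=20):
    """Stores within the radius that report the product in stock."""
    hits = []
    for row in csv.DictReader(io.StringIO(FEED)):
        if row["product"] != product or row["in_stock"] != "true":
            continue
        d = distance_km(user_lat, user_lon, float(row["lat"]), float(row["lon"]))
        if d <= radius_km:
            hits.append((row["store"], round(d, 1)))
    return sorted(hits, key=lambda h: h[1])

# A shopper in downtown Boston:
print(available_nearby("Flip MinoHD", 42.358, -71.064))
# -> [('Best Buy Boston', 1.8)]
```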
Location, Location …
Lastly, location is becoming more and more central to Google mobile searches. This includes a “near me now” option and new location integration with the existing “suggestions” feature.
Suggestions, for those unfamiliar, is simply the predictive text that completes search queries as you type them. Gundotra says 40 percent of mobile searches result directly from suggested terms. This makes sense when you consider both the reality of typing on a mobile device and the “discovery” scenarios inherent to mobility.
With this comes even greater relevance for local information. Gundotra demoed a side-by-side comparison of two iPhones, each configured to “think” it was in San Francisco or Boston, respectively. After typing the letters “Re,” the Boston phone provided suggestions starting with “Red Sox,” while the San Francisco phone suggested the outdoor apparel retailer REI.
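Google’s actual ranking signals aren’t public, but the demo’s behavior is easy to model: prefix-match the candidate queries, then score each one as a global popularity number plus a boost tied to the phone’s reported city. A toy sketch, with all the numbers invented:

```python
# Toy suggestion table: query -> global popularity plus per-city boosts.
# Every value here is made up for illustration; Google's real signals
# and weights are unknown.
SUGGESTIONS = {
    "red sox":     {"global": 80, "boost": {"boston": 50}},
    "rei":         {"global": 60, "boost": {"san francisco": 50}},
    "restaurants": {"global": 70, "boost": {}},
}

def suggest(prefix, city, limit=3):
    """Prefix-match candidates, ranked by global score plus a local boost."""
    matches = [(q, s["global"] + s["boost"].get(city, 0))
               for q, s in SUGGESTIONS.items()
               if q.startswith(prefix.lower())]
    return [q for q, _ in sorted(matches, key=lambda m: -m[1])[:limit]]

print(suggest("Re", "boston"))         # -> ['red sox', 'restaurants', 'rei']
print(suggest("Re", "san francisco"))  # -> ['rei', 'red sox', 'restaurants']
```

The same two letters produce different orderings purely because the location term shifts the scores, which is exactly what the SF/Boston demo showed.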
The location relevance of mobile searches is clearly top of mind for Google. The company has told us, in fact, that local searches index 2x-3x higher (as a percentage of overall searches) on mobile than on the PC. Beyond the SF/Boston demo, nearly every sample search shown at the event (including spoken queries for “McDonald’s in Beijing”) was a local search.