Last month, Google announced that Lens, its set of visual search and analysis tools, is coming to iOS. First introduced at last year's Google I/O developer conference, the feature will be rolling out to all Google Photos users over the coming weeks.
Google Lens, like Assistant, is an extension of Google Search. It can analyze what's in an image and provide relevant actions and search results. For example, when analyzing a picture with a phone number or address, it will suggest calling the number or getting directions to the address.
Other capabilities include:
- Business card: You can save the phone number or address to a contact.
- Book: You can get reviews and other details about it.
- Landmark or building: You can get more details about it.
- Painting in a museum: You can get details about it.
- A plant or animal: You can learn more about it.
- A flyer or event billboard: You can add that event to your calendar.
In Google Photos, Lens can be accessed by tapping the new square camera icon in the bottom toolbar of any image. After a second of analysis, a panel will slide up with suggested actions and search results.
According to Google, Lens is rolling out starting today on version 3.15 of the iOS app, with a complete release to all users over the next week.
Starting today and rolling out over the next week, those of you on iOS can try the preview of Google Lens to quickly take action from a photo or discover more about the world around you. Make sure you have the latest version (3.15) of the app. https://t.co/Ni6MwEh1bu pic.twitter.com/UyIkwAP3i9
— Google Photos (@googlephotos) March 15, 2018