Last month, Google announced that Lens — its set of visual search and analysis tools — is coming to iOS. First introduced at last year’s Google I/O developer conference, it will be rolling out over the coming weeks to all users of Google Photos.

Google Lens, like Assistant, is an extension of Google Search. It can analyze what’s in an image and provide relevant actions and search results. For example, when analyzing a picture containing a phone number or address, it will offer to call the number or get directions.

Other capabilities include:

  • Business card: You can save the phone number or address to a contact.
  • Book: You can get reviews and other details about it.
  • Landmark or building: You can get more details about it.
  • Painting in a museum: You can get details about it.
  • A plant or animal: You can learn more about it.
  • A flyer or event billboard: You can add that event to your calendar.

In Google Photos, Lens can be accessed by tapping the new square camera icon in the bottom toolbar of any image. After a second of analysis, a panel will slide up with actions you can take or relevant search results.

According to Google, Lens is rolling out in version 3.15 of the iOS app starting today, with a complete release to all users over the next week.

Subscribe to 9to5Mac on YouTube for more Apple news:

