Google Lens was originally unveiled at I/O 2017 as a new way of identifying objects and places in the world with just your smartphone. It first shipped as a beta built into the camera app on the Pixel 2 and 2 XL, and now, Google says, it’ll begin rolling out to Google Assistant users on the Pixel 2 and original Pixel over the coming weeks.
Rather than having to open the camera app, those with a Pixel phone will be able to simply launch the Assistant by squeezing their phone, holding down the home key, or saying “Okay Google,” then tap the Lens icon in the lower right-hand corner. This presents a camera interface that lets you scan practically anything and learn more about what’s in front of you through links and information. Google says the following can be identified with the utility:
- Text: Save information from business cards, follow URLs, call phone numbers and navigate to addresses.
- Landmarks: Explore a new city like a pro with your Assistant to help you recognize landmarks and learn about their history.
- Art, books and movies: Learn more about a movie, from the trailer to reviews, right from the poster. Look up a book to see the rating and a short synopsis. Become a museum guru by quickly looking up an artist’s info and more. You can even add events, like the movie release date or gallery opening, to your calendar right from Google Lens.
- Barcodes: Quickly look up products by barcode, or scan QR codes, all with your Assistant.
Google says Lens in the Assistant will begin rolling out to English-speaking Pixel users in the U.S., U.K., Australia, Canada, India, and Singapore over the coming weeks. Additional language and territory support will likely arrive later. We also suspect more Android users will be able to use Google Lens in the future, so stay tuned.
Update: This article has been updated to clarify both Pixel and Pixel 2 users will be receiving this update.