Google Lens AI-driven visual search tool rolls out for Assistant on Pixel phones | Hindustan Times

Google Lens AI-driven visual search tool rolls out for Assistant on Pixel phones

Google Lens delivers contextual information using visual analysis.

tech Updated: Nov 20, 2017 11:46 IST
Matt Vokoun, Director of Product Management at Google, Inc., introduces Google Lens at a product launch event on October 4, 2017 at the SFJAZZ Center in San Francisco, California. (AFP)

Google has started rolling out its visual search feature, Google Lens, in Assistant for the first batch of Pixel and Pixel 2 smartphones.

“The first users have spotted the visual search feature up and running on their Pixel and Pixel 2 phones,” 9to5Google reported late on Friday.

Google Lens was unveiled at the company’s annual developer conference I/O 2017. Based on machine learning and artificial intelligence, the app delivers contextual information using visual analysis. The technology was also demoed at the Pixel 2 and Pixel 2 XL launch event last month.

Built into the Photos app, Google Lens can recognise things such as addresses and books, among others. In Photos, the feature can be activated when viewing any image or screenshot. In Google Assistant, however, it is integrated directly into the panel that pops up after holding down the home button.

“Lens was always intended for both Pixel 1 and 2 phones,” Google had earlier said in a statement.

Apart from Lens, Google also introduced ‘Clips’, a camera that uses AI and machine learning to automatically capture the best pictures of users. Tesla founder Elon Musk, a vocal critic of AI, slammed the device. “This doesn’t even ‘seem’ innocent,” he tweeted, referring to the Clips camera.
