How to use Google Lens to ask questions out loud about what you see

You can now use voice input in Google Lens to ask questions about the things you see — the same way you’d point at something and ask your friend about it. This feature is one of the latest Google Search updates that make it easier to ask questions and find helpful information from across the web.

Asking your questions out loud with Lens can be useful in all kinds of situations, especially if you’re on the go. Maybe you’re visiting a museum and want to know the history behind one of the paintings. Or perhaps you want to find out the name of a colorful bird you see while you’re out walking your dog (and using one hand to hold their leash!).

Previously, you’d have to take a picture and then type your question manually. Now, with voice input in Lens, you can search what you see and ask a question about it in one step, making it more natural and intuitive to explore the world around you. Here’s how to get started.

How to use voice input in Lens

  1. Open the Google app (Android & iOS) and tap the camera icon in the Search bar to open Lens.
  2. Point your camera at whatever you want to ask about.
  3. Hold down the shutter button and ask your question out loud, like “Why did the artist paint this?” or “What kind of clouds are these?”

    If you’re a Search Labs user enrolled in the “AI Overviews and more” experiment, holding the shutter button will capture a video to provide Lens with even more visual context for your search.

  4. Scroll through the results, which may include an AI Overview, as well as links to relevant sites across the web.
  5. To ask another question out loud about the photo you took, simply tap the microphone icon at the top of the results page.

Search with your voice in Lens to find the answers you need.

Voice input for Lens is now available globally for English queries in the Google app for Android and iOS.

To learn about other helpful search features — including new ways to identify songs you hear and shop what you see — read about our latest set of Google Search updates.
