- Google Search Live is now available globally in more than 200 countries and 98 languages
- Search Live uses the new Gemini 3.1 Flash Live voice and audio model to enable “more natural” conversational search
- Audio responses include links to the sources of information
Google has launched its AI-powered conversational search tool, Search Live, globally, making it available in more than 200 countries and territories and in 98 languages. First launched in the US in September 2025, Search Live lets you point your phone or tablet’s camera at something and ask the AI tool questions out loud, such as what model of washing machine you have and how to use it.
The AI then responds with audio that is conveniently subtitled, and it continues to listen for clarifications and follow-up questions to emulate a natural conversation.
You can access Search Live through the Google app on Android or iOS by tapping the “Live” button below the search bar, located between the AI Mode and Nano Banana buttons. It can also be accessed through Google Lens and the dedicated Gemini app.
Google has said the expansion has been made possible by the launch of a new voice and audio model called Gemini 3.1 Flash Live, which it says is “intrinsically multilingual.” The company also claims the model responds to queries faster and aims to offer “more natural and intuitive conversations.”
Analysis: Good but not perfect
Search Live uses query fan-out, an information retrieval technique that broadens the search by looking at related answers beyond the specific question asked, to provide a more complete answer and reproduce the feel of a conversation.
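In broad strokes, a fan-out approach expands one question into several related sub-queries, runs each against the search backend, and merges the results. The sketch below is purely illustrative: the function names and the toy "related queries" generator and search backend are assumptions, not Google's actual implementation or API.

```python
# Illustrative sketch of query fan-out. All names here are hypothetical;
# Google's real system is not public.

def fan_out(question, related_fn, search_fn, top_k=3):
    """Broaden one question into related sub-queries, search each,
    and merge the answers into a single, fuller response."""
    queries = [question] + related_fn(question)[:top_k]
    # One result per sub-query, keyed by the query text.
    return {q: search_fn(q) for q in queries}

# Toy stand-ins for the related-query generator and search backend:
def related(q):
    return [f"{q} price", f"{q} reviews", f"{q} alternatives"]

def search(q):
    return f"result for '{q}'"

merged = fan_out("Nothing Phone 4a Pro", related, search)
# merged now holds the original question plus three broadened variants,
# which a conversational layer could summarize into one spoken answer.
```

The broadened result set is what lets the assistant answer follow-ups without re-querying from scratch, at the cost of extra backend calls per question.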
We tested Search Live in June of last year and noticed how the tool keeps working in the background, fanning out related queries; my colleague Eric Hal Schwartz said that responses “didn’t seem locked into a single answer form, even for relatively simple queries.”
I tried it myself on my bike. While Search Live was good at identifying the specific model, year of release, and why it had a particular paint job, it didn’t recognize that I had swapped the original wheelset for a third-party set, and it thought the bike still had the integrated handlebars it originally came with. It also failed to correctly identify the bike’s accessories, such as the taillight, water bottle, and bottle cages.
In a similar test, it couldn’t identify the Nothing Phone 4a Pro on my desk, instead calling it the Nothing Phone 2a. I put the same question to Gemini Live and received identical answers.
It’s understandable why some of the results were incorrect: the AI assistant relies on existing online sources, and new products won’t necessarily have information for the model to learn from. As it stands, though, it can handle some general queries.
According to Google, more than 1.5 billion people were using Google Lens to identify objects around them as of June 2025, and there are around 750 million Gemini Live users. It will be interesting to see what the global uptake of Search Live will be, and whether it becomes the default way to search for information online.




