Google Search is under pressure: not only are many of us replacing it with ChatGPT search, but Google’s attempts to fend off the competition with features like AI Overviews have also stumbled thanks to some worrying inaccuracies.
That’s why Google has just given Search its biggest overhaul in more than 25 years at Google I/O 2025. The era of the ‘ten blue links’ is drawing to a close, with Google now giving its AI Mode (previously hidden away in its Labs experiments) a wider rollout in the US.
AI Mode was far from the only Search news this year, so if you’re wondering what the next 25 years of Googling might look like, here are all the new Search features Google has just announced.
A caveat: beyond AI Mode, many of these features will initially only be available to Labs testers in the US.
1. AI Mode in Search is rolling out across the US
Yes, Google has just taken the stabilizers off its AI Mode for Search, which was previously only available in Labs for early testers, and rolled it out to everyone in the US. There’s still no word on when it will reach other regions.
Google says that “in the coming weeks” (which sounds worryingly vague) you’ll see AI Mode appear as a new tab in Google Search on the web (and in the search bar of the Google app).
We’ve already tried AI Mode and concluded that “it could be the end of search as we know it,” and Google says it has been refining it since then: the new version is apparently powered by a custom version of Gemini 2.5.
2. Google also has a new ‘Deep Search’ mode
Many AI chatbots, including ChatGPT and Perplexity, now offer a deep research mode for longer research projects that need a little more than a quick Google. Well, now Google has its own equivalent for Search called, yes, ‘Deep Search’.
Available in Labs “in the coming months” (hardly the speediest of launch windows), Deep Search is a feature within AI Mode that’s built on the same “query fan-out” technique as that wider mode, but which, according to Google, takes it to the “next level.”
In practice, that should mean an “expert-level, fully-cited report” (Google says) in just a few minutes, which sounds like a big time-saver, provided its accuracy is a little better than Google’s AI Overviews.
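Google hasn’t published the internals of its query fan-out technique, but the general idea it describes — splitting one query into several sub-queries, running them concurrently, and merging the results — can be sketched in a few lines. Note this is purely illustrative: the `fake_search` backend and the hand-written sub-queries are hypothetical stand-ins, not anything Google has shown.

```python
from concurrent.futures import ThreadPoolExecutor

def fake_search(subquery: str) -> list[str]:
    # Hypothetical stand-in for a real search backend.
    return [f"result for {subquery!r}"]

def fan_out(query: str, subqueries: list[str]) -> list[str]:
    # Issue all sub-queries concurrently, then merge the result lists
    # in the original sub-query order (ThreadPoolExecutor.map preserves it).
    with ThreadPoolExecutor() as pool:
        result_lists = pool.map(fake_search, subqueries)
    merged: list[str] = []
    for results in result_lists:
        merged.extend(results)
    return merged

results = fan_out(
    "best coffee in Nashville",
    ["coffee shops Nashville", "Nashville cafe reviews", "specialty coffee Nashville"],
)
print(len(results))  # prints 3
```

In a real system, a language model would generate the sub-queries from the original question and then synthesize the merged results into a single cited report, which is presumably where Deep Search’s extra minutes go.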
3. Search Live lets you quiz Google with your camera
Google already lets you ask questions about the world with Google Lens, and it demoed its universal assistant Project Astra at Google I/O 2024. Well, it’s now folding Astra into Google Search so you can ask real-time questions using your smartphone’s camera.
‘Search Live’ is another Labs feature and will be marked by a ‘Live’ icon in Google’s AI Mode or in Google Lens. Tap it, point your camera, and you can have a back-and-forth chat with Google about whatever’s in front of you, while it sends you links with more information.
The idea sounds good in theory, but we’ve yet to test it beyond its prototype incarnation last year, and the multimodal AI project is cloud-based, so your mileage may vary depending on where you’re using it. Still, we’re excited to see how far it has come in the last year or so with this new Labs version in Search.
4. AI Overviews are going global
We’re not exactly wild about AI Overviews, the AI-generated paragraphs you often see at the top of your search results. They are sometimes inaccurate and have produced some infamous clangers, such as recommending that people add glue to their pizzas. But Google is pressing ahead with them, and has announced that AI Overviews are getting a wider rollout.
The new expansion means the feature will be available in more than 200 countries and territories, and in more than 40 languages worldwide. In other words, this is the new normal for Google Search, so we’d better get used to it.
Google’s Liz Reid (VP of Search) acknowledged in a press briefing ahead of Google I/O 2025 that AI Overviews have been a learning experience, but said they have improved since those early incidents.
“Many of you may have seen that a set of issues came up last year. While they were very educational and quite rare, we also took them very, very seriously and have made a lot of improvements since then,” she said.
5. Google Search will soon be your ticket-buying agent
Finding and buying tickets is still a painful experience in Google Search. Fortunately, Google is promising a new mode powered by Project Mariner, an AI agent that can browse the web like a human and complete tasks.
Rather than being a separate feature, this will apparently live within AI Mode and kick in when you ask questions like “find two affordable tickets for this Saturday’s Reds game in the lower level.”
This will send it off to analyze hundreds of ticket options with real-time pricing. It can also fill in forms, leaving you with the simple task of hitting the ‘buy’ button (in theory, at least).
The only drawback is that this is another of Google’s Labs projects that will launch “in the coming months”, so who knows when we’ll see it in action.
6. Google Shopping is getting an AI Mode boost

Google gave the Shopping tab within Google Search a big update in October 2024, and now many of those features are getting another boost thanks to some new integration with AI Mode.
One returning highlight is the ‘virtual try-on’ feature (which lets you upload a photo of yourself to see how new clothes might look on you), but the biggest new addition is an AI-powered checkout function that tracks prices for you, then buys things on your behalf using Google Pay when the price is right (with your confirmation, of course).
We’re not sure this will help cure our gear acquisition syndrome, but it does have some time-saving (and savings-draining) potential.
7. Google Search is getting even more personalized (if you want it to)
Like traditional Search, Google’s new AI Mode will offer suggestions based on your previous searches, but you can also make it much more personalized. Google says you’ll be able to connect it to some of its other services, most notably Gmail, to help it answer your queries with a more personal touch.
One example Google gave was asking AI Mode for “things to do in Nashville this weekend with friends.” If you’ve connected it to other Google services, it could use your restaurant bookings and previous searches to skew the results towards restaurants with outdoor seating.
There are obvious issues here: for many, this will be a privacy invasion too far, so they’ll likely choose not to connect it to other services. Also, these “personal context” powers sound like they could suffer from the ‘echo chamber’ problem of assuming you always want to repeat your previous preferences.
Still, it could be another useful evolution of Search for some, and Google says you can always manage your personalization settings at any time.