Until this year, I had never owned a smartphone other than an iPhone. However, as AI makes its way into every tech product on the planet, I needed to try Android to understand the differences between artificial intelligence in the two ecosystems.
After using a Samsung Galaxy S25 for a few weeks, I returned to my iPhone 16 Pro Max. Not because it was better, but because the ecosystem you've built your life in ends up being the deciding factor when choosing between flagship phones.
Once I returned to iOS, I found myself missing one feature more than any other, and without access to it on iPhone, I quickly found myself yearning for an Android device again.
The AI feature I'm talking about is Gemini Live, and while you could access it on iOS, the experience was underwhelming. That was until yesterday, at Google I/O 2025, when Google announced that all of Gemini Live's capabilities are coming to iPhone, at no cost.
Here's why Gemini Live is the best AI tool I've used, and why bringing its full capabilities to the iPhone means I'm ready to return to Apple.
What Visual Intelligence wanted to be
Gemini Live already existed in the Gemini app on iOS, but it lacked two crucial elements that make the Android version much better. First, Gemini Live on iOS couldn't access your iPhone's camera, and second, it couldn't see what you were doing on your screen. I/O 2025 changed all that.
Now, iPhone users can give Gemini access to their camera and screen, enabling new ways to interact with AI that we haven't really seen on iOS before.
Gemini Live's camera capability is one of, if not the best AI tools I've used to date, and I'm delighted that iPhone users can now experience it.
What is Gemini Live's camera function? Well, imagine a better version of what Apple wanted Visual Intelligence to be. You can simply show Gemini what you're looking at and ask questions without having to describe anything.
I found that Gemini Live's camera functionality thrives in situations like cooking. I used it last week to make birria tacos, and not only did it give me advice at every step of the way, but it could also see everything I was doing and help me put together a delicious dinner.
Not only did propping my S25 on a stand give Gemini Live the perfect angle, but because it can connect to Google apps, I could ask it to pull information about a recipe directly from the content creator's video. There's no need to constantly touch your phone with dirty hands in the kitchen, and no need to keep checking back on a recipe. Gemini Live can do it all.
An AI companion every step of the way
Screen sharing lets Gemini Live see what's on your screen at any time, allowing you to ask questions about images, something you're working on, or even how to complete a puzzle in a game. It's very cool, similar to the AI-infused Siri that Apple Intelligence promised us at WWDC 2024 but never delivered.
The free Gemini Live rollout has only just begun, so we've yet to see how this functionality will work on iOS. That said, if it works even half as well as it does on Android, this is a feature many people could fall in love with.
Gemini Live and its multiple ways of interacting with the world completely unlock AI on a smartphone, and now that iPhone users can access it too, I have no reason not to return to Apple's ecosystem.