- Apple is reportedly still working on cameras for AirPods
- iOS 18's Visual Intelligence is at the heart of Apple's plans
- The features are still "generations away"
We have known the "what" for some time: Apple is experimenting with cameras in its AirPods. Now we may know the "why". A new report sheds light on Apple's plans for future AirPods, and if the technology can deliver what it promises, it could be a genuinely important personal safety feature.
However, there is an important caveat: the features "are still at least generations away from hitting the market."
The report comes from the well-connected Mark Gurman at Bloomberg, who says that "Apple's latest visual intelligence plan goes far beyond the iPhone." And AirPods are a big part of that plan.
According to Gurman, visual intelligence, which recognizes the world around you and provides useful information or assistance, is considered a big deal inside Apple, and the company is planning to put cameras in both the Apple Watch and the Apple Watch Ultra. As with the AirPods, "this would help the device see the outside world and use AI to deliver relevant information."
How AirPods will work with Visual Intelligence
Visual Intelligence was introduced in iOS 18 for the iPhone 16, and it lets you point the camera at something and get more information about it: the type of plant, the breed of a dog (as in the image at the top of this article), the opening hours of the coffee shop you've just found, and so on.
Visual Intelligence can also translate text, and perhaps one day it can help people like me who have a shockingly bad memory for names and faces.
However, the big problem with Visual Intelligence is that you have to get your phone out to use it. And there are circumstances in which you won't want to do that. I remember when Apple brought Maps to the Apple Watch: by making it possible to use maps without broadcasting "I'm not from around here and I'm lost. I also have a very expensive phone" to every villain in the neighborhood, it was an important personal safety feature.
This could be similar. If Apple makes it possible to invoke Visual Intelligence with a nod of the head and a squeeze of the stems, that would let you get important information, such as a translation of a street sign in another country, without pulling out your phone.
We are a long way from having these features: I don't expect them in the AirPods Pro 3, which will probably arrive later in 2025. But I'm excited by the prospect: imagine Apple Intelligence, but good.