As we expected, WWDC 2025 — mainly the opening keynote — came and went without a formal update on Siri. Apple is still working on the AI-infused update, which promises a far more pleasant and capable virtual assistant. TechRadar's Editor-at-Large, Lance Ulanoff, broke down what's causing the delay after a conversation with Craig Federighi, here.
Now, even without the AI-infused Siri, Apple delivered a fairly significant update to Apple Intelligence — just not necessarily where you'd expect. It's giving Visual Intelligence, a feature exclusive to the iPhone 16 family plus the iPhone 15 Pro and iPhone 15 Pro Max, an upgrade: it gains on-screen awareness and a new way to search, all powered by a screenshot.
It complements the original Visual Intelligence feature set you may already know: a long press of the Camera Control button (or a customized Action Button on the 15 Pro) pulls up a live view from the iPhone's camera and the ability to take a shot, as well as "Ask" or "Search" options for whatever your iPhone sees.
It's a more basic take on Google Lens, in the sense that you can identify plants and pets and search visually. Much of that won't change with iOS 26, but you'll now be able to use Visual Intelligence on screenshots. After a brief demo at WWDC 2025, I'm eager to try it again.
Visual Intelligence makes screenshots far more actionable, and it could even save space on your iPhone … especially if your Photos app is like mine and full of screenshots. The bigger point is that this gives us a taste of on-screen awareness from Apple.
In the demo, the screenshot captured a Messages chat containing a poster for an upcoming movie, and Visual Intelligence revealed a look at the new interface. It's the classic iPhone screenshot view, but in the lower left sits the familiar "Ask" button, with "Search" on the right, while in the middle is an Apple Intelligence suggestion that varies depending on the screenshot.
In this case, it was "Add to Calendar," which let me easily create an event with the movie night's name, the correct date and time, and the location. Essentially, it identifies the elements in the screenshot and extracts the relevant information.
Very neat! Instead of simply keeping a screenshot of the image, you can have an actionable event added to your calendar in just seconds. It's also baked-in functionality that I think many iPhone owners will appreciate, even if Android phones like the best Pixels or the Galaxy S25 Ultra have been able to do this for a while.
Apple Intelligence will surface these suggestions when it deems them fitting — that could mean creating an invitation or a reminder, translating other languages into your preferred one, summarizing text, or even reading it aloud.
All very useful, but let's say you're scrolling through TikTok or Instagram Reels and spot a product — perhaps a charming button or a poster that caught your eye. Visual Intelligence has a solution for this, and it's something of an Apple answer to Circle to Search on Android.
Take a screenshot, and then, once it's captured, simply rub over the part of the image you want to search. The on-screen effect is similar to selecting an object to remove with Clean Up in Photos, but afterward it lets you search via Google or ChatGPT. Other apps can also opt in through an API that Apple is making available.
And that's where this gets quite exciting: you can cycle through all the available places to search, such as Etsy or Amazon. I think this will be a fan favorite when it ships, although it's not quite reason enough to run out and buy an iPhone that supports Visual Intelligence … at least, not yet.
Additionally, if you'd rather search the entire screenshot, that's where the "Ask" and "Search" buttons come in; with them, you can use Google or ChatGPT. Beyond analyzing screenshots and suggesting actions, or searching with a selection, Apple is also expanding what Visual Intelligence can recognize — beyond pets and plants to books, landmarks, and works of art.
Not everything will be available right at launch, but Apple is clearly working to expand Visual Intelligence's capabilities and round out the Apple Intelligence feature set. Considering this gives us a glimpse of on-screen awareness, I'm quite excited.