- Ray-Ban Meta smart glasses just got a big AI update in the United Kingdom
- Your smart glasses can now use their camera to answer questions about what you can see
- Live translation tools are on the way “soon”, too
If you live in the United Kingdom, your Ray-Ban Meta smart glasses are getting an upgrade. As of today (April 10, 2025), the smart glasses are receiving a free Meta AI update, which was promised when they were first revealed in 2023.
The biggest new feature is the arrival of the visual AI tools, which let your glasses answer your questions based not only on what you say, but also on what you can see.
This tool can help your specs double as a tour guide when you ask them to identify and tell you about a landmark, or improve your appreciation of nature by helping you identify plants and animals you spot, provided the glasses can get a good look.
You can also get the specs to help you with translation.
For now, they can only read text and tell you what it says in English, provided it's in a supported language such as French or Italian. However, Meta says that "soon" they'll also be able to provide live translation while someone is speaking to you, translating in real time between Spanish, French, Italian, and English.
To turn on these new tools, you'll need to enable Meta AI for your glasses in the settings of your Meta View app.
After that, just ask your Ray-Bans to take a look with prompts such as "Hey Meta, what am I looking at?" or "Hey Meta, look and tell me what kind of tree this is", and they'll do the rest.
As with other AI tools, your Ray-Bans won't get everything right 100% of the time, but in our tests the glasses performed surprisingly well. They helped us navigate the London Underground map, even when the station we wanted was out of the glasses' view, and did a great job of summarizing signs for us in supported languages.
So, if you have a pair of Ray-Ban Meta smart glasses, go out and enjoy your new Meta AI features.