The race to put augmented reality smart glasses on your face is heating up. Snap's Spectacles are transforming into "Specs" and will launch as lighter, more powerful AR wearables in 2026.
CEO Evan Spiegel announced the new Specs on stage at the AWE XR event, promising smart glasses that are smaller, considerably lighter, and packing "a ton more capability."
The company didn't share a specific time frame or price, but the 2026 launch window puts Meta on notice; Meta is busy readying its own Orion AR glasses for 2027. Snap's Specs also look set to face off against glasses built on Samsung/Google's Android XR, which are likewise expected at some point in 2026.
As for what consumers can expect from Specs, Snap is building them on the same Snap OS used in its fifth-generation Spectacles (and likely still on a pair of Qualcomm Snapdragon XR chips). That means the interface and interaction metaphors, such as gesture-based controls, will carry over. But a significant number of new features and integrations, including AI, will start appearing this year, long before Specs arrive.
Platform updates
Spiegel framed the updates by first noting that Snap began working on glasses "before Snapchat" was even a thing, and that the company's overarching goal is to "make computers more human." He added that "with advances in AI, computers are thinking and acting like humans more than ever."
Snap's plan with these platform updates is to bring AI into the real world. It's bringing Gemini and OpenAI models into Snap OS, which means multimodal models will soon be part of the fifth-generation Spectacles and, eventually, Specs. These tools could be used for things like on-the-fly text translation and currency conversion.
The updated platform also adds tools for Snap Lens builders that integrate with the display capabilities of Spectacles and Specs.
A new Snap3D API, for example, will let developers use GenAI to create 3D objects inside Lenses.
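Snap hasn't published the Snap3D API's surface yet, so purely as illustration, here's a TypeScript sketch of what a text-to-3D generation call might look like from a developer's perspective. Every name, endpoint, and type below is invented:

```typescript
// Purely hypothetical sketch: Snap has announced the Snap3D API but not
// documented it. Every identifier and URL below is invented for illustration.

interface Mesh3D {
  id: string;
  url: string; // where the generated asset could be fetched from
}

// Imagined text-to-3D call: prompt in, generated mesh handle out.
async function generate3DObject(prompt: string): Promise<Mesh3D> {
  const response = await fetch("https://example.invalid/snap3d/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  return (await response.json()) as Mesh3D;
}

// A Lens script could then attach the generated asset to a scene object.
generate3DObject("a small potted cactus").then((mesh) => {
  console.log(`Generated mesh ${mesh.id}, asset at ${mesh.url}`);
});
```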
The updates will also include a Depth Module AI, which can interpret 2D information to build 3D maps that help anchor virtual objects in the 3D world.
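Snap hasn't detailed how the Depth Module works, but the idea it describes, lifting 2D information into 3D anchor points, rests on standard pinhole-camera math. A minimal TypeScript sketch, with illustrative camera intrinsics:

```typescript
// Illustrative sketch only: Snap has not published the Depth Module API.
// This shows the general principle of lifting a 2D pixel plus a depth
// estimate into a 3D point that a virtual object can be anchored to.

interface Intrinsics {
  fx: number; // focal length in pixels (x)
  fy: number; // focal length in pixels (y)
  cx: number; // principal point x
  cy: number; // principal point y
}

interface Point3D { x: number; y: number; z: number; }

// Unproject pixel (u, v) with metric depth d (meters) into camera space.
function unproject(u: number, v: number, d: number, k: Intrinsics): Point3D {
  return {
    x: ((u - k.cx) / k.fx) * d,
    y: ((v - k.cy) / k.fy) * d,
    z: d,
  };
}

// Hypothetical usage: anchor a virtual label 1.5 m away where the user tapped.
const intrinsics: Intrinsics = { fx: 600, fy: 600, cx: 320, cy: 240 };
const anchor = unproject(350, 200, 1.5, intrinsics);
console.log(`Anchor at (${anchor.x.toFixed(2)}, ${anchor.y.toFixed(2)}, ${anchor.z.toFixed(2)}) m`);
```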
Companies deploying Spectacles (and eventually Specs) may appreciate the new Fleet Management app, which lets developers manage and monitor multiple devices at once, and the ability to deploy Specs for guided navigation through, say, a museum.
Later on, Snap will add WebXR support for building AR and VR experiences inside web browsers.
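WebXR itself is an existing W3C browser standard, so the shape of such experiences is already well defined even if Snap's integration isn't. A minimal sketch of starting an immersive AR session with today's WebXR Device API:

```typescript
// Minimal WebXR immersive-AR bootstrap using the existing W3C WebXR Device
// API. How Snap's tooling will expose or wrap WebXR is not yet documented.

// Cast through `any` so this compiles without WebXR type definitions.
const xr = (navigator as any).xr;

async function startAR(): Promise<void> {
  // Feature-detect first; immersive-ar is unavailable on many desktops.
  if (!xr || !(await xr.isSessionSupported("immersive-ar"))) {
    console.log("immersive-ar is not supported on this browser/device");
    return;
  }

  // In practice this must be called from a user gesture (e.g. a button tap).
  const session = await xr.requestSession("immersive-ar", {
    requiredFeatures: ["local"],
    optionalFeatures: ["hit-test"], // lets apps place objects on real surfaces
  });

  // A real app attaches a WebGL layer so the browser can composite frames.
  const gl = document.createElement("canvas").getContext("webgl", {
    xrCompatible: true,
  });
  session.updateRenderState({
    baseLayer: new (window as any).XRWebGLLayer(session, gl),
  });

  const refSpace = await session.requestReferenceSpace("local");

  // Per-frame loop: read the viewer pose and render (rendering omitted here).
  session.requestAnimationFrame(function onFrame(_time: number, frame: any) {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      // pose.views carries one view per eye with projection/view transforms.
    }
    session.requestAnimationFrame(onFrame);
  });
}

startAR();
```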
Let's make it interesting
Spiegel said that, thanks to Snapchat Lenses, Snap has the largest AR platform in the world: "People use our AR Lenses within our camera 8 billion times a day."
That's a lot, but virtually all of it happens through smartphones. For now, only developers are seriously using Spectacles and their Lens capabilities.
The consumer release of Specs could change that. When I tried Spectacles last year, I was impressed by the experience and found them full of potential, even if not as good as Meta's Orion glasses (the lack of gaze tracking stood out for me).
With a lighter form factor that approaches or exceeds what I found with Orion and have seen on some Samsung Android XR glasses, Specs could vault to the front of the AR glasses pack. That is, provided they don't cost $2,000.