Snap is back with a new pair of AR smart glasses, and for the first time in years, it's ready to sell them to consumers.
The company plans to start selling the new glasses, called Specs, to consumers in 2026. CEO Evan Spiegel unveiled them at the Augmented World Expo in Long Beach, California on Tuesday, and a Snap spokesperson confirmed to TechCrunch that the glasses will ship in 2026.
Specs offer many of the same augmented reality and artificial intelligence features found in the company's developer-focused smart glasses.
Specs have see-through lenses that display graphics as if they were projected onto the world around the wearer. The glasses also include a Snap-built AI assistant that can process both audio and video.
The Specs announcement comes nearly 10 years after Snap first tried to sell consumer smart glasses with the 2016 launch of its original Spectacles. Snap was arguably ahead of its time, but it now faces fierce competition in the AR glasses market from giants like Meta and Google.
Meta is reportedly planning to release glasses with a built-in screen, codenamed Hypernova, in the second half of 2025. Meanwhile, Google recently announced partnerships with Warby Parker, Samsung, and other companies to develop Android XR smart glasses.
Snap hopes that its Snap OS developer ecosystem, which it has spent the past few years building, will give it an edge in the AR race. The company said the millions of AR experiences, called Lenses, that developers have built for Snapchat will also work on the new Specs.
On stage, Spiegel showed off some of these Lenses. One, Super Travel, translates signs and menus for users traveling abroad. Another app Spiegel introduced, Cookmate, finds recipes based on the ingredients available in the user's kitchen and then provides step-by-step cooking guidance.
Companies have demoed use cases like these for years, but they have struggled to deliver affordable, comfortable smart glasses capable enough to give everyday consumers a taste of AR. Snap seems to believe it has cracked that with Specs, but some details are still unknown.
Snap did not disclose on Tuesday how much Specs will cost, how the glasses will be sold, or exactly what they will look like.
Snap also announced several developer updates to enhance its Snap OS platform. Developers can now build apps powered by multimodal AI models from OpenAI and Google DeepMind. To enable more AI-powered apps, the company announced a Depth Module API that anchors AR graphics generated by large language models in 3D space.
Looking ahead, Snap says it will partner with Niantic Spatial, a company spun out of Pokémon Go creator Niantic, to build an AI-powered map of the world.
It remains to be seen whether all of these efforts will translate into a pair of smart glasses that consumers actually want to buy. Meta has found early success with its Ray-Ban Meta smart glasses, but Snap's Specs could be considerably more expensive. To get consumers on board, Snap may need to turn AR glasses from a novelty into a practical device.