AR devices are having a moment in the sun, but getting access to them is not as straightforward as it could be. One device that has created a lot of buzz lately is Snap’s fifth-generation release of its Spectacles. Like other recent releases, these glasses are not available to the general public — the website encourages potential users to “apply to join the Spectacles Developer Program to build, play, and test with Spectacles,” which requires a commitment of 12 months, priced at $99 per month.
One important distinction is that the new Spectacles let users develop experiences with Snap’s Lens Studio 5.0. This sets them apart from other devices on the market: Snap already has an ecosystem of very successful tools within Lens Studio and a significant user base across its platform. Around 850 million monthly active users give the company a real running start on its competitors, a figure that has grown by roughly 50 million users year over year. On top of that, more than 350,000 Lens creators already use Lens Studio, and they have built nearly 3.5 million Lenses over the years.
But is the device worth the commitment and the cost? And what is it like to actually use? Once again, I spoke to creatives working with AR technologies to find out their unbiased opinions.
Aesthetics and Display of Spectacles
Self-expression technologist Sasha Zabegalin was impressed by how the new device blends technology and fashion: “They’re honestly the coolest XR glasses I’ve ever worn. The style is a bit out there, but I kind of love that about them. They’ve got this bold, high-fashion feel, which is so rare for a tech device. I once had someone in LA ask if they were new Balenciagas!”
David Robustelli, founder and creative director at Beyond.Studio, tells me “the glasses are lightweight (226 grams), making them easy to wear for extended periods, with a bright display and smooth frame rate.” The new Spectacles offer a 46-degree field of view at a resolution of 37 pixels per degree, providing impressively sharp visuals via optical waveguides and liquid crystal on silicon (LCoS) micro-projectors.
The immersive experience when wearing them is also very smooth: “Snap’s Spatial Engine understands the world around you so that Lenses appear realistically in three dimensions. Plus, a 13 millisecond motion-to-photon latency renders Lenses with incredible accuracy, integrating them naturally into your environment,” according to Snap’s AR platform director, Sophia Dominguez.
When discussing the user interface and appearance of augmentations, Robustelli adds, “Lens experiences support up to 25MB, a substantial increase from the previous 8MB on mobile, enabling richer content and more detailed visuals. The gesture control is impressively accurate, trained to recognize movements with minimal errors, enhancing the ease of interaction. The interface itself is intuitive, feeling refined and ready for mainstream use rather than experimental – one standout feature is the menu projected on the hand, offering easy access and making the experience feel natural and futuristic.”
Zabegalin adds, “You are very aware it’s not your full vision, of course, but it’s not bad. It reminds me of the HoloLens — it has a very hologram feel, but I like that because it feels futuristic. It ties in cohesively with everything.”
Spectacles Technical Features
One of the most exciting features is the introduction of electrochromic tint to the lenses. When the tint is activated, the lenses transition from clear to tinted, which enhances readability and contrast, further improving immersion. This is especially useful for applications like watching videos or viewing floating virtual objects where a clear view of the real world is not necessary. “I found that really cool, actually. I was like, ‘Oh, that is so smart.’ Because all AR glasses in the future should be like that—they have to double as see-through glasses and sunglasses… It’s seamless. You don’t even notice it, and it works really well,” says Zabegalin.
Bas Gezelle, co-founder at PLAYAR, explains that with this tint feature, “you can also use the Spectacles outside which is already awesome, but you are hands-free, you interact with content in a more intuitive way than ever, and the world is your screen. Even better, your canvas.” Developers can also build bespoke mobile controllers, linked via the Spectacles app, so a smartphone doubles as an additional input device. “There is already a game in the Spectacles where your phone is the controller to fly a helicopter that can pick up and drop things, completely in 3D space. Try doing that with only hand tracking,” Gezelle adds.
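Snap doesn’t publish the controller protocol behind experiences like this, but the underlying idea is simple: the phone streams its motion and touch input to the glasses, and a Lens maps those samples onto a virtual object each frame. The TypeScript sketch below illustrates only that mapping; ControllerSample, HelicopterState, and the tuning values are assumptions made for the example, not part of the Spectacles SDK.

```typescript
// Illustrative only: models how a phone's motion and touch input could drive
// an AR object such as the helicopter example above. None of these types
// come from the Spectacles SDK; they are assumptions for this sketch.

interface ControllerSample {
  // Rotation-rate commands (e.g. derived from the phone's gyroscope), in radians per second.
  yaw: number;
  pitch: number;
  roll: number;
  // Normalized touch position on the phone screen (0..1), or null when not touching.
  touch: { x: number; y: number } | null;
}

interface HelicopterState {
  position: { x: number; y: number; z: number };
  heading: number; // radians
}

// Apply one controller sample to the helicopter's state for a single frame.
function applyControllerInput(
  state: HelicopterState,
  sample: ControllerSample,
  dtSeconds: number
): HelicopterState {
  const speed = 2.0; // metres per second, tuning value
  const climbRate = sample.touch ? (0.5 - sample.touch.y) * 2.0 : 0; // drag up/down to climb or descend
  const heading = state.heading + sample.yaw * dtSeconds;

  return {
    heading,
    position: {
      x: state.position.x + Math.sin(heading) * sample.pitch * speed * dtSeconds,
      y: state.position.y + climbRate * dtSeconds,
      z: state.position.z + Math.cos(heading) * sample.pitch * speed * dtSeconds,
    },
  };
}

// Example frame update with a made-up input sample at 60 fps.
let heli: HelicopterState = { position: { x: 0, y: 1, z: 0 }, heading: 0 };
heli = applyControllerInput(
  heli,
  { yaw: 0.1, pitch: 0.3, roll: 0, touch: { x: 0.5, y: 0.2 } },
  1 / 60
);
```

In a real Lens, the samples would come from the paired phone via the Spectacles app and the resulting state would drive the helicopter’s transform in the scene.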
Spectacles’ integration with Snap’s Lens Studio 5.0 lets developers create AR content with TypeScript and JavaScript. Through a partnership with OpenAI, Spectacles also integrate multi-modal AI capabilities, enabling developers to incorporate contextual AI that can respond to what users see, say, or hear via cameras and sensors on the device. This feature, combined with their machine-learning library, SnapML, allows the development of AR experiences that feel highly personalized and responsive. The official website adds that “custom ML object trackers turn everyday objects into controllers that interact with digital content, while SnapML allows you to bundle custom ML models directly into your Lens to accurately identify, 3D track, and augment common objects.”
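To make the “objects as controllers” idea concrete, here is a minimal TypeScript sketch of the pattern: a tracker produces detection events for a learned object class, and the Lens attaches digital content to whatever it detects. The ObjectTracker, Detection, and SceneItem types are stand-ins defined for illustration; the actual Lens Studio and SnapML APIs are not reproduced here.

```typescript
// Illustrative sketch of the "everyday objects as controllers" pattern.
// ObjectTracker and SceneItem are hypothetical stand-ins, not Lens Studio types.

interface Detection {
  label: string;                                  // e.g. "coffee_mug", from a custom ML model
  position: { x: number; y: number; z: number };  // tracked 3D position
  confidence: number;                             // 0..1
}

interface ObjectTracker {
  onDetection(callback: (d: Detection) => void): void;
}

interface SceneItem {
  setPosition(x: number, y: number, z: number): void;
  setVisible(visible: boolean): void;
}

// Attach a piece of AR content to a tracked real-world object so that moving
// the object moves the content, turning it into a physical "controller".
function bindContentToObject(
  tracker: ObjectTracker,
  content: SceneItem,
  targetLabel: string,
  minConfidence = 0.6
): void {
  tracker.onDetection((detection) => {
    if (detection.label !== targetLabel || detection.confidence < minConfidence) {
      return;
    }
    content.setVisible(true);
    content.setPosition(detection.position.x, detection.position.y, detection.position.z);
  });
}
```

In Lens Studio, the tracker would come from a custom ML model bundled with the Lens and the content would be a scene object, but the control flow (detect, filter by confidence, reposition the content) is the same.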
Transforming Fashion and Design
Dr Helen Papagiannis, immersive technology advisor and author, states that this “fusion of AI and AR will transform fashion into a highly customizable and expressive medium. Living garments and dynamic accessories will change in response to environmental factors, moods, or activities. We’re entering a time where fashion doesn’t have to be static and can adapt to express a user’s unique aesthetic moment by moment.”
“As we move toward everyday AR wearable glasses, fashion — both in form and function, from hardware design to content creation — will be at the forefront. I help my clients to mindfully navigate and design for this future where fashion and multi-sensorial immersive technology intersect to blend creativity, heritage, and forward-thinking customer experiences.”
City Scale AR experience for ASOS on Spectacles 5 by Beyond.Studio
Robustelli concurs with this vision: “Being able to showcase designs and dynamically change textures, shapes, prints, etc. can improve the design process in a very immersive way. Looking further into the future, accurate body tracking options would allow users to express themselves by wearing digital fashion which other users can see. Like dressing an avatar in a virtual world but instead, dressing yourself with virtual clothing in the real world.”
Collaborative Experiences on Spectacles
Another capability that enables these futures is support for collaborative experiences. “The Spectacles SDK allows users in different locations to see synced dynamic 3D objects and see each other’s interactions with those objects. This makes it possible to collaborate or co-create in real-time,” says Robustelli. “For example, playing a game of chess and seeing the same board and the opponent’s moves without any delay. It also allows two or more users to design in an AR space while working together on a similar object making it a highly social and collaborative experience.”
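Snap’s connected-Lens tooling handles this synchronization for developers, so the sketch below is only a rough picture of what is being kept in sync: each participant broadcasts small state updates, and every device converges on the same shared board. The ObjectUpdate message and SharedBoard class are illustrative assumptions, not the Spectacles SDK.

```typescript
// Illustrative sketch of syncing shared AR objects between participants.
// The message format and SharedBoard class are assumptions for this example,
// not part of the Spectacles SDK.

interface ObjectUpdate {
  objectId: string;                               // e.g. "chess_piece_e2"
  ownerId: string;                                // which participant moved it
  position: { x: number; y: number; z: number };  // shared-space coordinates
  timestampMs: number;                            // used to resolve conflicts
}

class SharedBoard {
  private objects = new Map<string, ObjectUpdate>();

  // Apply an update from the local user or a remote peer.
  // Last-writer-wins by timestamp keeps all devices converging on the same state.
  apply(update: ObjectUpdate): void {
    const current = this.objects.get(update.objectId);
    if (!current || update.timestampMs >= current.timestampMs) {
      this.objects.set(update.objectId, update);
    }
  }

  // Snapshot used by each device to render the board every frame.
  snapshot(): ObjectUpdate[] {
    return [...this.objects.values()];
  }
}

// A local move is applied immediately, then broadcast so the opponent's
// Spectacles apply the identical update.
const board = new SharedBoard();
const move: ObjectUpdate = {
  objectId: "chess_piece_e2",
  ownerId: "player_1",
  position: { x: 0.1, y: 0, z: 0.3 },
  timestampMs: Date.now(),
};
board.apply(move);
// broadcast(move)  // e.g. via the session's messaging channel (not shown)
```

A last-writer-wins rule keyed on timestamps is the simplest way to keep co-located devices consistent; Snap’s actual conflict handling may differ.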
Dr Papagiannis adds, “multiplayer AR fosters co-creation and allows for new shared story spaces that captivate and inspire, breaking the norm of isolated AR experiences.” She describes one example from her first-hand experience: “while Snap introduced Connected Lenses in 2021, the experience has now evolved, allowing multiple devices to share and interact seamlessly. At the recent Snap Partner Summit in Los Angeles, I joined an Es Devlin experience where we interacted with the cosmos, insects, and animals with our hands and voice. Our words collectively wrapped around a virtual globe, creating a powerful shared AR moment that was unique to each group and different each time.”
Implications for AR in Creative Industries
Gezelle feels that devices like this unlock meaningful immersive experiences for younger generations, who “do not just want to passively consume content, they want to engage and create. From story-watching to story-living. Not alone, but with others and that is also one of the extremely valuable features, that you can have the same experience in AR with others in the physical world.”
Zabegalin adds, “it’s a huge chance for anyone, anywhere, to bring their ideas to life, especially in the metaverse. I also think more brands will hop on the virtual try-on train to put on Spectacles, Vision Pro, Orion, all of it. I think what’s most exciting is that our AR creations can finally be freed from the phone! It’s like our ideas are really in the world now, which blows my mind. When I got my ‘screen-wear’ (digital fashion) working — well, sort of working — on the Spectacles, I was honestly emotional.”
The Future of AR
The future looks promising for AR glasses such as these: built around a community of developers, they could redefine how we engage with digital content and physical spaces. As Robustelli explains, “once (and if) this device will replace your mobile phone, people can have personalized interactive experiences and connect with fashion in the most dynamic and immersive way imaginable.” Gezelle succinctly adds, “The world becomes our internet.” This is a future that all of us can get behind.