With its announcement on December 9th that it would be releasing AI-powered smartglasses in 2026, Google joined Meta in entering a market that has been predicted for years but has never made it to the mainstream. Google was arguably one of the earliest players in the space, releasing the ill-fated Google Glass to the public in 2014 before pulling back a year later. Others, like the North Focals, were promising but beset by high costs and limited availability – in the case of the Focals, users had to travel to one of two locations to be fitted before they could purchase the device. But with the release of the Meta Display glasses earlier this year and rumors that Apple will announce glasses next year, we could be hitting peak smartglasses very soon.
In the short term, there are plenty of concerns, ranging from the mundane (battery life, comfort) to the profound (privacy, consent around videos and photos, facial recognition). All of these questions will likely be answered in some form in the coming years, and norms will shift and reorient with them; smartphones, for instance, already allow people to film surreptitiously, and that capability has been used both by whistleblowers and for nefarious ends. But as we move towards wider adoption, it is also worth discussing how these devices will impact work, both blue- and white-collar. Just as smartphones and AI have revolutionized the way we work on job sites and in offices, the glasses will change how we relate to each other and to our labor.
Smartglasses are already being used in certain enterprise settings, and those uses will likely expand in the coming years. For instance, warehouse workers can wear glasses that show them exactly where to go to pick the next item in their queue; rather than having to wander or memorize a layout, they can simply follow a map. The glasses can also display item counts and send warnings when something is running low so that it can be restocked. Beyond warehouses, this could be deployed in any number of retail environments, making it much easier for overworked frontline workers to locate items and share product information.
Additionally, smartglasses can incorporate overlays to help trainees in the trades learn and produce more quickly. There is already a shortage of skilled tradespeople, and even if some of the currently unemployed young college grads make the switch, there still won’t be enough people to fill certain roles. Younger workers can use overlays in the glasses to follow directions, getting them into the field more quickly and getting customers served faster. And while there will always be repairs and work that require a specialist, overlays and step-by-step directions can allow customers to handle some of their own simple fixes, freeing up professionals for the more interesting tasks.
On the white-collar side of the house, real-time translation features in the glasses will open up a whole new world for doing business globally. Many smart, capable people are shut out of the market because they lack proficiency in certain languages; translation built into the glasses removes that barrier and will help bring a new pool of talent online.
As AI takes over more rote tasks, management and relationship skills will become more important, and that requires both knowing who people are and knowing something about them. Facial recognition is going to be a thorny topic as the glasses come to market, but if consent is built in, it could be a boon for lots of people, including folks who are neurodiverse, face-blind, or just not good with names. Rather than having to recite your CV when you meet someone, as long as you’ve opted in, they can see your top-level details and the conversation can go from there. Outside of work, this could be an incredible benefit for people with dementia, as the glasses could prompt them when they are speaking with family and care providers.
Then there are niche applications by field. Emergency medicine, for example, could benefit from apps that guide people through life-saving care on the spot. I’d always defer to a professional, but if a friend sliced their hand open while I was around, a smartglasses app walking me through stopping the bleeding and providing rudimentary care before getting them to a doctor could save a life; that impact would only be amplified in conflict zones.
We are only beginning to scratch the surface of what can be built for these devices and how people will use them in the future. Twenty years ago, the idea that every single person in any workplace would have a smartphone seemed extreme; today, it’s taken for granted. By 2035, someone not wearing glasses will likely be the odd one out.