The idea of smart glasses, eyewear that can do smartphone-like things right on our faces, has been around for years, but 2025 seems to be the year the product category really picks up steam. At this month’s CES (Consumer Electronics Show) in Las Vegas, smart glasses were the star of the show, with brands both established and new showing off their visions of smart eyewear.

Most of these smart glasses use what is called waveguide technology to project visuals in front of the wearer’s face. I’ve tested a few products with this tech before, and while it has an undeniably futuristic feel, the reality is that waveguide tech requires a prism built into the lens, which distorts a section of my vision. On top of that, even when the display is off, the prism is visible to both me and others (see the image below of RayNeo’s smart glasses).

And so a brand named Halliday has decided to try another route with its pair: the Halliday AI glasses still project visuals in front of the user’s face, but the display is mounted on the inside of the frame (the side facing the wearer), above the right eye. This means that to see the screen, the wearer has to glance up slightly, and the projected visuals appear over the glasses’ black frame instead of “on the real world” as with other waveguide glasses. Halliday believes this is a more practical approach.

For one, the display becomes much more discreet: others won’t be able to see the screen at all, unless they make the very unusual move of hovering over your head and peering at your face from inches away. The fact that the display isn’t embedded in the lens also frees the field of view from any prism distortion, and it makes the lenses far easier and cheaper to replace if they get damaged.

I’ve been trying a prototype version of the Halliday glasses for the past week, and I really like this idea. I actually like that I need to glance up to see the screen, so it’s not constantly in my face as a distraction. I also like that this alternative screen tech allows the Halliday glasses to be lighter and thinner (35g) than most other smart glasses, which can look bulky because waveguide technology requires space inside the frame.

I still think the Halliday AI glasses are slightly larger than a regular pair of glasses, but not so much that they attract attention. I have worn them in public, and they haven’t drawn questions from acquaintances (friends would obviously ask, because I don’t normally wear glasses).

It took skilled engineering for Halliday to make the display this discreet: it’s a tiny module measuring just 3.5mm. But despite how small the display is, the projected visuals are very bright (even outdoors, I didn’t need to push brightness above 10%), and the image projected to my eye approximates a 3.5-inch screen.

The glasses do need to pair with a smartphone to work, and Halliday has a well-designed app for iPhones and Android devices. Once paired, the glasses use your smartphone’s data connection. You control the glasses via a touch-sensitive panel on the right arm or a controller ring that Halliday includes in the package.

But the digital AI assistant you use will be Halliday’s own, not Apple’s Siri or Google’s Assistant. Halliday’s AI agent is powered by a multimodal large language model that the company has optimized to run on the glasses. It can do basic digital assistant tasks like answering questions and live interpretation, but it can apparently also be much smarter, acting as a proactive AI agent that can chime in during your daily interactions.

To be honest, I don’t want an AI that’s always listening in, so I turned the proactive feature off (plus, it’s not fully ready for real testing anyway). But the reactive AI works very well. I can ask the glasses to convert a currency and see the result on the display within one to two seconds, which is much faster than other AI glasses I’ve tested.

Live interpretation works mostly well: I had a friend speak Spanish, and the glasses were able to pick up the audio and project English text in front of my eyes. There’s a delay of about one to two seconds, so the speaker needs to occasionally pause or speak slightly slower than usual. I don’t think this can be used to interpret entire meetings or dinner conversations, but it’s enough to help me order food or direct a taxi driver in a foreign country.

Honestly, I don’t really need a pair of smart glasses to go above and beyond answering questions and showing me notifications. I don’t need them to chime in with suggestions, and I don’t need to wear them to watch a foreign movie without subtitles. I just want to be able to ask the glasses to direct me to an address, tell me how much 5,000 yen is in U.S. dollars, or tell me an urgent email just came through: little things that free me from pulling out my phone every two minutes. And on that front, the Halliday AI glasses deliver.

Unfortunately, the product isn’t on sale just yet; Halliday is going the Kickstarter route instead. But the glasses, which start at $399 on the site, are more than fully backed, having already generated over $1.3 million in pledges. Halliday is promising the glasses will ship in April.
