Monday, 20 April 2026

Smart Glasses: AIsplaining, Useful, Creepy, but not Transformative

 

Back in December I bought a pair of Meta Oakley Vanguard wrap-round AI glasses to try out and be able to talk about with clients and at my January Tech Trends seminars. I hadn't had much opportunity to wear them before those events due to the terrible weather, but since then I've been wearing them regularly. Here are my conclusions.

Fit and Wear

First of all, these are great sunglasses - the optical quality is superb. I opted for the golden color and it gives a glorious, bright shine to any view, enhancing perceived colors. My eyes are well protected from wind, dust, and pollen. The electronics do not add significantly to the weight, and the glasses sit comfortably even on my large head.

Voices in my Head

The audio quality is surprisingly good, without compromising your ability to hear what is happening around you. I do not listen to music when out and about as I like to hear my environment, and I prefer the sound coming from near my ears rather than having something in or over them. I can, however, imagine circumstances when listening like that would be pleasant, for example relaxing on a sun lounger. The only problem I have come across with streaming music is that it occasionally just starts randomly. I have yet to figure out what causes this as it does not happen that often, and inevitably only when I am busily engaged in something else.
Meta AI now offers three voices here in the UK: Judi Dench, a man, and a disdainful young Englishwoman. Dame Judi is definitely the best option. Purely in terms of synthesis quality they are all remarkable, but they fall apart hilariously when trying to pronounce foreign words and names.
While I am not a huge fan of voice interaction, I have found that the glasses suit certain applications well. Classic assistants such as Alexa and Nest Home end up with the three Ts: Tunes, Trivia, and Timers. Smart glasses risk ending up with some similarly simple activities, for example asking the time, or requesting running stats without having to look at a smart watch. Much of the time it really is that trivial, but hearing new messages read out does have an appeal, as does replying verbally. It is much less silly than talking to your watch, which was one of the early applications of smart watches.

Camera

It is really sad that the camera has been demonized by a few creepy individuals, as it really is useful. I take many photos with a wide variety of devices, ranging from absolute top-end equipment down to the wonderful Kodak Charmera. The camera on the Oakley glasses is in the mid-to-low quality range, similar to a low-end phone, but it is the ability to capture pleasant views while doing something else that makes it interesting. Noticing the light hitting a building while out running and being able to capture the view without breaking stride is not earth-shattering, but it is pleasant. On the other hand, the ability to quickly capture things as an aide-mémoire without pulling out a phone is genuinely useful.
I have used the camera to record some POV Parkour, but the field of view is not wide enough to make it engaging. I tried recording some hyperlapse while running, only to discover that I am apparently looking all over the place, resulting in very jerky footage; with more careful planning it could make some quite good material. Sadly, I think the best use of video is recording conversations with people, which has some positive applications as well as the creepy ones, or capturing incidents, but there are many better use cases.
The camera really excels when combined with AI. The potential as an assistive technology for people with reduced vision is enormous. You can ask the glasses to tell you what you are looking at and they reply with remarkable accuracy. Despite one of my novels being called The Botanist, I struggle to identify flowers, plants, and trees, so I was delighted to be able to look at a flowering tree, say "Hey Meta, what kind of tree is this?" and get good answers.
Another prime use case is being able to read signs and translate other languages. I need prescription glasses to read now, but the smart glasses are not corrective, so having the words on a plaque I am looking at read out to me is fabulous. I have also tried translating local shop signs in Chinese and Arabic, and the translation corresponded to the English names of the shops, which is encouraging. I did make sure that the glasses were not reading the English. On the other hand, the glasses dangerously misinterpreted a road sign at first, but corrected themselves and apologized. They also used the American term "traffic circle" instead of the UK "roundabout", which might confuse someone who does not spend time in the USA.

AI Capabilities

It's a good job that Bluetooth normalized wandering around apparently talking to yourself, as it is remarkably easy to get into discussions with the Meta AI. Unfortunately quite a few of these degenerate into fights about how and what to do. Meta AI seems very eager to tell me how to do things on my phone. I am going to call this "AIsplaining." While I had the dismissive young woman's voice enabled she told me that I'd received a long WhatsApp message from my wife, read out part of it, and then bluntly told me to read the rest on the phone. Rather than telling me that I could get stats from my Garmin watch by asking for them individually - "Hey Meta, what is my pace?" - the AI denied strenuously for many minutes that there was any integration with Garmin, which was very odd. There is, apparently, some way to take a photograph with the stats nicely inserted into it, but I have yet to figure it out.
Another experience was trying to get the glasses to read out a message a second time. The AI adopted a positively hostile tone and kept telling me to read it on my phone. Very odd, especially since it had read the message out a couple of minutes earlier. Also, I could not send a message in French; it was as if I had said nothing. I have had incidents of AIsplaining when replying, but when it does work, replying to a message in English is effective and can be done while your hands are occupied, wet, or dirty.
The most impressive AI capabilities are definitely the ones associated with the camera. I even tried asking the glasses where I was, knowing that there is no GPS embedded in them, and they were able to tell from the distinctive view of the crags that I was in Holyrood Park in Edinburgh.

Conclusion

First of all, I don't think anybody will wear smart glasses unless they need prescription glasses or there is bright sunshine. It is just too much of an imposition. However, if you do wear glasses, the added capabilities are useful "nice to have" features for most people, not more. They are not transformative in the way that smartphones are, or even as useful as the more discreet smart watch. They could, however, be completely life changing for those with impaired vision, especially using the "Be My Eyes" service.
