Smart glasses with microphones, cameras, built-in computers and even AR (augmented reality) have been the subject of spy thrillers for decades. But the reality of face-worn wearables with truly meaningful utility has become something of a white whale for the consumer tech industry.
Of course, this isn’t for lack of trying. I tried Google Glass at the company’s offices in 2013, I was one of the first in the UK to buy the original Snapchat glasses in 2017, and big names like Sony and Oppo have been working on the form factor for years.
Despite their best efforts, no company has managed to strike the balance between form and function needed for mainstream acceptance or adoption.
Foundry | Alex Walker-Todd
The most recent newcomer in this space is the Ray-Ban Meta smart glasses (pictured), which, despite originally launching at the end of 2023, were back in the news in October this year thanks to the expansion of their Meta AI functionality to more international markets, including the UK.
This marks the biggest distinction from Meta and Luxottica’s (owners of the Ray-Ban brand) first foray into smart glasses. 2021’s Ray-Ban Stories featured a similarly interesting mix of technologies, but ultimately proved to be less than the sum of its parts; AI could very well have been the missing ingredient.
Different shades of AI
One of the big hooks of the inclusion of Meta AI in the Ray-Ban Meta glasses is the new multimodality. Using a feature called ‘look and ask’, the glasses can take a picture of what’s in front of you and – using machine vision in combination with Meta’s Llama AI model – explain what you’re looking at.
Depending on your request, you can even use ‘look and ask’ to quickly summarize signage or documentation, extract nutritional information from food packaging, or learn new recipes based solely on the ingredients you have to hand. For the majority of users in the UK, however, this headline facet of the Ray-Ban Meta experience remains inaccessible, with no set date for its arrival.
Apart from those who have updated to the latest Ray-Ban Meta software via a VPN set to the US, or who have been whitelisted for access to the Meta View app’s beta updates, most UK users are still without this defining upgrade.
The reason? A combination of the EU AI Act and GDPR, which have collectively hampered Meta’s AI efforts in the region, leaving local users with a limited experience.
Meta AI on your face
If, like me, you’re a UK-based Ray-Ban Meta user, you’re probably well aware of the limitations the integrated Meta AI experience currently suffers from.
It’s a real boon that Meta’s assistant is always present and completely hands-free in daily use; more accessible than reaching for Gemini or Siri on my phone, and less distracting, because there’s no interface to look at. I can still fire off conventional digital assistant requests – like checking the weather forecast or asking for step-by-step instructions for baking the perfect brownies – but otherwise the experience still feels decidedly sparse and incomplete.
On paper, the combination of the Ray-Ban Meta’s form factor and hardware configuration is a recipe for success when it comes to making AI interaction meaningful in everyday use.
Without the machine vision-driven multimodal component, however, the most useful features of these smart glasses amount to photo and video recording, integrated Bluetooth audio, and the ability to answer calls with solid voice clarity (thanks to a quintet of microphones positioned around the frame). A much less ‘smart’ skill set than the one Meta wants to focus on.
Great for you, less so for everyone else
When it comes to face-worn wearable technology, as mentioned above, no manufacturer has yet cracked the code for mainstream adoption or acceptance, but the Ray-Ban Metas may be the industry’s best attempt yet.
While the company’s augmented reality efforts remain reserved for its Meta Quest headsets, the same privacy concerns raised by the earlier Ray-Ban Stories – not to mention by Meta itself – haven’t exactly gone away with this latest generation of smart glasses.
If anything, such concerns are behind the stunted expansion of the glasses’ headline Meta AI integration and multimodality (outside the US, Canada and Australia). My frustrations come directly from the perspective of a user who knows they can’t fully utilize the cutting-edge technology at their disposal.
That said, I at least know that if I wait, my experience will improve. You could argue that the opposite is true for everyone on the other side of the Ray-Ban Meta’s camera lens. The lack of multimodality actually gives bystanders in the UK a greater degree of privacy than those in regions where the glasses’ full functionality is already available.
Unless you’re someone who already recognizes Ray-Ban Meta smart glasses and understands that they have an integrated camera that can be used to take hands-free photos or videos, and even live stream, chances are you have no say in whether your likeness is recorded, shared online, passed through Meta’s servers, or some combination of the three.
On the one hand, products such as the Ray-Ban Meta glasses are gaining acceptance in society, if only because people don’t necessarily know at a glance that they’re smart. As for their Meta AI integration, while the wearer benefits from it, those around them are likely to be less enthused by these specs’ growing AI-powered repertoire, if they’re even aware of its presence.