The Ray-Ban Meta smart glasses launched last October, succeeding the first-generation Ray-Ban Stories. A collaboration between Meta, the company formerly known as Facebook, and Ray-Ban, the iconic eyewear brand, they offer a range of features, such as camera, audio, and AI, in a stylish and comfortable form factor. They come in two styles, Wayfarer and Headliner, with various colours and lenses, cost from £299, and are available in selected countries.
While Meta does not explicitly market the smart spectacles’ accessibility potential, what sets these glasses apart from other smart wearables is their capacity to enhance accessibility and improve the quality of life for disabled people.
This blog post reviews the Ray-Ban Meta smart glasses and explores how they can benefit disabled people. It discusses the advantages and challenges of using these glasses and provides tips and best practices for optimising their use. Whether you are a disabled individual seeking a new way to interact with the world or a tech enthusiast curious about the latest innovations in smart glasses, this post is for you.
Camera and sound
The glasses don’t have a display with augmented reality capabilities. They do have a 12MP camera that can take photos and record videos in 1080p resolution, an improvement on the first generation. The photos, which are captured in portrait orientation, and videos can be easily synced to the Meta View app on your phone, where they can be edited and shared. You can also take photos and send them in a WhatsApp message without touching your phone – it’s as simple as saying “Hey Meta, take a photo and send it to dad”. This is a boon for accessibility and feels almost magical for someone like me who has never been able to take my own photos and videos in the digital age because of my disability.
The open-ear speakers deliver quality sound for music, calls, messages, and the voice assistant, with less sound leakage compared to the previous model. The five microphones in the frames ensure clear audio during calls and excellent spatial audio when recording videos.
I have Apple’s AirPods Pro, and while I would never say the audio quality is on a par, I would gladly trade them for the Ray-Ban Meta smart glasses. AirPods Pro fall out of my ears and become uncomfortable after a fairly short time as they dig into my inner ears, but these spectacles are lightweight, comfortable, and never fall off my face.
The cameras on the glasses, and the voice control, are helpful in moments when you want to record what you’re doing, but your hands are occupied, or if you are unable to use your hands because of disability.
A case in point is when my cat Fionek decided he wanted to join in the writing of this article.
Artificial intelligence integration
One of the most interesting features of the glasses is the AI, which is powered by Meta’s technology. The AI can perform tasks such as identifying objects, reading text, translating languages, and answering questions. The AI can also livestream the user’s view to their friends or followers on Meta platforms like Facebook and Instagram.
However, there is not much more I can say about the AI right now, as it is currently only available in the US and requires registration for an early access beta programme. Meta CEO Mark Zuckerberg has promised AI features are coming this year, like saying “Hey Meta, look at this”, which prompts the glasses to take a photo and then answer a question about it. Imagine photographing a work of art in a gallery, or a historic building: in theory, the AI should be able to tell you all about what is in the photo.
Yet, without firsthand experience of the AI’s real-world performance, many, myself included, are left pondering its practicality and effectiveness across everyday situations. It’s evident that the glasses don’t offer ChatGPT-level AI. Only time will reveal whether Meta AI will ultimately meet the expectations set for it.
For UK users, there are some rudimentary signs of AI being deployed when you use the Meta assistant. I have discovered some useful questions the Meta assistant will happily answer, such as “what time is sunset”, “what date is Easter Sunday”, and “will it rain this afternoon”.
Accessibility advantages
One of the major benefits of using Ray-Ban smart glasses is that they can improve accessibility for disabled people, especially those with mobility impairments, and blind or low vision users.
When the AI is fully integrated the glasses should be able to provide blind and visually impaired users with audio feedback, such as describing surroundings, identifying objects, reading text, and translating menus.
Steven Scott is blind and hosts the acclaimed Double Tap podcast. I asked him how the glasses are helping him: “One of the biggest advantages of Ray Bans for me as a blind person is the ability to record video with spatial audio. My entire family can enjoy the content visually, but I get an audio postcard that takes me right back to the scene. It’s like bringing an old photo to life in my ears.”
Hands-free heaven
The glasses are voice-activated and hands-free, which makes them easy to use without touching your phone – something disabled people with upper limb mobility issues will appreciate.
Messaging – Facebook Messenger, SMS, and the one I use the most, WhatsApp, are all available hands-free. You can dictate messages to contacts and hear their replies through the glasses.
Answering calls – without having to pick up a phone – is another significant accessibility selling point of the Ray-Ban Meta Smart Glasses. I have a shortcut set up on my iPhone that will automatically answer calls on the smart spectacles whenever I have them on my face. I have another shortcut that disables auto-answer when I take the glasses off. This is all very convenient and means I never miss a call these days.
As someone who lives in Central London, I often stumble upon interesting and historical sites. Due to my muscle-wasting disability, getting my phone out and snapping a photo is not an option. However, with the Ray-Ban Meta glasses, I can quickly snap a photo or take a video hands-free with a simple voice command: “Hey Meta, take a photo”. It’s as easy as that. For those who struggle to access tech hardware, this feels like a superpower.
Room for improvement
While the Ray-Ban Metas have so many hands-free features, there is a glaring omission – smart home control. This isn’t surprising, as Meta doesn’t have a smart home platform, nor does it offer a Meta smartphone. I would like to see the company make the glasses compatible with the smart home standard Matter, or do a deal with Apple or Google to bring Siri or the Google Assistant to the glasses for voice control of smart home devices. Amazon has Alexa in its smart glasses, and other companies have integrated Siri into their products through an API Apple offers developers. I can’t see this happening, but maybe Matter compatibility could be a possibility in a future version of the glasses.
The smart glasses come with a music feature called Spotify Tap, which requires users to physically lift a hand and tap the frame to initiate music playback. Why isn’t there an alternative option to accomplish the same with a voice command? This would provide accessibility for individuals with upper limb disabilities, ensuring a more inclusive and user-friendly experience.
Another area that could be improved is messaging. Meta has told me the assistant will read out messages up to 10-16 words or 90-93 characters in length, but many of the messages I receive are longer. When this happens, the Meta assistant says: “you have received a long message from John”, and you have no option but to open your phone to access the message. Meta should give you the option to have longer messages read out with a “read it” command.
Including support for emojis in messaging is another feature that Meta should consider adding, bringing a touch of fun and colour to messages. It’s somewhat ironic that the glasses can recognise emojis in received messages, reading them aloud to you, yet there’s currently no functionality to dictate emojis when composing your own messages. Adding this feature would contribute to a more expressive and enjoyable messaging experience.
Ray-Ban Meta Price
You can buy the Ray-Ban Meta Smart Glasses for £299 from Ray-Ban.com.
Adding prescription lenses for all-day use indoors and outdoors could raise the price to over £800, depending on your prescription.
Conclusion
Good points
• Improved open-ear speaker technology for good sound
• Calls are clear
• Excellent hands-free capabilities
• Easy sync of photos, videos with your phone
Bad points
• Limited availability of Meta AI
• Shorter battery life than the first-gen Ray-Ban Stories
• Occasional WhatsApp disconnections after firmware or View app updates
• Poor customer service
The Ray-Ban Meta smart glasses may not appeal to everyone, but they are worth trying if you rely on hands-free capabilities due to disability or visual impairment. Despite some drawbacks, they offer a glimpse into the future of wearable technology and provide a useful extension of your phone, with hands-free access to a camera, an AI assistant, calls, and messages. While rumoured augmented reality smart glasses from Meta might surface at Meta Connect later in 2024, the Ray-Ban Meta Smart Glasses provide a practical hands-free solution for the time being.