
How Meta’s Orion AR glasses and wristband revolutionise accessibility

Exploring how Meta’s inclusive AR innovations enhance accessibility for disabled people

Meta’s Orion AR glasses and neural EMG wristband

At Meta’s annual developer conference on Wednesday, CEO Mark Zuckerberg unveiled a prototype of new augmented reality glasses, showcasing the company’s latest venture into smart eyewear. In addition, Zuckerberg revealed that Meta AI will soon feature the ability to speak in the voice of Dame Judi Dench.

Named Orion, the demo glasses—not yet available for consumer release—can overlay digital images of media, people, games, and communications onto the real world. Meta has positioned this product as a move beyond desktop computers and smartphones, offering eyewear capable of handling similar functions.

Having believed in the accessibility potential of Meta’s existing Ray-Ban Meta smart eyewear from the very start, I’m genuinely excited about the company’s latest developments. The impressive demo of the Orion AR glasses suggests that Mark Zuckerberg might actually achieve his ambitious goal of AR glasses replacing smartphones. But what truly stands out is the accompanying neural electromyography (EMG) wristband technology, which could be revolutionary for people with severe mobility impairments like myself.

A new era of AR technology

The Orion AR glasses showcased in Meta’s demo are not just another piece of tech; they represent a significant leap toward integrating augmented reality seamlessly into our daily lives. With features that could potentially replace many functions of a smartphone, these glasses aim to make technology more intuitive and less obtrusive.

Neural Wristband: an accessibility game-changer

What sets the Orion glasses apart is the neural wristband designed for controlling them. This device uses EMG to translate subtle neural signals into digital commands. Essentially, it picks up the electrical signals sent from your brain to your hand muscles and interprets them to control the glasses.
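To make that idea a little more concrete, here is a purely illustrative sketch of how an EMG-driven controller might turn muscle-activity readings into a simple command. This is not Meta's software, whose internals are not public; every name, threshold, and signal source below is hypothetical, and the real system will be far more sophisticated.

```python
# Illustrative sketch only: a toy EMG "pinch" detector, not Meta's implementation.
# read_emg_sample() is a hypothetical stand-in for a wristband sensor stream.

import math
import random
from collections import deque

WINDOW_SIZE = 50          # samples per analysis window (hypothetical)
PINCH_THRESHOLD = 0.35    # RMS level treated as an intentional pinch (hypothetical)

def read_emg_sample() -> float:
    """Stand-in for a wristband sensor: returns a simulated raw EMG value."""
    return random.gauss(0.0, 0.2)

def rms(samples) -> float:
    """Root-mean-square amplitude, a common measure of muscle activation."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def detect_commands(num_windows: int = 20):
    """Slide a window over the signal and emit a command when activation spikes."""
    window = deque(maxlen=WINDOW_SIZE)
    for _ in range(num_windows * WINDOW_SIZE):
        window.append(read_emg_sample())
        if len(window) == WINDOW_SIZE and rms(window) > PINCH_THRESHOLD:
            yield "select"   # e.g. tell the glasses to activate the focused item
            window.clear()   # simple debounce so one pinch maps to one command

if __name__ == "__main__":
    for command in detect_commands():
        print("command:", command)
```

A fixed threshold like this is exactly what Meta says it wants to move beyond: its stated aim, quoted below, is input algorithms that adapt to each person's unique movements rather than expecting every hand to produce the same signal.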

For people with severe mobility impairments who can only perform subtle movements with their hands, this could be transformative. The wristband requires only minimal hand movements—or even just the intention to move—to operate. This is a stark contrast to the wrist acrobatics often needed to control devices like the Apple Watch. It makes me wonder: is it time for Apple to develop a smart wristband of its own?

Inclusivity at the core of Meta’s design

Meta’s approach to this technology is notably inclusive. According to their blog post on the EMG wristband:

“We’re designing this technology with inclusivity in mind, developing wearable devices and input control algorithms that are performant across behavioural, physiological, and motor abilities. Regardless of the size and shape of your hand or how you move it, human-computer interaction with an EMG wristband should simply work—and it may even be able to adapt itself to your unique movements over time.”

As a tech accessibility advocate with a severe physical disability, this commitment is both encouraging and essential. Technology should adapt to the user, not the other way around.

Enhancements to Ray-Ban Meta smart glasses

In addition to the Orion demo, Meta used its Connect conference to announce new AI features coming to the existing Ray-Ban Meta smart glasses. These enhancements include:

•   Real-time translations
•   AI-powered reminders
•   More intuitive control via voice commands

In terms of accessibility, Meta announced it will be partnering with Be My Eyes, a free app that connects blind and low-vision people with sighted volunteers who can talk them through what is in front of them. Thanks to the glasses and point-of-view (POV) video calling, a volunteer can see the wearer’s point of view and describe their surroundings, or offer real-time, hands-free help with everyday tasks like adjusting the thermostat or sorting and reading mail.

As a user with an upper limb mobility impairment, I am particularly glad to see the ability to ask Meta AI to record and send voice messages on WhatsApp and Messenger. It’s something I have long called for. I just hope Meta also remembers a feature for playing incoming voice messages hands-free.

Such features not only enrich the user experience but also significantly enhance accessibility for disabled people.

The future of accessible technology

The integration of advanced AR glasses and an inclusive control mechanism has the potential to make technology more accessible than ever before. For disabled people, this could mean greater independence and a better quality of life.

While there is still much work to be done, Meta’s efforts signal a positive direction for the industry and raise the bar for other tech giants to prioritise accessibility in their innovations.

Conclusion

The future of AR technology is not just about flashy features and sleek designs; it’s about making technology work for everyone. Meta’s Orion AR glasses and neural EMG wristband are promising steps toward an inclusive tech landscape. As someone with a severe physical disability, I am hopeful and eager to see how these developments will shape the future of accessibility.

What are your thoughts on Meta’s approach to accessibility in AR technology? Let us know in the comments below!

Read more about Meta’s inclusive approach here: Meta’s Blog on EMG Wristband.

Colin Hughes is a former BBC producer who campaigns for greater access and affordability of technology for disabled people
