
Meta’s Hypernova smart glasses: accessibility must be central

Upcoming high-end wearable tech sparks excitement—and questions about inclusivity

[Image: Close-up of Meta Hypernova smart glasses showing a heads-up display with icons for voice control and AI.]

Meta is gearing up to unveil its much-anticipated Hypernova smart glasses, an advanced wearable expected to launch as early as the end of the year. Following the success and growing adoption of the Ray-Ban Meta glasses, this new iteration aims to significantly elevate the technology—and the price tag, reportedly set around the $1,000 to $1,400 mark.

But as we anticipate cutting-edge features, the real measure of success must be how accessible these innovations are to disabled people.

What to expect from Hypernova smart glasses

According to Bloomberg, the Hypernova smart glasses will feature a built-in heads-up display in the lower-right corner of the right lens, allowing users to see notifications, photos, and app interactions directly within their line of sight. This marks a considerable step forward from the current Ray-Ban Meta glasses, which lack an integrated visual interface.

Moreover, these glasses will reportedly include an upgraded camera with photo quality rivalling smartphones like the iPhone 13. There’s also talk of deep integration with Meta’s AI systems, providing smarter interaction capabilities through voice and gesture-based commands.

Perhaps the most intriguing development is the bundled neural wristband, internally codenamed “Ceres,” designed to interpret neural signals and enable subtle hand gesture controls. This technology, if implemented thoughtfully, could revolutionise the interaction model for wearable tech.

A make-or-break moment for accessibility

However, significant innovations bring significant responsibilities, particularly around accessibility. Meta’s current Ray-Ban glasses have been notably inclusive thanks to built-in hands-free voice control—a feature particularly valued by users with mobility impairments. Ensuring this functionality isn’t compromised or complicated by new features is vital.

Similarly, while the neural wristband controller sounds promising, it’s essential that Meta fully considers disabled users who might struggle with subtle hand movements or have neurological conditions affecting their motor control. Offering customisable sensitivity settings and alternate control mechanisms could broaden the usability of Hypernova significantly.

Lessons from Ray-Ban Meta glasses

Visual accessibility also deserves attention. Users with visual impairments could find the new heads-up display challenging unless Meta integrates comprehensive accessibility options, such as voice feedback, screen readers, and adjustable text size. Inclusion shouldn’t be an afterthought but central to design.

Meta has previously demonstrated a commitment to accessible technology, notably through partnerships like the one with Be My Eyes. Their existing smart glasses have already provided tangible benefits to visually impaired users through integrated assistance and voice control functionalities. Building on these foundations with Hypernova could set a powerful precedent.

Conclusion: designing for everyone

As Meta prepares Hypernova for launch, our excitement is justified—but so is our scrutiny. Accessibility must be woven deeply into every feature, not treated as an optional extra. By prioritising inclusivity, Meta has a genuine opportunity to redefine what truly “smart” glasses can mean for everyone.

Technology should empower, never exclude. Hypernova’s real breakthrough will be ensuring no user is left behind.

Colin Hughes is a former BBC producer who campaigns for greater access and affordability of technology for disabled people
