Apple has long been a pioneer in accessibility, crafting hardware and software that unlock new possibilities for disabled people. While big tech grapples with shifting priorities around inclusion, Apple’s commitment to diversity, equity, and inclusion remains central to its ethos.
As a disabled person who relies on Apple’s ecosystem daily, I’ve experienced both the transformative power of its tools and the gaps that still hinder full independence.
With 2025 set to be a pivotal year for innovation in AI, voice control, wearables, and third-party app integration, Apple has a unique opportunity to redefine accessibility standards. Here are four critical areas where the company can lead the charge.
Enhanced Voice Control and Siri for greater autonomy
Voice Control and Siri are lifelines for users who cannot rely on traditional input methods like keyboard and mouse. While Apple has made strides, these tools still fall short of their potential. To bridge the gap, the company should focus on:
Personalised speech recognition:
Implementing machine learning algorithms that adapt to individual speech patterns can improve dictation accuracy, especially for users with non-standard speech. While iOS 18 introduced early steps, macOS 15 lags behind.
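Apple's Speech framework already offers a small hook in this direction: apps can supply contextual strings to bias recognition toward words a particular user actually says. A minimal sketch of the idea, assuming a hypothetical personal vocabulary list (microphone capture and permission prompts omitted):

```swift
import Speech

// Hypothetical personal vocabulary, e.g. learned from past corrections.
let personalVocabulary = ["Rover", "HomeKit", "AssistiveTouch"]

let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_GB"))!
let request = SFSpeechAudioBufferRecognitionRequest()
request.requiresOnDeviceRecognition = true     // keep speech data on device
request.contextualStrings = personalVocabulary // bias toward these words

// Audio buffers would be appended to `request` from the microphone;
// that capture code is omitted here.
recognizer.recognitionTask(with: request) { result, _ in
    if let result = result {
        print(result.bestTranscription.formattedString)
    }
}
```

This only nudges a general model towards known words; true personalisation would adapt the underlying model to a user's speech over time.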
Continuous updates and feedback mechanisms:
A responsive system for users to report Voice Control errors directly to Apple would accelerate fixes. Currently, updates for this mission-critical tool are frustratingly slow, leaving users stranded with unresolved bugs.
Contextual understanding:
Current voice control systems struggle with complex commands and with interpreting intent. Apple should invest in AI to make dictation and commands in Voice Control and Siri more natural. For example, if you dictate the word “Apple”, Voice Control moves your cursor to the Apple logo in the top-left corner of your Mac screen instead of transcribing the word “apple”. There are many other examples of Voice Control failing to understand user intent.
I’d like to see AI-driven dictation that offers seamless formatting, discreet corrections, and a 99% accuracy rate. Voice Control should intuitively understand your intentions without requiring you to memorise complex custom commands.
System-wide consistency:
Voice Control doesn’t always work seamlessly across apps and interfaces. For example, Safari supports some commands better than third-party apps. Apple must ensure consistency for a smoother experience.
Long-term memory:
Currently, Voice Control lacks the ability to learn from corrections. If a recognition error occurs in your dictation and you correct it, the correction isn’t retained for future use. For instance, if Voice Control consistently misinterprets how you pronounce your dog’s name, “Rover,” it will continue to make the same mistake every time. This results in a frustrating and inefficient dictation experience. Building a memory feature would reduce repetitive fixes and boost efficiency.
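To make the idea concrete, here is a minimal sketch of what such a correction memory could look like. All names are hypothetical; this is not how Voice Control works internally, just the shape of the feature:

```swift
import Foundation

/// Sketch of a correction memory: remember each fix the user makes
/// and apply it to future transcripts. Names are hypothetical.
struct CorrectionMemory {
    private var corrections: [String: String]
    private let storageKey = "voiceControl.corrections"

    init() {
        corrections = UserDefaults.standard
            .dictionary(forKey: storageKey) as? [String: String] ?? [:]
    }

    /// Record that `heard` should have been transcribed as `intended`.
    mutating func learn(heard: String, intended: String) {
        corrections[heard] = intended
        UserDefaults.standard.set(corrections, forKey: storageKey)
    }

    /// Apply every remembered correction to a new transcript.
    func apply(to transcript: String) -> String {
        corrections.reduce(transcript) { text, fix in
            text.replacingOccurrences(of: fix.key, with: fix.value)
        }
    }
}

var memory = CorrectionMemory()
memory.learn(heard: "Grover", intended: "Rover")
print(memory.apply(to: "Take Grover for a walk"))  // "Take Rover for a walk"
```

A real implementation would need context-aware matching rather than blind substring replacement, but even this naive version shows how little state is needed to stop repeating the same fix forever.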
Smarter Siri integration:
Siri has the potential to take on a more significant role by handling advanced voice commands, transforming it into a truly valuable assistant for disabled users. It will be interesting to see whether Apple enhances Siri’s capabilities for people with atypical speech, and what the upcoming upgrades in iOS 18.4 later this year will mean for accessibility users.
By infusing AI into these tools, Apple can transform Voice Control and Siri into indispensable, adaptive companions for disabled users.
Why does this matter? Severely disabled people often struggle with impaired speech, and limited breath and energy, making it crucial for Voice Control and Siri to operate efficiently and minimise constant recognition errors. This isn’t just about convenience; it’s about creating a better, more accessible user experience.
Reimagining the Apple Watch for universal access
Despite marking its 10th anniversary in 2024, the Apple Watch still presents barriers for users with limited mobility. Features like AssistiveTouch and Double Tap aim to reduce reliance on touch, but they require specific wrist movements that many cannot perform. For example, waking Siri demands raising the watch, an acrobatic feat beyond many people with severe physical disabilities.
Prioritise always-on Siri
Enabling an “always listening” mode for Siri, similar to the iPhone and Mac, would eliminate the need for physical interaction. Voice activation alone should suffice to send messages, control smart home devices, or monitor health metrics.
Expand alternative methods
Innovations like neural electromyography (EMG) wristband technology could revolutionise accessibility for people with severe mobility impairments. For instance, a future smart wristband designed for the Apple Watch could utilise EMG technology to convert subtle muscle signals into digital commands. By detecting electrical activity sent from the brain to the hand muscles, this system would interpret those signals to enable device control. Capturing even the faintest neuromuscular impulses, it has the potential to offer seamless interaction for users with minimal physical movement.
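Because this is speculative hardware, any code can only gesture at the idea. The sketch below, with entirely hypothetical names and thresholds, shows the basic shape of an EMG decoder: rectify and smooth the raw signal, then treat a sustained rise in the envelope as a deliberate command:

```swift
import Foundation

/// Hypothetical sketch: turn a stream of raw EMG samples from a wristband
/// into discrete commands. Real systems use far more sophisticated signal
/// processing and machine learning; this shows only the basic shape.
enum WristCommand { case select, none }

struct EMGDecoder {
    private var window: [Double] = []
    private let windowSize = 50   // sliding window for smoothing
    private let threshold = 0.3   // activation level, tuned per user

    mutating func process(sample: Double) -> WristCommand {
        // Rectify the signal and keep a sliding window of recent samples.
        window.append(abs(sample))
        if window.count > windowSize { window.removeFirst() }

        // The smoothed envelope approximates muscle activation strength.
        let envelope = window.reduce(0, +) / Double(window.count)
        return envelope > threshold ? .select : .none
    }
}

var decoder = EMGDecoder()
let samples = (0..<100).map { _ in Double.random(in: -0.5...0.5) }
for s in samples where decoder.process(sample: s) == .select {
    print("Muscle activation detected: trigger .select")
}
```

The crucial accessibility property is the per-user threshold: calibrated low enough, even the faintest intentional impulse becomes a reliable input.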
Empowering developers and third-party hardware
In 2018, Nuance discontinued its Dragon Professional for Mac dictation product, leaving many disabled users without a vital tool for communication and productivity. This decision highlighted the challenges third-party developers face within Apple’s restrictive ecosystem. Nuance’s move underscored the need for Apple to empower developers to create and maintain essential accessibility tools.
Apple’s walled garden approach has long been a double-edged sword. While it ensures a consistent and secure experience for users, it can also limit the potential of third-party devices and software to fully integrate with its ecosystem.
A more recent example of developer frustration is Meta’s call for Apple to open up the iPhone to provide more interoperability and deeper integration for its Ray-Ban Meta smart glasses. As an iPhone user, I would like to use my Ray-Ban Meta smart glasses to control my HomeKit devices, but Apple’s restrictions make that impossible.
For disabled people, deeper interoperability could be transformative. Opening up the iPhone, Mac and iPad more fully to third-party hardware and software could foster a wave of accessibility innovation.
Apple has a history of resisting such openness, but as competition increases, user demand grows, and authorities like the EU require change, it may need to reconsider. Doing so wouldn’t just benefit competitors like Meta; it would directly benefit disabled users by fostering a broader ecosystem of accessible technology.
Apple’s smart glasses: a game-changer for accessibility
The fourth area of accessibility improvement I’d like to see this year involves a product Apple doesn’t currently offer but is widely rumoured to be developing.
Speculation about Apple releasing smart glasses has been circulating for years, including reports that the company is working on a competitor to Meta’s Ray-Ban glasses.
While Bloomberg’s Mark Gurman has just reported that Apple has cancelled a project to develop AR glasses that connect to a Mac, he says the company is still working on the underlying technology for standalone glasses that could launch in the future.
I believe 2025 could be the perfect time for the company to introduce a basic form of smart glasses without a display. Although I don’t expect to see Apple smart glasses this year, the accessibility benefits of such a product could make a huge difference to the lives of disabled people.
Smart glasses have the potential to open up entirely new ways for disabled people to interact with technology and their environment:
Visual assistance:
For people with visual impairments, smart glasses could provide real-time object recognition, text-to-speech functionality, and enhanced navigation assistance. Imagine walking through a busy street and receiving auditory cues about obstacles or directions directly through the glasses.
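The building blocks for this already exist in Apple’s frameworks. A minimal sketch using Vision’s text recognition and AVSpeechSynthesizer, assuming a camera frame has already been captured as a CGImage; the glasses hardware is hypothetical, but the APIs are real:

```swift
import Vision
import AVFoundation

let synthesizer = AVSpeechSynthesizer()  // retained so speech isn't cut off

/// Recognise any text in a captured frame and read it aloud.
func speakText(in frame: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation]
        else { return }
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }
        synthesizer.speak(AVSpeechUtterance(string: lines.joined(separator: ". ")))
    }
    request.recognitionLevel = .accurate  // favour accuracy over speed

    try? VNImageRequestHandler(cgImage: frame, options: [:]).perform([request])
}
```

The same pipeline could swap text recognition for object classification, feeding the auditory cues about obstacles described above.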
Hands-free interaction:
Disabled users who struggle with traditional input methods could use voice commands or eye tracking to control their smart glasses, creating a seamless and more independent experience. Ray-Ban Meta smart glasses have helped me document my life and share photos and videos hands-free with voice commands for the first time.
Health monitoring and alerts:
For users with health conditions, smart glasses could work in tandem with the Apple Watch to display critical health metrics, such as heart rate or oxygen levels, directly in their field of view, without the need to pick up an iPhone or raise a wrist.
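The Watch side of this is feasible today. A minimal sketch that reads the latest heart-rate sample via HealthKit; the glasses heads-up display is imaginary, so a print statement stands in for it, and the HealthKit authorization request is omitted for brevity:

```swift
import HealthKit

let store = HKHealthStore()
let heartRate = HKQuantityType.quantityType(forIdentifier: .heartRate)!

// Fetch the most recent heart-rate sample recorded by the Watch.
let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierEndDate,
                                   ascending: false)
let query = HKSampleQuery(sampleType: heartRate, predicate: nil,
                          limit: 1, sortDescriptors: [newestFirst]) { _, samples, _ in
    guard let sample = samples?.first as? HKQuantitySample else { return }
    let bpm = sample.quantity.doubleValue(for: HKUnit(from: "count/min"))
    // A glasses display API is hypothetical; print stands in for it.
    print("Heart rate: \(Int(bpm)) bpm")
}
store.execute(query)
```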
Apple’s expertise in accessibility and its existing ecosystem make it uniquely positioned to create a wearable product that combines cutting-edge features with robust accessibility tools. By focusing on disabled users’ needs, Apple could ensure its smart glasses are not only a hit with the general public but also an essential tool for empowering millions of disabled people worldwide.
Conclusion
Apple’s legacy of accessibility is undeniable, but 2025 demands bold action—especially amid growing political pushback on inclusion. By refining Voice Control, reengineering the Apple Watch, embracing third-party collaboration, and pioneering smart glasses, Apple can empower millions of disabled users to live more independently.
As someone who relies on these tools every day, I urge Apple to seize this moment. Inclusion isn’t just a virtue; it’s a responsibility and a business opportunity. By prioritising accessibility, Apple won’t just innovate; it will redefine what’s possible for millions of disabled people worldwide.