Global Accessibility Awareness Day is in its tenth year of promoting digital access and inclusion for the more than one billion people worldwide with disabilities.
Apple is celebrating the day by making its most popular virtual Today at Apple sessions even more accessible with sessions presented in sign language.
It has to be said the Cupertino company has certainly been punching its weight recently: accessibility for disabled iPhone users received a major boost with the release of iOS 14.5, which now lets users answer phone calls hands-free using Siri voice commands, without having to touch anything.
The option is part of the Announce Calls with Siri feature, which lets you hear the name of the caller when you are wearing AirPods, the company’s popular wireless earbuds.
However, as well as telling you who is calling, Siri now understands commands to handle the call. There’s no need to say “Hey Siri”; you just say “answer” or “decline”.
As someone with a severe physical motor disability, I’m unable to touch the iPhone screen and press the green button to answer a phone call. Being able to answer a call with just my voice makes this undeniably one of the company’s most life-changing features in a long time. Apple doesn’t even class it as an accessibility feature, making it a very inclusive implementation.
When I first tried the capability I cried with joy at its effectiveness. It will change my life: I can now easily answer all the phone calls I receive every day hands-free, with just my voice.
I’ve lobbied Apple to include this ability for the past four years. It always seemed absurd to me that almost since Siri launched you could make a phone call simply by asking the virtual assistant but, until now, you couldn’t answer a call in the same way. It’s great to see the company has finally acted.
However, I am left wondering why the feature is only available when you are “wearing second-generation AirPods” connected to your iPhone. Like a lot of people at home, I often have my iPhone sat locked on a charging stand on my desk, and it would be great just to say “answer” and take the call hands-free on the loudspeaker. Hopefully, the company will extend the capability in iOS 15.
Unfortunately, Voice Control, Apple’s accessibility speech recognition app, can’t assist users with answering and ending phone calls. Voice Control relies on your iPhone being out on display and unlocked with Face ID, but this is not my preferred setup when out and about. I always worry someone may come up, wrench my iPhone from its holder mounted to my wheelchair, and as bold as brass stroll off with it. I prefer to have my iPhone stowed away in a pocket on my wheelchair and to rely on Siri, through my connected AirPods, as my main interface to all my iPhone’s functions. I think in developing Voice Control Apple didn’t understand that not all disabled users want their mobile gear mounted to their wheelchairs on full display, which is what Voice Control relies on to function fully.
Room for improvement
Apple’s Worldwide Developer Conference (WWDC) gets underway on 7th June as a virtual event again this year. The annual keynote is where the company unveils details of the next major releases of its operating systems, such as iOS 15, watchOS 8 and macOS.
Despite the welcome introduction of the answer-calls feature, there is still a tonne of stuff you can’t do hands-free with just your voice on Apple devices, which limits the independence of users with severe physical motor disabilities.
There’s therefore plenty of room for improvement in the company’s 2021 software updates, and these are the issues I would like to see Apple address.
1) Siri, why do you keep me hanging on the telephone?
A glaring flaw in iOS is that users can’t hang up a call with Siri: there’s no “Hey Siri, end call” or “Hey Siri, hang up”, and no option to drop the “Hey Siri” altogether and just say “hang up” when you are on a phone call. Incredibly, users still can’t do this hands-free on an iPhone or a cellular Apple Watch.
It isn’t too much of a problem if the other person on the line can hang up and end the call for you. But what if the call is from a stubborn and persistent telemarketer? There’s nothing you can do except listen to them drone on about how wonderful the thing they are selling is. It also means that if you call a number and go to voicemail, you have to wait until the mailbox times out before the call ends, which can take two or three frustrating minutes. I feel a real sense of powerlessness when this happens to me. This unsatisfactory situation can’t be allowed to persist and I hope Apple will do something about it in iOS 15.
2) Auto answer
Thankfully, perhaps in no small part due to The Register highlighting my experience, auto-answer was introduced as an accessibility option in iOS 11 in 2017. However, the implementation has shortcomings because you have to touch the screen to turn it on and off. I can’t do that, and there is no Siri command option such as “turn on auto-answer”, so a feature designed to help people who can’t touch the phone screen, ironically, requires you to do just that to turn it on and off. You might not want every call auto-answered, or might only want it at certain times of the day. At the moment the auto-answer accessibility feature is a crude catch-all, with no way to whitelist certain contacts you want auto-answered.
There is no watchOS support for auto-answer either, despite cellular versions effectively being wrist-worn phones.
Auto-answer remains important for severely physically disabled people because the less you have to project your voice the better; it can help save vital bits of energy, so automation matters. I think there should be a way of creating Siri Shortcuts involving the auto-answer feature and automating your personal preferences for handling calls that way. For example, perhaps you want your calls auto-answered between certain times but not during the night; the sketch below illustrates the kind of policy I have in mind.
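To make the idea concrete, here is a minimal sketch in plain Swift of the rules I would want a Shortcut to be able to express. It is entirely hypothetical: iOS exposes no public auto-answer API today, and the AutoAnswerPolicy type, its contacts, and its hours are all my own invention.

```swift
import Foundation

// Hypothetical sketch only: iOS has no public auto-answer API.
// This models the kind of policy I would like Shortcuts to express.
struct AutoAnswerPolicy {
    var allowedContacts: Set<String>   // contacts to auto-answer (a "whitelist")
    var startHour: Int                 // e.g. 8  -> auto-answer from 08:00...
    var endHour: Int                   // e.g. 22 -> ...until 22:00

    func shouldAutoAnswer(caller: String, at date: Date = Date()) -> Bool {
        let hour = Calendar.current.component(.hour, from: date)
        let withinHours = (startHour..<endHour).contains(hour)
        return withinHours && allowedContacts.contains(caller)
    }
}

// Auto-answer family and carers during the day, but never at night.
let policy = AutoAnswerPolicy(allowedContacts: ["Mum", "Carer"],
                              startHour: 8, endHour: 22)
print(policy.shouldAutoAnswer(caller: "Mum"))          // true between 08:00 and 22:00
print(policy.shouldAutoAnswer(caller: "Telemarketer")) // always false
```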
3) Messaging apps
Many popular third-party messaging apps don’t work with Announce Messages with Siri, which allows you to listen to, dictate, and reply to messages hands-free when wearing AirPods connected to an iPhone or cellular Apple Watch. In the UK, where WhatsApp is more popular than Apple’s iMessage, there is no Announce Messages with Siri integration with WhatsApp. Apple provides access to the feature for messaging app developers through an API, and it is up to the developers whether they want to take advantage.
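For developers, the hook, as I understand it, is SiriKit’s messaging intents. Below is a minimal sketch of the reply path a third-party app implements so Siri can take a dictated message and hand it to the app; the handler class name and the commented-out sendToService helper are hypothetical placeholders, and real adoption involves an Intents extension plus more resolution methods than shown here.

```swift
import Intents

// Minimal sketch of a SiriKit messaging handler, the mechanism (as I
// understand it) a third-party app adopts so Siri can send its messages.
// The class name and sendToService(_:to:) helper are hypothetical.
class SendMessageHandler: NSObject, INSendMessageIntentHandling {

    func resolveContent(for intent: INSendMessageIntent,
                        with completion: @escaping (INStringResolutionResult) -> Void) {
        if let text = intent.content, !text.isEmpty {
            completion(.success(with: text))   // Siri captured the dictated reply
        } else {
            completion(.needsValue())          // ask the user what to say
        }
    }

    func handle(intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Hand the dictated text to the app's own messaging service here, e.g.:
        // sendToService(intent.content, to: intent.recipients) // hypothetical
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}
```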
Suffice to say, concerns about Facebook pushing an aggressive ‘privacy update’ on WhatsApp users aside, if WhatsApp and other messaging apps like Facebook Messenger integrated with Announce Messages with Siri, it would be of huge benefit to people with limited mobility. It would be great to both hear and respond to WhatsApp and Facebook messages on the go through your AirPods as they come in, in the same way you can with iMessages. Apple and Facebook, despite their row over app tracking, should get together to make this happen in the name of accessibility for disabled users. It may, in a small way, ease the antitrust pressures both companies face if they can demonstrate that they can open up and cooperate in the name of accessibility.
4) Security vs accessibility
Another problematic area is that dreaded announcement you receive via Siri on your iPhone when you ask the virtual assistant to read your messages: “you need to unlock your iPhone first”. I want to keep my iPhone locked with a passcode and Face ID, but I am unable to unlock my iPhone with my hands, so in this scenario Siri is unable to read out my new messages. If I turned off Face ID and kept my phone unlocked this wouldn’t happen, but I don’t want to do that, and I don’t want to show a preview of my messages on the lock screen either. This behaviour is about security, but surely there could be an override, with sufficient warnings, for accessibility purposes. I just want to be able to access and listen to all my messages (iMessages, emails, WhatsApp messages, Facebook Messenger messages) from a locked iPhone through my AirPods when I ask Siri to read them to me, or as they come in, whichever I set as a preference.
I am always having to weigh accessibility against security, but for me independence and accessibility are paramount, and I would like to be able to make that choice myself, rather than Apple saying: “no, you can’t do that”.
5) Apple Watch
It goes without saying that, as the owner of a cellular Apple Watch, which is, after all, a phone, I would like it to have the same Siri and AirPods functionality I have mentioned above: Siri to end a phone call, Siri to turn auto-answer on and off, and Siri to answer and decline phone calls. Hopefully, this is something Apple will address in watchOS 8 this year.
I have always felt Apple has failed to recognise the potential of the Apple Watch for people with severe upper-limb problems. As far back as 2017 I was drawing their attention to the problems you face if you can’t touch the screen or the watch controls, such as in this video. I even suggested they could use the inbuilt gyroscope and accelerometer to help.
Well, it seems the company has been listening. Today Apple announced AssistiveTouch for Apple Watch to help users with limited mobility. It says the new accessibility feature will allow users with upper-body limb differences to enjoy the benefits of Apple Watch without ever having to touch the display or controls. Using built-in motion sensors like the gyroscope and accelerometer, along with the optical heart rate sensor and on-device machine learning, Apple Watch will soon detect subtle differences in muscle movement and tendon activity, letting users navigate a cursor on the display through a series of hand gestures, like a pinch or a clench. AssistiveTouch on Apple Watch will enable customers who have limb differences to more easily answer incoming calls, control an onscreen motion pointer, and access Notification Center, Control Center, and more, the company said.
The feature is coming later this year, with software updates across all of Apple’s operating systems. It looks really transformative if it works for the groups of users it is intended for. As always with these announcements the proof will be in the pudding when the feature is released to the public.
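Apple hasn’t said how the gesture recognition works under the hood, beyond the sensors and on-device machine learning mentioned above. As a toy illustration only (my own sketch, in no way Apple’s implementation), this is roughly what reading the raw motion signal with Core Motion and treating a sharp acceleration spike as a crude stand-in for a gesture event might look like:

```swift
import CoreMotion

// Toy illustration only - not Apple's AssistiveTouch implementation.
// Reads raw accelerometer samples and treats a sharp spike in total
// acceleration as a crude stand-in for a hand-gesture event.
let motion = CMMotionManager()

func startGestureWatch(onGesture: @escaping () -> Void) {
    guard motion.isAccelerometerAvailable else { return }
    motion.accelerometerUpdateInterval = 1.0 / 50.0   // 50 Hz sampling

    motion.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        // Magnitude of acceleration in g; roughly 1.0 at rest (gravity).
        let magnitude = (a.x * a.x + a.y * a.y + a.z * a.z).squareRoot()
        if magnitude > 2.5 {       // arbitrary threshold for a sharp movement
            onGesture()
        }
    }
}

startGestureWatch { print("gesture-like spike detected") }
```

The real feature will be doing something far more sophisticated, fusing several sensors and a trained model, which is exactly why on-device machine learning matters here: simple thresholds like the one above can’t tell a pinch from a clench.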
6) AirPods automatic switching
Last September Apple issued an update for AirPods that allows them to automatically switch between your iPhone, iPad, and Mac depending on which device you want to listen to.
Whilst the feature is supposed to be convenient, it’s led to a lot of frustration among Apple users who find their AirPods randomly switching between Apple devices.
If you have a physical motor disability and can’t pick up your Apple devices to trigger automatic switching, it gets even more problematic. I own two iPhones, a MacBook, and an Apple Watch. I need my AirPods to connect to one particular device by default and ignore the others, but I also want to be able to connect to the other devices sometimes. For people who can use their hands, the new automatic AirPods switching is a good solution. But I can’t make use of the feature because I can’t physically pick up any of my devices to trigger it. What I would like to be able to do instead is ask Siri to connect my AirPods to the Apple device I want at any given time.
To offer up a real-world scenario: I could be using my MacBook Pro, or one of my iPhones, with AirPods connected, but then I may want to leave home, go out for a walk, and be connected to the iPhone I store in my wheelchair pocket. The only way to get my AirPods to connect to that iPhone is to physically pick it up and wake the screen. I can’t do this, so I can’t be spontaneous and leave home 100 per cent sure that my AirPods will connect to the iPhone stored in my wheelchair pocket.
It would be really useful to be able to ask Siri to connect to my second iPhone, or whatever I choose to call it, for example: “Hey Siri, connect AirPods to Colin’s home iPhone”, “Hey Siri, connect AirPods to Colin’s second iPhone”, or “Hey Siri, connect AirPods to Colin’s Apple Watch”.
7) Voice Control needs to get smarter
Apple’s dedicated accessibility speech recognition app has disappointed since its launch to much fanfare at WWDC 2019. The company appears to have invested little in Voice Control, and there aren’t any exciting new features of note.
Dictation accuracy needs improving, as do its editing capabilities, particularly for long-form dictation. Sophistication has to go beyond dictating short messages like “I’ll be home in 10 minutes” with a cute emoji. Disabled people need more from speech recognition applications, both for education and employment purposes and for keeping in touch with friends and family in the online world we all rely on so much these days.
There are still bugs in Voice Control dictation. If you pause, even for a split second, Voice Control capitalises the next word for no reason. This happens quite frequently in a paragraph of dictation and always has done. When you correct a word, the numbered list of alternative words that comes up rarely includes the word you are looking for. I don’t think much clever machine learning is being used. Voice Control dictation doesn’t learn from your mistakes either, so the same mistakes keep happening over and over again, which makes it unproductive to use.
The accuracy of Voice Control dictation varies depending on which text box you are using. For example, I have found it to be most accurate in the iMessage text box, both on iPhone and Mac. It is much less accurate in the Mail application, particularly on the Mac, and in web text boxes such as Google’s and WhatsApp’s in Safari the accuracy can be very poor. Accuracy should be the same across all text boxes on all Apple devices and operating systems. Finally, you should be able to train words, so they are always recognised the way you pronounce them.
Speech recognition in Voice Control doesn’t come close to what Nuance offers Windows users with its Dragon products, which is ironic given that Nuance had to drop its Apple voice dictation product in 2018, allegedly because of the way Apple controls API access to its platforms.
Try writing a long email to a loved one, running a business, writing a book, campaigning, or doing journalism when you can’t use a keyboard and rely 100 per cent on Voice Control to recognise your voice and put your words accurately on the screen. It isn’t up to the job at the moment and hasn’t been for the past two years.
Soon after it launched, I feared Voice Control would become a bit of a ghetto: a specialist accessibility application that receives next to no investment and isn’t updated very often. I think Apple would serve all its customers better if it made Voice Control a mainstream, inclusive speech recognition application powered by the improving Siri speech engine. I am sure lots of general users would be interested in it. There is a need for high-quality speech recognition for all sorts of reasons: RSI, dyslexia, physical motor disability, and, who knows, perhaps long Covid too.
Rather than building accessibility ghettos or silos, Apple should push the philosophy of inclusive design I mentioned earlier in relation to the answer calls feature released last week.
Why have both Siri and Voice Control? Why not make Siri more powerful by including all the Voice Control features? In that way, Apple would have one voice technology application to manage, and because Siri is mainstream, it might get more attention than a feature used by a small number of Apple customers.
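The raw material for this is already there: Apple ships a general-purpose Speech framework that any developer can call today, with support for on-device recognition. A minimal sketch of transcribing a recording with it (the file path is a placeholder, and I have assumed a British English locale):

```swift
import Speech

// Minimal sketch: transcribe an audio file with Apple's Speech framework.
// The recording path is a placeholder; recognition can run on-device.
SFSpeechRecognizer.requestAuthorization { status in
    guard status == .authorized else { return }

    let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en_GB"))
    let request = SFSpeechURLRecognitionRequest(
        url: URL(fileURLWithPath: "/path/to/recording.m4a")) // placeholder path
    request.requiresOnDeviceRecognition = true  // keep the audio off the network

    recognizer?.recognitionTask(with: request) { result, _ in
        if let result = result, result.isFinal {
            print(result.bestTranscription.formattedString)
        }
    }
}
```

If a few lines like these already give developers dictation, the gap between this building block and a first-class, system-wide Voice Control is investment, not technology.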
There is one small positive development with Voice Control to note. Earlier this week, on his YouTube channel, Aaron Zollo showed how Apple is testing the ability to unlock your iPhone with just your voice. The feature is still in beta so isn’t available to the public yet. It relies on having your iPhone on display, and I think it would be quicker and easier just to say “Hey Siri, unlock my iPhone”. This approach would rely on making Siri recognise the user’s voice and no one else’s. I think Google has tried this in the past, and the HomePod mini has it to an extent with multi-user access; my HomePod is often asking me who is speaking to verify it is me. I think Apple should have the same for iPhone voice unlock. Even if it isn’t as secure as Face ID, I would accept it and take the risk for accessibility reasons and independence.
8) iPhone Upgrade Programme
Whilst not WWDC-related (although I am sure Apple will use the occasion to talk about what it is doing in the area of disability and accessibility), it’s not only the technology inside devices where Apple needs to be more disability-aware. It also needs to stop discriminating against severely disabled people who try to enrol or upgrade on its iPhone Upgrade Programme in the UK.
The iPhone Upgrade Programme is a credit scheme, run with Apple’s finance partner Barclays Partner Finance, that allows customers to spread the cost of a new iPhone over two years, with the right to trade in your iPhone each year for the latest and greatest when it is released. Apple says the scheme is available “in store only”, but some severely disabled people who are housebound, or can’t use transport, can’t attend Apple stores in person even if the stores are accessible.
For the last two years I have faced unreasonable delays, and point-blank refusals, when I tried to enrol and upgrade on the iPhone Upgrade Programme and asked for alternative arrangements to a store visit, such as sending in a carer on my behalf.
In 2019 and 2020, at the eleventh hour, faced with threats of legal action under UK equality laws, Apple and Barclays Partner Finance backed down and reluctantly allowed me to enrol and upgrade by sending a nominated person to my local Apple store on Regent Street in London with the necessary documents. On each occasion it took several weeks of lobbying to reach that point and I wasn’t able to get my new iPhone on release day as many diehard Apple fans like to do. I had to make do with getting my hands on it several weeks after release.
Things took a more concerning turn last October when Apple’s Executive Relations in the Republic of Ireland forced me into agreeing to a statement confirming I wouldn’t try to upgrade under the programme this year, in return for what they said was a one-off upgrade last autumn. The California tech giant might as well have stuck a large sign above the door of its London Regent Street store saying “no disabled people allowed”. Of course, both companies are perfectly capable of making reasonable adjustments for disabled people who can’t attend Apple stores in person, but they are reluctant to do so. At the moment I just don’t know if I will be able to upgrade to the iPhone 13 Pro when it is released this autumn.
It seems bizarre that Apple in the UK is insisting on store visits when most businesses during this past Covid year have moved online. Apple is a big tech company; surely they’ve heard of electronic signatures and Zoom. Apple doesn’t require a store visit to enrol or renew on the iPhone Upgrade Programme in the USA, and it’s a mystery why they do in the UK, despite being informed as far back as 2019 of the problems the policy causes disabled people who can’t attend stores. While they insist on this requirement, and behave the way they do when you ask for an adjustment on the grounds of disability, the Cupertino company runs the risk of being called out for discrimination, and of having its practices investigated by the authorities.
I have made a complaint to the UK Financial Ombudsman, which has confirmed the complaint meets the test to warrant assigning an investigator, and has launched a formal investigation into both Apple and Barclays Partner Finance. Hopefully, this will bring about much-needed change and the California tech giant will think more inclusively about all its customers, wherever they are in the world.
Summing up
It has to be noted that Apple does more than most in the technology industry when it comes to ensuring their devices and software are accessible to the one billion people in the world living with a disability, whether that affects mobility, hearing, or vision. I applaud this, and every day I make use of the tech they have produced to make my life easier and more productive.
However, reforming the iPhone Upgrade Programme in the UK and ensuring their business practices and policies are accessible and inclusive to all is a must in the coming year.
The on-device software and hardware accessibility improvements I have just highlighted, making it easier than ever for everyone to control their Apple devices with just their voice, will, I believe, prove popular with a lot of people generally. It will also mean that any improvements the company decides to make in the coming weeks are inclusive and accessible to everyone.
This is exactly how it should be these days. Apple devices should be for everyone.