Readers of Aestumanda may recall that I had my electric door opener automated with Amazon Alexa voice control just before the UK went into Covid-19 lockdown earlier this year. Because I have been shielding during the lockdown, I have had few opportunities to use the new functionality, which helps me get in and out of my flat unaided. I've only used it with my indoor Amazon Echo smart speakers when I have received the occasional home shopping delivery. Outside my flat in the street, where no indoor Amazon smart speakers are within reach, the solution relied on me wearing Amazon's wireless Echo Buds to summon Alexa to open the door, but they don't have deep integration with my Apple iPhone. For this reason Apple's AirPods were always my preferred option for controlling my electric door ...
It's a year since Apple first unveiled its flagship accessibility feature, Voice Control, at WWDC 2019. In an inspiring short film, the company showed Ian McKay, a disability advocate and outdoor enthusiast, using voice commands to control his Mac computer. Voice Control is a speech-recognition application now baked into Apple devices that offers physically disabled people, and indeed anyone who owns a Mac, iPhone or iPad, the ability to dictate, navigate and precisely control their device by voice commands alone. I've been trying to use the application over the past year and have been left feeling frustrated and disappointed. With the next version of macOS set to be unveiled later this month at WWDC 2020, here's why, and what Apple needs to do next to improve Voice Control. Please ...