Apple announced a variety of new software features designed for people with mobility, vision, hearing, and cognitive disabilities. These will launch later this year. The features include AssistiveTouch for Apple Watch and third-party eye-tracking hardware support for iPad. Furthermore, from May 20, customers will be able to communicate with Apple using American Sign Language (ASL) in the U.S., British Sign Language (BSL) in the UK, or French Sign Language (LSF) in France via a service called SignTime.
New Accessibility Features Coming Later This Year
Commenting on the announcement, Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, said:
At Apple, we’ve long felt that the world’s best technology should respond to everyone’s needs, and our teams work relentlessly to build accessibility into everything we make. With these new features, we’re pushing the boundaries of innovation with next-generation technologies that bring the fun and function of Apple technology to even more people — and we can’t wait to share them with our users.
AssistiveTouch uses an Apple Watch’s motion sensors, such as the gyroscope and accelerometer, as well as the optical heart rate sensor and on-device machine learning. These detect subtle differences in muscle movement and tendon activity, allowing users with limb differences to navigate a cursor on the display through gestures such as a pinch or a clench. It will provide access to Notification Center and Control Center, as well as the ability to answer incoming calls, amongst other things.
Other new features include an upgrade to the VoiceOver screen reader, and Apple is adding support for new bi-directional hearing aids. Furthermore, compatible MFi eye-tracking devices will be able to track where a person is looking onscreen, with the pointer moving to follow their gaze.