Apple's new accessibility features include Eye Tracking, Music Haptics, Vocal Shortcuts, and more
- by autobot
- May 17, 2024
Apple has announced a slew of new accessibility features coming later this year, including Eye Tracking, Music Haptics, Vehicle Motion Cues and more. The features were introduced ahead of Global Accessibility Awareness Day and make use of artificial intelligence and on-device machine learning to make devices more accessible for users with disabilities.

Powered by artificial intelligence, Eye Tracking uses the front-facing camera of iPhones and iPads to set up and calibrate. Apple claims the feature will work across iPadOS and iOS apps without requiring additional hardware or accessories. Apple also demonstrated Dwell Control, which lets users activate and interact with elements of an app using only their eyes. Eye Tracking relies on on-device machine learning, and all data used to set up and control the feature is kept on the device.

Music Haptics is a new way for users who are deaf or hard of hearing to experience music on iPhone. The feature leverages the iPhone's Taptic Engine to play taps, textures, and refined vibrations in time with the audio of a song. Music Haptics will work across millions of songs in the Apple Music catalogue and will also be available as an API, so developers can make music more accessible in their own apps (a code sketch of this kind of haptic playback appears at the end of this article).

Vocal Shortcuts let users assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks, while Listen for Atypical Speech enhances speech recognition for users with speech difficulties. With these features, users can utter a phrase such as “how hot” and Siri will launch a relevant app, such as Weather, to show the temperature and humidity. These features build upon existing accessibility options, providing greater customisation and control.

If you tend to suffer from motion sickness when using your devices in a moving vehicle, it's likely because of a sensory conflict between what you see and what you feel. With Vehicle Motion Cues, animated dots on the edges of the screen represent changes in vehicle motion to help reduce that sensory conflict without interfering with the main content. The feature uses sensors (likely the accelerometer and gyroscope) built into the iPhone and iPad to recognise when a user is in a moving vehicle and respond accordingly (see the motion-detection sketch at the end of this article). Vehicle Motion Cues can be set to show automatically on the iPhone or can be toggled in the Control Centre.

CarPlay is also gaining Voice Control, which will let users navigate CarPlay and control apps with just their voice. Sound Recognition alerts drivers or passengers who are deaf or hard of hearing to car horns and sirens. Colour Filters make the CarPlay interface more accessible for users with colour vision deficiency, alongside additional visual accessibility features such as Bold Text and Large Text.

visionOS will introduce systemwide Live Captions, enabling users who are deaf or hard of hearing to follow spoken dialogue in live conversations and audio from apps. Live Captions for FaceTime in visionOS will make it easier for users to connect and collaborate using their Persona. Apple Vision Pro will also let users move captions with the window bar during Apple Immersive Video, and will add support for additional Made for iPhone hearing devices.

Apple also announced a range of smaller accessibility updates. Though Apple has only said that these features will be released later this year, they will likely arrive as part of iOS 18 and iPadOS 18, which are expected to be announced at WWDC. To commemorate Global Accessibility Awareness Day, select Apple Stores will host free accessibility sessions, and Today at Apple group reservations are available for community groups to learn about the accessibility features together. For more information, see Apple's announcement.
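Apple has not yet detailed the Music Haptics developer API, but the existing Core Haptics framework already lets apps drive the Taptic Engine with taps and textures. Here is a minimal sketch of audio-synchronised haptic playback under that assumption; the `HapticBeatPlayer` type and the beat timings are hypothetical placeholders, not anything derived from Apple's implementation.

```swift
import CoreHaptics

// A minimal sketch of audio-synchronised haptics using Core Haptics.
// This is NOT Apple's Music Haptics API, which has not been published;
// it only shows how taps can be scheduled against a song's timeline.
final class HapticBeatPlayer {
    private var engine: CHHapticEngine?

    init() throws {
        // Haptics are only available on devices with a Taptic Engine.
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try CHHapticEngine()
        try engine?.start()
    }

    /// Plays a short, sharp "tap" at each beat timestamp (in seconds).
    func playTaps(atBeats beats: [TimeInterval]) throws {
        let events = beats.map { time in
            CHHapticEvent(
                eventType: .hapticTransient,
                parameters: [
                    CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                    CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8),
                ],
                relativeTime: time
            )
        }
        let pattern = try CHHapticPattern(events: events, parameters: [])
        let player = try engine?.makePlayer(with: pattern)
        try player?.start(atTime: CHHapticTimeImmediate)
    }
}

// Hypothetical usage: tap along with the first four beats of a 120 BPM track.
do {
    let player = try HapticBeatPlayer()
    try player.playTaps(atBeats: [0.0, 0.5, 1.0, 1.5])
} catch {
    print("Haptics unavailable: \(error)")
}
```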
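Apple likewise hasn't said exactly how Vehicle Motion Cues decides you are riding in a vehicle, but Core Motion's public activity classification offers a plausible stand-in. A minimal sketch, assuming `CMMotionActivityManager`'s `automotive` flag approximates whatever signal Apple uses internally (the `VehicleMotionDetector` wrapper is invented for illustration, and a real app would need the `NSMotionUsageDescription` Info.plist entry):

```swift
import CoreMotion

// A sketch of detecting that the user is riding in a vehicle, using the
// public CMMotionActivityManager API. Apple's Vehicle Motion Cues likely
// combines this kind of signal with raw accelerometer/gyroscope data.
final class VehicleMotionDetector {
    private let activityManager = CMMotionActivityManager()

    /// Calls the handler whenever the device transitions in or out of
    /// automotive motion, as classified by Core Motion.
    func startMonitoring(onChange: @escaping (Bool) -> Void) {
        guard CMMotionActivityManager.isActivityAvailable() else { return }
        var inVehicle = false
        activityManager.startActivityUpdates(to: .main) { activity in
            guard let activity = activity else { return }
            // Filter out low-confidence estimates to avoid flapping.
            let nowInVehicle = activity.automotive && activity.confidence != .low
            if nowInVehicle != inVehicle {
                inVehicle = nowInVehicle
                onChange(nowInVehicle)
            }
        }
    }

    func stopMonitoring() {
        activityManager.stopActivityUpdates()
    }
}

// Hypothetical usage: a UI layer could show or hide motion cues here
// (e.g. animated dots at the edges of the screen).
let detector = VehicleMotionDetector()
detector.startMonitoring { inVehicle in
    print(inVehicle ? "Show motion cues" : "Hide motion cues")
}
```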