Apple reveal Music Haptics for feeling songs
To make music more accessible for everyone, Apple are adding Music Haptics, a feature that makes songs come alive to the touch.
Apple have revealed a slew of new accessibility features that make their devices more usable for people with physical disabilities. Apple say the new features “combine the power of Apple hardware and software, harnessing Apple silicon, artificial intelligence, and machine learning to further Apple’s decades-long commitment to designing products for everyone”.
Amongst the new features is Music Haptics, designed to let people who are deaf or hard of hearing experience music in a new way. With the setting turned on, users will be able to feel “taps, textures, and refined vibrations to the audio of the music” on their iPhone.
According to Apple: “Music Haptics works across millions of songs in the Apple Music catalog, and will be available as an API for developers to make music more accessible in their apps.”
Apple have also introduced Eye Tracking, which allows users to control iOS devices without touch. Vocal Shortcuts let users perform tasks with custom sounds, whilst Vehicle Motion Cues have been added to reduce motion sickness for passengers in a moving vehicle.
Apple CEO Tim Cook says: “We believe deeply in the transformative power of innovation to enrich lives. That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”
Apple’s senior director of Global Accessibility Policy and Initiatives, Sarah Herrlinger, says: “Each year, we break new ground when it comes to accessibility. These new features will make an impact in the lives of a wide range of users, providing new ways to communicate, control their devices, and move through the world.”