Sonic Interaction Design
Interaction designers need to move beyond just visual interactions and fully embrace the world of sonic interactions.
User interfaces have traditionally been centered on visuals, with sonic and tactile cues relegated to a supporting role. But sonic interactions can provide a wholly distinct experience. Whether as a non-visual form of navigation [1] or as a way to monitor patient health [2], sound offers an effective form of interaction that can move interfaces beyond the screen. To get there, interaction designers must treat sound as a first-class medium rather than an afterthought.
The growing market for smart speakers and voice assistants [3], along with the rise of podcasts, indicates a clear market for “audio-only” interfaces. But where voice assistants are speech based, audio interfaces can be so much more. Earcons and other non-speech sounds can be more effective at conveying information [2], and, more importantly, they can carry multiple layers of data and information without interfering with other interactions [4].
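To make the idea concrete, here is a minimal sketch, not drawn from any of the cited systems, of two hypothetical earcons built with Python's standard library: short non-speech motifs whose meaning lives in the pitch contour rather than in words. The file names, note choices, and durations are illustrative assumptions.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100  # mono, CD-quality

def tone(freq_hz, dur_s, volume=0.5):
    """Generate one sine-wave tone as a list of float samples."""
    n = int(SAMPLE_RATE * dur_s)
    return [volume * math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE)
            for t in range(n)]

# Two hypothetical earcons: a rising motif for "success" and a
# falling motif for "failure". The meaning is carried by the
# pitch contour, not by any spoken words.
success = tone(523.25, 0.12) + tone(783.99, 0.18)  # C5 -> G5
failure = tone(523.25, 0.12) + tone(349.23, 0.18)  # C5 -> F4

def write_wav(path, samples):
    """Write mono 16-bit PCM samples to a WAV file."""
    with wave.open(path, "w") as f:
        f.setnchannels(1)
        f.setsampwidth(2)
        f.setframerate(SAMPLE_RATE)
        f.writeframes(b"".join(
            struct.pack("<h", int(max(-1.0, min(1.0, s)) * 32767))
            for s in samples))

write_wav("earcon_success.wav", success)
write_wav("earcon_failure.wav", failure)
```

Because the two cues share a first note, a listener can tell they belong to the same family while still distinguishing the outcomes; this kind of structured, non-speech vocabulary is what earcons provide.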
Sonic interactions also present an opportunity for brands to differentiate themselves from the competition. While jingles and radio spots have been around for more than a century, brands are increasingly adopting musical logos [5]. And as more electric and automated vehicles enter the world’s fleets, sonic interactions will become more relevant, both as warnings and as a means of interaction [6].
To grow the space, we need to develop a shared language between interaction designers and sound designers. While it may be easy to map data points to pitch and volume, parameters like timbre, instrumentation, and musicality are more subjective and contextual. Adding elements like spatial audio adds complexity, but can also provide an effective means of interaction [1]. To use these parameters effectively (or to create entirely new ones) and to invent new forms of interaction, designers need to learn the rules and craft of sound design, and to develop ways of wireframing and prototyping interfaces purely in sound.
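As a sketch of what such a parameter mapping might look like in practice, the snippet below sonifies a made-up series of normalized readings: each value drives pitch and loudness, while stereo position (a crude stand-in for spatial audio) encodes where the value falls in the series. The data, ranges, and equal-power pan law are illustrative assumptions, not a standard.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100

def map_range(x, lo, hi, out_lo, out_hi):
    """Linearly map x from [lo, hi] into [out_lo, out_hi]."""
    return out_lo + (x - lo) / (hi - lo) * (out_hi - out_lo)

# Hypothetical data stream, e.g. normalized sensor readings in [0, 1].
readings = [0.1, 0.3, 0.8, 0.5, 0.95, 0.2]

frames = []
for i, r in enumerate(readings):
    # Pitch and loudness carry the value; stereo position carries a
    # second, independent dimension (here: position in the series).
    freq = map_range(r, 0.0, 1.0, 220.0, 880.0)         # A3..A5
    volume = map_range(r, 0.0, 1.0, 0.2, 0.8)
    pan = map_range(i, 0, len(readings) - 1, 0.0, 1.0)  # left -> right
    for t in range(int(SAMPLE_RATE * 0.25)):            # 250 ms per value
        s = volume * math.sin(2 * math.pi * freq * t / SAMPLE_RATE)
        left = s * math.cos(pan * math.pi / 2)          # equal-power pan
        right = s * math.sin(pan * math.pi / 2)
        frames.append(struct.pack("<hh", int(left * 32767),
                                  int(right * 32767)))

with wave.open("sonified.wav", "w") as f:
    f.setnchannels(2)   # stereo, 16-bit PCM
    f.setsampwidth(2)
    f.setframerate(SAMPLE_RATE)
    f.writeframes(b"".join(frames))
```

Even this toy mapping surfaces the design questions raised above: the pitch range, the loudness floor, and the pan law are all musical and contextual choices, which is exactly where a shared vocabulary between interaction designers and sound designers is needed.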
Sonic interactions are not without their drawbacks. They are, by nature, exclusionary to people with hearing disabilities. Poorly designed interactions can lead to ear fatigue, or even tinnitus and hearing loss. And adding more sounds to an already noisy environment can be distracting [7]. As with visual interaction design, understanding the users and their context, and providing fail-safes and fallbacks, will be required. There will continue to be situations and contexts where visual interfaces take priority. But the ideal interface is almost always multi-modal [8], and makes the best possible use of the strengths of each mode.
Interaction design has, for the most part, been built around visual interfaces. That makes sense: vision is the primary way humans perceive the world. But the most effective forms of interaction are multi-modal [8], and embracing sound as a design material, and learning how to create sonic interactions, will only lead to better interaction design.
Sources & Citations:
[1] Mashiba, Iwaoka, R., Bilal Salih, H. E., Kawamoto, M., Wakatsuki, N., Mizutani, K., & Zempo, K. (2020). Spot-Presentation of Stereophonic Earcons to Assist Navigation for the Visually Impaired. Multimodal Technologies and Interaction, 4(3), 42. https://doi.org/10.3390/mti4030042
[2] Hickling, Brecknell, B., Loeb, R. G., & Sanderson, P. (2017). Using a Sequence of Earcons to Monitor Multiple Simulated Patients. Human Factors, 59(2), 268–288. https://doi.org/10.1177/0018720816670986
[3] PwC. Consumer Intelligence Series: Prepare for the voice revolution. https://www.pwc.com/us/en/advisory-services/publications/consumer-intelligence-series/voice-assistants.pdf
[4] Bonebright, & Nees, M. A. (2009). Most earcons do not interfere with spoken passage comprehension. Applied Cognitive Psychology, 23(3), 431–445. https://doi.org/10.1002/acp.1457
[5] Way, L. C. S., McKerrell, S., & Bouissac, P. Music as Multimodal Discourse, p. 120.
[6] Beattie, Baillie, L., & Halvey, M. (2017). Exploring How Drivers Perceive Spatial Earcons in Automated Vehicles.
[7] Wong, M., Mabuyi, A., & Gonzalez, B. First National Survey of Patient-Controlled Analgesia Practices. Physician-Patient Alliance for Health & Safety.
[8] Kaber, Wright, M. C., & Sheik-Nainar, M. A. (2006). Investigation of multi-modal interface features for adaptive automation of a human–robot system.