By Alexandra Grossi
Since their inception, hearing devices have been designed to hide the user’s inability to hear. Glasses, on the other hand, went from being solely a vision-correcting device to being a fashion statement. Graham Pullin’s book, Design Meets Disability, challenges the way we design for disability and argues that design can help fight stigmas. Pullin questions why hearing devices can’t go down the same path as glasses. Rather than focusing on overturning stigma associated with cochlear implant (CI) product design, my design explorations use Pullin’s concepts to make the user experience of the CI Remote Assistant closer to the intuitive hearing experience of a normally hearing person. Then I take the project a step further to explore how the CI user experience might incorporate intuitive “super hearing” abilities that people with normal hearing do not have.
Cochlear implants (CIs) are electronic devices that replace the function of a damaged inner ear in people with profound hearing loss. The latest iteration of the CI, offered by Cochlear Americas, is the Nucleus 6, which has an option for a “Remote Assistant” that lets users adjust settings wirelessly. The Remote Assistant is meant to give users greater control over their sound settings. Instead, users report that its user interface is difficult to navigate and time-consuming to operate.
In describing what makes clear interaction design, Gillian Crampton Smith lists the following:
- It offers users a clear mental model
- It gives reassuring feedback
- It is easy to navigate
- It is consistent (Moggridge, 2007).
Judging the usability of the Remote Assistant against the qualities Crampton Smith outlines, the design offers neither a clear mental model nor easy navigation. Instead, the onus is on the user to learn and memorize the functions of the Remote Assistant. The Remote Assistant uses linear navigation that requires users to click through many screens to adjust volume or sensitivity. Linear navigation is problematic because it is difficult for users to know “where they are” unless they have memorized the order of the screens. The icons and language used in the interface do not clearly communicate their functions. This obfuscation of functions violates Don Norman’s central characteristics of good design: discoverability and understanding (Norman, 2013).
For a bilateral CI user (a CI recipient who has CIs in both ears), reaching the screen that changes the volume of the left ear takes seven steps. Adjusting the sensitivity of the left ear takes ten. A user with a microphone accessory can control the volume of the microphone input, but doing so takes fourteen steps.
My explorations addressing the usability of the current UI center on the concept of a UI that lives across many connected devices. These mini-design explorations focus on the two most commonly changed variables: volume and sensitivity.
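To make the contrast with the Remote Assistant’s linear navigation concrete, here is a minimal TypeScript sketch of what a flat, directly addressable settings model might look like. Everything in it is illustrative: the names, the value ranges, and the API are my assumptions, not Cochlear’s.

```typescript
// Hypothetical sketch: a flat settings model that exposes the most
// commonly changed variables directly, instead of burying them behind
// a linear sequence of screens. All names and ranges are illustrative.

type Ear = "left" | "right";

interface EarSettings {
  volume: number;      // illustrative range, e.g. 0-10
  sensitivity: number; // microphone sensitivity, e.g. 0-20
}

class HearingProfile {
  private ears: Record<Ear, EarSettings> = {
    left: { volume: 6, sensitivity: 12 },
    right: { volume: 6, sensitivity: 12 },
  };

  // One call replaces the seven-step navigation: the ear and the
  // variable are addressed directly rather than reached sequentially.
  set(ear: Ear, key: keyof EarSettings, value: number): void {
    this.ears[ear][key] = value;
  }

  get(ear: Ear): EarSettings {
    return { ...this.ears[ear] };
  }
}

// Usage: adjust the left ear's volume in a single step.
const profile = new HearingProfile();
profile.set("left", "volume", 7);
```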
In looking at ways to expand what a hearing device is supposed to do, I went a step further to explore elements of “super hearing.” What if the user could block out the sound of a blender and make a friend’s voice louder? I explored the concept of a noise-blocking feature that lets users visualize their soundscape and select sounds they would like to either block or enhance.
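As a sketch of how that selection might be modeled, the TypeScript below treats each identified sound source as an object carrying a user-adjustable gain. How sources are detected and separated is assumed to happen upstream and is out of scope; all names here are hypothetical.

```typescript
// Hypothetical sketch of the noise-blocking concept: each identified
// sound source in the user's soundscape carries a gain the user can
// turn down (block) or up (enhance).

interface SoundSource {
  id: string;
  label: string; // e.g. "blender", "friend's voice"
  gain: number;  // 0 = fully blocked, 1 = unchanged, >1 = enhanced
}

class Soundscape {
  constructor(private sources: SoundSource[]) {}

  block(id: string): void {
    this.adjust(id, 0);
  }

  enhance(id: string, gain = 1.5): void {
    this.adjust(id, gain);
  }

  private adjust(id: string, gain: number): void {
    const source = this.sources.find((s) => s.id === id);
    if (source) source.gain = gain;
  }
}

// Usage: silence the blender, bring a friend's voice forward.
const scape = new Soundscape([
  { id: "src-1", label: "blender", gain: 1 },
  { id: "src-2", label: "friend's voice", gain: 1 },
]);
scape.block("src-1");
scape.enhance("src-2");
```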
In my research, I conducted interviews with many hearing device users and identified shared struggles for inspiration. My interviews revealed that even with this incredible technology, it is difficult for users to follow what is being said in a crowd or in a noisy environment. Scriber is a feature that uses voice recognition software to offer users a real-time transcription of the world around them that can also be saved for later review.
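One way Scriber could be prototyped today is with the browser’s Web Speech API. The sketch below is a rough TypeScript prototype under that assumption; the Scriber class and its structure are mine, and since TypeScript’s standard DOM typings omit the recognition interface, it is declared loosely here.

```typescript
// Hypothetical sketch of Scriber's transcription loop using the
// browser Web Speech API. The API is real; the surrounding structure
// is illustrative.

declare const webkitSpeechRecognition: { new (): any };

class Scriber {
  private transcript: string[] = []; // the saved transcription
  private recognition = new webkitSpeechRecognition();

  start(onLine: (line: string) => void): void {
    this.recognition.continuous = true;     // keep listening
    this.recognition.interimResults = true; // words arrive as spoken

    this.recognition.onresult = (event: any) => {
      for (let i = event.resultIndex; i < event.results.length; i++) {
        const result = event.results[i];
        if (result.isFinal) {
          const line = result[0].transcript.trim();
          this.transcript.push(line); // save for later review
          onLine(line);               // display in real time
        }
      }
    };
    this.recognition.start();
  }

  save(): string {
    return this.transcript.join("\n");
  }
}
```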
David Rose’s Enchanted Objects examines the possibilities that open up for objects once they are connected to the internet. A connected UI would enable a contextual interface that users can program. Over time, the UI could learn the user’s behaviors and set off sequences that automate functions; I call these “routines.” Users can set up conditions, such as a time of day or a location, that set a predesigned routine into motion. For example, my persona, Max, is a 24-year-old programmer who lives in a big, noisy city. Max likes to “warm up” his ears in the morning by playing a playlist that slowly rises in volume and then crossfades to environmental sounds, so that by the time he reaches his office, he has completed the transition.
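A routine is straightforward to model as data: a trigger condition paired with a timed sequence of setting changes. The TypeScript sketch below encodes Max’s morning warm-up under that assumption; the trigger kinds, action names, and timings are all illustrative, not a real device API.

```typescript
// Hypothetical sketch of a "routine": a trigger condition (time of day
// or location) paired with a sequence of timed setting changes.

type Trigger =
  | { kind: "time"; hour: number; minute: number }
  | { kind: "location"; lat: number; lon: number; radiusMeters: number };

interface Step {
  afterSeconds: number; // delay from the previous step
  action: "setPlaylistVolume" | "crossfadeToEnvironment";
  value?: number;
}

interface Routine {
  name: string;
  trigger: Trigger;
  steps: Step[];
}

// Max's routine: the playlist rises in volume, then crossfades to
// environmental sound by the time he reaches the office.
const morningWarmUp: Routine = {
  name: "Morning warm-up",
  trigger: { kind: "time", hour: 7, minute: 30 },
  steps: [
    { afterSeconds: 0, action: "setPlaylistVolume", value: 0.2 },
    { afterSeconds: 300, action: "setPlaylistVolume", value: 0.6 },
    { afterSeconds: 600, action: "crossfadeToEnvironment" },
  ],
};
```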
Older and disabled users have been disenfranchised by the very technology they rely on for their way of life. Including the user in the design process is key to producing a great user experience, and designers have the power to give users a stronger role in that process. By not taking a deeply user-centered approach, assistive technology companies are missing an opportunity for great innovation, one that could push progress on a cultural scale. Connecting users to the internet opens a world of possibilities: it would empower users by giving them a voice and the opportunity to think critically about the technology they depend on. My explorations in this research seek to push boundaries and to employ human-centered design. While I cover a wide range of topics and ideas, the possibilities in this field are limitless. Further investigations should put the user at the center of the design process and continue to dare to ask, “what if?” and “why not?”
References
Moggridge, B. (2007). Designing interactions. Cambridge, MA: MIT Press.
Norman, D. A. (2013). The design of everyday things. New York, NY: Basic Books.
Pullin, G. (2009). Design meets disability. Cambridge, MA: MIT Press.
Rose, D. (2015). Enchanted objects: Innovation, design, and the future of technology. New York, NY: Scribner.