Ebook: Designing Across Senses: A Multimodal Approach to Product Design
Author: Christine W. Park, John Alderman
- Genre: Art // Design: Architecture
- Year: 2018
- Publisher: O’Reilly Media
- Language: English
- epub
What Is This Book About?
From the keyboard, mouse, and touchscreen, to voice-enabled assistants and virtual reality, we have never had more ways to interact with technology. Called modes, they allow people to enter input and receive output from their devices. These inputs and outputs are often designed together in sets to create cohesive user interfaces (UIs). These modes reflect the way our senses, cognitive functions, and motor skills also work together in sets called modalities. Human modalities have existed for far longer than our interface modes, and they enable us to interact with the physical world. Our devices are only beginning to catch up to us. We can now jump and move around in our living rooms to play a game using Microsoft’s motion-tracking peripheral, Kinect. We can ask Domino’s to deliver a pizza using the Amazon Echo.
We often use several modalities together in our daily activities, and when our devices can do the same, they are considered multimodal UIs. Most UIs are already multimodal, but because they’re so familiar we don’t tend to think of them that way. In fact, almost all designed products and environments are multimodal. We see a door and knock on it, waiting for it to open or to hear someone inside ask who it is. We use our fingers to type on a keyboard, and see characters appear on the screen in front of our eyes. We ask Siri a question and see the oscilloscope-like waveform to let us know we are being heard. We receive a phone call and feel the vibration, hear the ringtone, and see the name of the person on the screen in front of us. We play a video game and are immersed in sensory information from the screen, speakers, and the rumble shock controller in our hands.
Multimodal products blend different interface modes together cohesively. They allow us to experience technology the same way we experience our everyday lives: across our senses. Good multimodal design helps us stay focused on what we are doing. Bad multimodal design distracts us with clumsy or disjointed interactions and irrelevant information. It pulls us out of our personal experience in ways that are at best irritating and at worst dangerous.
As technology is incorporated into more contexts and activities in our lives, new types of interfaces are rapidly emerging. Product designers and their teams are challenged to blend modalities in new combinations for new products in emerging categories. They are being asked to add new modalities to the growing number of devices we use every day. This book provides these teams with an approach to designing multimodal interactions. It describes the human factors of multimodal experiences, starting with the senses and how we use them to interact with both physical and digital information. The book then explores the opportunities represented by different kinds of multimodal products and the methodologies for developing and designing them. Following this approach will help you develop multimodal experiences for your users and deliver successful products that earn trust, fulfill needs, and inspire delight.
Who Should Read This Book
This book is for designers who are developing or transforming products with new interface modes, and those who want to. It will extend knowledge, skills, and process past screen-based approaches, and into the next wave of devices and technologies. The book also helps teams that are integrating products across broader physical environments and behaviors. The senses and cognition are the foundation of all human experience, and understanding them will help blend physical and digital contexts and activities successfully. Ultimately, it is for anyone who wants to create better products and services. Our senses are the gateway to the richness, variety, delight, and meaning in our lives. Knowing how they work is key to delivering great experiences.