It’s never been more important to design digital products for trust. Consumers are being asked to allow smart digital products such as voice assistants to manage more and more of their lives. But trust is nebulous to design for because trust is complicated. As products become smarter, designers need to learn how to build not only functional trust but emotional trust. In this two-part series, I’ll take a closer look at designing for emotional trust. Part One, below, articulates why emotional trust matters and shares an example of how to design for it.
We live at a time of consumer distrust. Consumers don’t trust big technology companies such as Facebook to respect our privacy. More often than not we are prompted for “strong passwords” so that we feel secure at a time when security breaches abound. And the term “dark web” is scarier and bigger than the latest “dark” UI craze sweeping our digital experiences. Trust can be difficult to build because people already carry their own defenses and distrust from past hurts, rejections, and deceptions.
Functional Distrust versus Emotional Distrust
Most of the stories we read in the news media about technology distrust focus on one kind of distrust: functional distrust. Facebook fails to manage our personal data properly. Smart voice assistants listen to our conversations because they’ve been designed to do so, and the tech companies allow workers to listen to snippets of our conversations because their policies allow them to do so.
The challenge of forging functional trust between companies and consumers in the digital age is not new. At the dawn of the digital age, the debate about functional trust centered on whether people could trust businesses such as Amazon to store their personal credit card data securely. The same debate resurfaced with the app economy. With the rise of Uber, the debate became more serious: could we trust ride-sharing services with our personal safety?
That debate comes in waves and usually gets addressed (sometimes with the involvement of the government) before another functional trust flashpoint arises. The issues of functional distrust I cite here will be addressed, too – indeed, they already are being addressed through heightened privacy laws and pressure from consumer watchdogs.
But with the advent of artificial intelligence, businesses need to build not only functional trust with people but also emotional trust. AI raises the stakes. It’s like a super powerful Infinity Stone from the Marvel universe, capable of doing great good and evil – or at least that’s how people perceive it. AI requires us to trust machines to teach our children, help doctors perform life-or-death surgeries, and drive our cars for us. AI wants to be part of our very existence in the home, on the road, and everywhere else. At the same time, AI inspires a widespread fear that robots might take our jobs and take over our lives.
People don’t think of AI as lovable. And that’s a problem rooted in emotional distrust, not functional distrust – and a problem that product designers need to help solve. But if designers can crack the code of emotional trust, the payoff is big. Smart, trustworthy products will encourage:
- Repeatability: people will come back to your product.
- Dependability: people will rely on your product to deliver consistently.
- Relatability: people will feel so connected to your product they will form a trusted relationship.
Let’s take a closer look at designing for emotional trust.
Signs of Progress
Machines are showing the potential to be more trustworthy. Thanks to advances such as natural language processing and text-to-speech, products are becoming more intelligent and more human. AI is gaining emotional intelligence. As a result, product developers are being challenged to design for machines with emotional intelligence instead of for dumb screens.
When products become intelligent and interact with people as people do (not as robots do), the person interacting with the machine can actually develop emotional trust. No longer is the interface a matter of issuing commands to a dumb screen. People can allow the machine to guide the conversation. That’s one reason Amazon and Google are injecting personality into AI-fueled voice assistants – witness the Samuel L. Jackson Alexa skill, which assumes the voice of the popular actor. Injecting personality into a voice skill may seem like a gimmicky thing to do, but it’s an important example of designing for emotional trust. The voice of a famous actor makes Alexa just a bit warmer and more approachable. We know that voice. Alexa feels a little less emotionally disconnected.
Designing for Emotional Trust
Designing to build emotional trust is within our grasp. For example, Moonshot recently developed Encyclopedia Britannica’s Guardians of History voice app. Guardians of History is a voice-activated immersive experience that takes 8-to-12-year-olds on a journey through time to solve thought-provoking challenges within ancient Greece. With the help of voice-activated prompts, kids solve challenges and learn about history. To pull off an experience like that, we had to think carefully about the emotional component of the story. Asking kids to interact with a device in a way that would make them enjoy learning was a big ask. We needed the voice-activated experience to impart the right kind of encouraging tone to keep kids engaged as they solved tasks. Using a robotic voice would not develop trust. We worked hard to get the narration just right, employing encouraging, upbeat voices with enough dramatic flair to keep the kids engaged.
The end result is a minimum lovable product focused on relatability over repeatability and dependability. We felt it was important to relate to kids aged 8 to 12 before we executed on more robust plans under the Guardians of History banner. We did so back in February 2019 by spending extensive time with this demographic, understanding what they relate to today and what they want to relate to tomorrow. This process was eye-opening for us.
We learned that 8-to-12-year-olds are bored. They turn to voice as a means of entertainment – for example, the “I’m bored” feature on Google Home. Our challenge was to design a relationship that could transform boredom into a fun learning experience. This demographic is on the cusp of obtaining smartphones, so we needed to match or exceed the experience they can get from a smartphone screen. In short, we had to devise an experience that was just as good as the content they would get on a smartphone, but without the screen.
Evolving Design Thinking
Popular design approaches are capable of developing emotional trust if we push them. In my next post in this two-part series, I’ll take a closer look at why we need to evolve design thinking. If we can crack the code for building emotional trust, we’ll create products that build relationships through repeatability, dependability, and relatability.
Solving product and experience challenges around trust is not new for us. Our process, called FUEL (Future Unified Experience Lifecycle), was intentionally designed to push on trust because we recognize that trust is not assumed; it is earned. To learn more about our workshops and ways of working around designing for emotional trust, contact us.