Article • 8 min read

Our technology is only as empathetic as we are

By Susan Lahey

Last updated December 8, 2020

I’ve been to SXSW Interactive six times. Maybe it’s just me, but for many years I came away with the sense that the technological revolution—like the Cultural Revolution or the French Revolution—was upon us, ready to sweep up everyone in its path. Resist at your own peril! To survive, humans would have to embrace a whole new paradigm of existence or be left behind. This year, however, was very different. The emphasis in 2018 was on how tech could better serve humanity, rather than drive us in a new direction. There were more than 40 sessions in which experts talked about the relationship between empathy and technology, and that’s just counting the ones with the word “empathy” in the title. Many other sessions addressed the same underlying theme: by creating tech to achieve commercial ends, without understanding how it would impact people, we’ve done great damage. It’s time to repair it.

In his keynote, for example, digital anthropologist Brian Solis quoted Tristan Harris, founder of the Center for Humane Technology, saying the designers of social media manipulated neuroscience to make us dependent on the tools they created: “They had the power of gods without the wisdom, prudence, and compassion of gods. They designed without actually understanding what the net effect of this would be.”

He was speaking to the way that social media has warped people’s sense of self-esteem and identity, making them depressed and lonely because they can’t live up to the fictitious lives they post or view online. He was also referring to the underlying problem of artificial intelligence (AI) and other technology being too often designed without thinking about all the people who are going to use it.

[Related read: How an automation-first strategy delivers better human support]

There are many examples of this:

  • In 2016, Microsoft released the chatbot Tay into the Twittersphere to learn to interact. “She” was apparently converted to Nazism overnight by her interactions with humans.
  • Most voice assistants on our phones struggle to understand people unless they’re white and from a region without a strong accent. Meaning: if you’re not one of those people, even your phone is dissing you.
  • In early 2018, MIT student Joy Buolamwini gave a TED talk about her research on facial recognition software. As a dark-skinned woman, she realized that whenever she sat in front of facial recognition software—she tried several versions—it didn’t register her face… at all. The software programs never had that problem with lighter-skinned students. Finally, she put on a light-skinned mask and voilà, the computer recognized her existence. (The sketch after this list shows how a per-group audit can expose this kind of failure.)
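
Buolamwini’s finding suggests a practice any team can adopt: measure a model’s accuracy for each demographic group separately rather than in aggregate. Here is a minimal sketch of such an audit in Python; the detect_face function and the labeled sample set are hypothetical stand-ins, not any real library’s API.

```python
# Minimal per-group audit sketch. `detect_face` and the labeled sample
# set are hypothetical stand-ins, not any real library's API.
from collections import defaultdict

def audit_detection(samples, detect_face):
    """samples: iterable of (image, group_label) pairs."""
    hits, totals = defaultdict(int), defaultdict(int)
    for image, group in samples:
        totals[group] += 1
        if detect_face(image):  # True if the software found a face
            hits[group] += 1
    # Detection rate per group: a wide gap between groups is the red
    # flag Buolamwini's mask experiment made visible.
    return {group: hits[group] / totals[group] for group in totals}
```

An aggregate score can look fine while one group’s rate sits near zero, which is exactly how these failures stay hidden.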

These are revealing outcomes and, in a hard about-face, many in the tech world want to ensure that virtual reality (VR), augmented reality (AR), AI, and other technologies are used far more consciously and empathetically in the future. For example, there’s a lot of interest in how VR can simulate experiences so we can learn from and understand the lives of others. There are even programs that take this a step further, allowing robots and other non-sentient entities, your refrigerator for example, to sense your emotions or levels of stress and respond accordingly.

[Related read: Leading with empathy: What you don’t say is just as important as what you do]

The question is, how much of this Utopian perspective will reach or be adopted by commercial organizations? Will the AI our businesses employ work equally hard to achieve empathy?

Empathy means: I know how it feels to be you

In the packed “From AI to EI, Empathy is the New Tech Superpower” session, panelist Danny Devriendt, a global leader in digital and social media and the predictive web in Europe, noted that AI has no conscience. It can only be empathic if the data sets fed into it teach it to be. As long as we feed the algorithm data sets that are biased toward the views and experiences of, say, young, white, straight males, that’s how the algorithm will interpret the world. If we want AI to have empathy for different kinds of people, he said, “We have to give the algorithms a good balance of the world. We are all different. We have different views on beauty. We have different views on religion. We have different views on sexuality. You have to feed that variety into the autonomic system because AI is going to behave as babies behave.”
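
A toy illustration (mine, not the panelist’s) makes the mechanism concrete: a model that merely reproduces the majority pattern in its training data can post a high overall accuracy while failing everyone the data underrepresents.

```python
# Toy illustration of how skewed training data becomes a skewed model.
# "Training" here is just remembering the most common answer -- a
# stand-in for how any statistical model tracks its data.

def train_majority(labels):
    return max(set(labels), key=labels.count)

# Feed the model a skewed world: 95 examples from group A, 5 from group B.
guess = train_majority(["group_a"] * 95 + ["group_b"] * 5)

# Tested on the same skewed mix, it looks 95 percent "accurate"...
test = ["group_a"] * 95 + ["group_b"] * 5
overall = sum(guess == t for t in test) / len(test)            # 0.95
# ...while being wrong for group B every single time.
group_b = sum(guess == t for t in test if t == "group_b") / 5  # 0.0

print(f"overall: {overall:.2f}, group B: {group_b:.2f}")
```

Balance the diet of examples and the answer changes, which is Devriendt’s point: the algorithm behaves like a baby, learning whatever world we show it.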

One example is the processing software for people seeking asylum, according to panelist Min Teo, a trustee of the organization Techfugees, which offers a platform for creating tech that helps refugees. She explained that older refugees tend to be neglected in favor of children, and young men are favored over women. There are logical reasons for this, but, she said, “We have to question our heuristics, how we program the code and make sure it’s not inherently filled with biases.”

Teo said the way to do that is to make sure that all kinds of people are involved in helping create the code.

“We always wanted to co-create with the refugees,” she said. “We didn’t want to build tech top-down for a faceless beneficiary. That means really understanding someone else’s point of view. Understanding the struggles they went through. Promoting diversity of thought will be part of the journey to remove biases.”
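
To make that concrete, here is a purely hypothetical sketch, not Techfugees’ code, of how the bias Teo describes can hide inside an innocent-looking triage heuristic, and what questioning it might look like.

```python
# Purely hypothetical sketch -- not Techfugees' code -- of bias hiding
# inside a triage heuristic.
from dataclasses import dataclass

@dataclass
class Case:
    age: int
    gender: str
    assessed_need: int  # e.g., 0-10 from a caseworker interview

def biased_priority(case: Case) -> int:
    # The pattern Teo describes, encoded as innocent-looking "logic":
    # the young outrank the old, and men get a bump over women.
    score = 10 - case.age // 10
    if case.gender == "male":
        score += 2
    return score

def need_based_priority(case: Case) -> int:
    # One way to answer her challenge: rank on assessed need alone,
    # with age and gender deliberately left out of the calculation.
    return case.assessed_need
```

Nothing in the first function announces itself as bias; it reads like ordinary business logic, which is why diverse builders and reviewers matter.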

[Related read: How the International Rescue Committee welcomes asylum seekers and refugees to the U.S.]

Generating and expressing empathy is difficult, even with people you already care about or who are part of “us.” It’s much harder with people you don’t care about or who are part of “them.” In one study about the neuroscience of empathy, APS Fellow Ying-yi Hong of the Chinese University of Hong Kong said people show more activity in their amygdalas—the emotion center of the brain—when watching someone of their own race being afraid or hurt than when it’s someone of a different race.

Hiring diverse teams leads to more empathic AI

It’s tough to design an AI program with a data set broad enough to be sensitive to the needs of all possible users—people of different ages, races, cultures, and socioeconomic backgrounds, people with disabilities, people who don’t speak English. The best way to get there is to make sure your team, and the people you test the software on, represent that kind of broad spectrum. That kind of diversity is rare on most development teams: there is a global talent shortage, and most companies struggle to find enough qualified developers of any background.

Another option is empathy training. Speakers in several SXSW sessions embraced the idea of mindfulness and compassion meditation; simply mustering compassion for others during meditation has been shown to measurably change neural pathways. Panelist Marie Glad of the Barcelona Technology School recommends mindfulness and Ikigai—the Japanese art of finding the intersection of what sparks joy, what the world needs, and what you can get paid for. But she, too, suggested hiring diverse teams. The best way to understand and empathize with others is to transform our view from “them” to “us” by working together, side by side.

Really empathizing with another person goes beyond intellectually processing their experience—although that’s a beginning—to genuinely grasping “what it’s like to be you.” Then we have to go a step further, keeping that awareness in our hearts and minds as we design customer service, user experience, and AI. That’s a big job. And it’s not one we can leave to the robots. Even feeding a diverse data set into an algorithm can only take us so far. Consider this excerpt from a novel written by a machine, using all the data available from the Harry Potter series:

[Related read: Inclusive leadership has never been more imperative]

Leathery sheets of rain lashed at Harry’s ghost as he walked across the grounds toward the castle. Ron was standing there and doing a kind of frenzied tap dance. He saw Harry and immediately began to eat Hermione’s family.

Clearly, humans are still needed.
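
How does a machine produce prose like that? Typically with a statistical model of which words tend to follow which. This Markov-chain toy (my own sketch, not the system that wrote the excerpt) shows why such output is locally fluent but globally senseless.

```python
# Toy Markov-chain text generator. It learns only which word tends to
# follow which, so its prose is locally fluent but has no plan, no
# plot, and no empathy.
import random
from collections import defaultdict

def build_chain(corpus):
    chain = defaultdict(list)
    words = corpus.split()
    for current, following in zip(words, words[1:]):
        chain[current].append(following)
    return chain

def generate(chain, start, length=15):
    word, output = start, [start]
    for _ in range(length):
        followers = chain.get(word)
        if not followers:
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

# The model mimics whatever diet it is fed -- and nothing more.
chain = build_chain(
    "Harry looked at Ron and Ron looked at Hermione and Hermione "
    "walked across the grounds toward the castle and Harry followed"
)
print(generate(chain, "Harry"))
```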

So if as many as 80 percent of enterprises are in the midst of developing AI solutions, nearly half of which are for customer service, it’s up to us to make sure that the output of our technology reflects not only who we are but, more importantly, who we want to be: empathic people and socially conscious organizations, actually serving our customers. All of them.
