Artificial Empathy Improves Communication Skills of AI-Machines

Posted by Peter Rudin on 29. June 2018 in Essay

AVA, Autodesk’s conversational Avatar for customer support
Picture Credit: Soul Machines Ltd.

Introduction

Most surveys on the consequences of Artificial Intelligence (AI) for employment conclude that about one-third of all job positions will disappear or that existing job descriptions will be transformed. Robotic production lines will continue to reduce manufacturing jobs and, as AI improves, a much broader set of ‘thinking’ as opposed to ‘doing’ jobs will be affected. New types of jobs might emerge; however, we do not yet know to what extent they will compensate for the loss.

Many skilled ‘thinking’ jobs follow the same general workflow: gather, analyse and interpret data to determine and recommend a course of action. Machines can do these types of tasks efficiently, overcoming limitations humans inherently have. Doctors will never be able to keep up with every new publication in their areas of expertise. Instead, they must rely on a small number of personal experiences rather than the complete knowledge of their field. Consultants, too, can only experience so many company transformations over their careers. From this narrow set of experiences, they form their preferences, expectations, and insights. Current advice suggests that to stay relevant in a profession, one will need to focus on skills and capabilities that AI has trouble replicating – understanding, motivating, and interacting with human beings.

However, as AI rapidly advances its communication skills, persuasion and empathy are no longer exclusively human assets. Artificial empathy is about to overturn the idea that communicating expertise and advice is something best performed by humans.

Definition of Empathy

Our ability to understand and relate to other people is, in part, a function of an empathetic process.

How well we make rapid and accurate inferences about the feelings, goals, attitudes, motivations and beliefs of other people determines, to a large extent, our contribution to a specific social or business situation, as well as our decision-making and cognitive thought processes.

According to Wikipedia, empathy is the capacity to understand or feel what another person is experiencing from within their frame of reference, i.e., the capacity to place oneself in another person’s position. Empathy is generally divided into two major components:

  • Affective empathy (also called emotional empathy): the capacity to respond with an appropriate emotion to another person’s mental state.
  • Cognitive empathy: the capacity to understand another person’s perspective or mental state.

Affective and cognitive empathy are independent from one another; someone who strongly empathizes emotionally is not necessarily good at understanding another person’s perspective. The human capability to recognize the bodily feelings and facial expressions of another person is related to one’s own imitative capacity to produce the corresponding movements or expressions.

Affective-Computing 

An emerging trend in artificial intelligence is to get computers to detect how we’re feeling and to respond accordingly. Artificial emotional intelligence, or emotional AI, is focused on developing algorithms that can identify not only basic human emotions such as happiness, sadness, and anger but also more complex cognitive states such as fatigue, attention, interest, confusion, and distraction. Facial expressions are just one set of data fuelling the rapid advancement of a subset of AI also called ‘affective-computing’. Researchers and developers are creating algorithms that try to determine the emotional state of a human based on input such as gestures, facial expressions, text, and tone of voice. More importantly, they are using machine-learning techniques to develop emotionally intelligent interfaces that accurately detect a person’s mood and respond to it appropriately.

However, businesses will have to address two potentially limiting factors in particular: the availability of data that provides accurate emotional cues, and the ethical and data-privacy issues that will emerge as companies seek to gather this intimate information about customers and employees. “Once you are able to analyse a person’s affective state, you can also respond to it and influence it,” says Stacy Marsella, a professor in Northeastern University’s College of Computer and Information Science with a joint appointment in psychology.
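To make this concrete, here is a purely illustrative Python sketch: a toy word lexicon stands in for the trained facial, vocal and textual models that real affective-computing systems use, and a small lookup table stands in for a learned response policy. All names and replies are invented for illustration.

```python
# Illustrative only: real affective-computing systems use trained models over
# faces, voice and text; a toy lexicon stands in for such a model here.

EMOTION_LEXICON = {
    "happy": "joy", "great": "joy", "thanks": "joy",
    "sad": "sadness", "disappointed": "sadness",
    "angry": "anger", "furious": "anger",
    "confused": "confusion", "lost": "confusion",
}

RESPONSES = {
    "joy": "Glad to hear it!",
    "sadness": "I'm sorry to hear that. Let me see how I can help.",
    "anger": "I understand your frustration. Let's fix this together.",
    "confusion": "No problem, let me walk you through it step by step.",
}

def detect_emotion(text: str) -> str:
    """Very rough stand-in for a trained emotion classifier."""
    for word in text.lower().split():
        word = word.strip(".,!?")
        if word in EMOTION_LEXICON:
            return EMOTION_LEXICON[word]
    return "neutral"

def respond(text: str) -> str:
    """Pick a response appropriate to the detected emotional state."""
    emotion = detect_emotion(text)
    return RESPONSES.get(emotion, "How can I help you today?")

print(respond("I'm really confused by this dialog"))
# -> "No problem, let me walk you through it step by step."
```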

Artificial intelligence that reads and responds to our emotions is considered a ‘killer app’ of the digital economy. The customer experience is the most obvious sweet spot for affective-computing. Forrester analysed its customer experience data from 2014 and 2015 and found that emotion was the number-one factor in determining customer loyalty in 17 out of the 18 industries surveyed. The global affective-computing market is estimated to grow from just over US$ 9.3 billion in 2015 to more than US$ 42.5 billion per year by 2020, according to the market research firm ‘Research and Markets’. The firm predicts that affective-computing will “revolutionize the way organizations gather, organize, collaborate, and deliver information.”

The conversational avatar – a milestone in affective-computing

Today’s internet-based product or service offerings typically incorporate a section for FAQs (frequently asked questions) which provides static answers to questions users typically ask. With the advancement of speech recognition and voice response, so-called chatbots (‘bot’ is an abbreviation of ‘robot’) are replacing the FAQ listings. Chatbots are typically used in conversational systems via smartphones and messaging apps such as Facebook Messenger, or in combination with stationary bidirectional microphones/loudspeakers such as Amazon’s Alexa or Google Home. Some chatbots use sophisticated, AI-based natural language processing systems, but many simpler systems merely scan for keywords within the input and then pull the reply with the most matching keywords, or the most similar wording pattern, from a database.
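The simpler keyword-matching approach can be sketched in a few lines of Python. This is a minimal illustration, not any particular product’s implementation; the FAQ entries and replies are invented.

```python
import string

# A minimal sketch of the keyword-matching approach described above. A real
# system would pull the entries from a database rather than a dict.
FAQ_REPLIES = {
    frozenset({"password", "reset"}): "You can reset your password under Account > Security.",
    frozenset({"license", "renew"}): "Licenses can be renewed from the account settings page.",
    frozenset({"install", "error"}): "Please check that your system meets the minimum requirements.",
}

def reply(user_input: str) -> str:
    """Return the stored reply whose keyword set best overlaps the input."""
    words = {w.strip(string.punctuation) for w in user_input.lower().split()}
    best_keys, best_score = None, 0
    for keys in FAQ_REPLIES:
        score = len(keys & words)  # number of matching keywords
        if score > best_score:
            best_keys, best_score = keys, score
    if best_keys is None:
        return "Sorry, I did not understand that. Could you rephrase?"
    return FAQ_REPLIES[best_keys]

print(reply("How do I reset my password?"))
# -> "You can reset your password under Account > Security."
```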

Applying affective-computing technology, an avatar representing a digital copy of a human being can conduct a conversation similar to the experience of video-conferencing between individuals. The avatar’s facial expressions and voice modulations, and its ability to recognize the emotional reactions of the user, convey empathy which, according to management theory and psychology, is one of the most important assets for communicating convincingly and effectively with individuals. An up-to-date knowledge base generated by ‘deep-learning’ algorithms, interfaced with the avatar’s empathetic communication skills, opens the door to a new range of expert-advice services at low cost. Available 24 hours a day, such avatars can be replicated for unlimited worldwide access. As deep learning is best suited to solving specific problems, the knowledge base of a conversational avatar is focused on expert conversations about specific topics such as finance, health, corporate strategy, customer support, or logistics.
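The pipeline just described might be sketched as follows. `detect_emotion` and `query_knowledge_base` are hypothetical stand-ins for the deep-learning components, with placeholder return values so the sketch runs end to end.

```python
# Hedged sketch of the avatar pipeline: an estimate of the user's emotional
# state gates how a knowledge-base answer is delivered.

def detect_emotion(video_frame, audio_chunk) -> str:
    # Placeholder: a real system would run facial-expression and
    # voice-prosody models over the camera and microphone input.
    return "frustration"

def query_knowledge_base(question: str) -> str:
    # Placeholder: a real system would query a domain-specific,
    # deep-learning-generated knowledge base.
    return "Your license can be renewed from the account settings page."

EMPATHETIC_PREFIX = {
    "frustration": "I'm sorry this has been difficult. ",
    "confusion": "Let me go through that step by step. ",
    "neutral": "",
}

def avatar_turn(question: str, video_frame=None, audio_chunk=None) -> str:
    """Combine the detected emotional state with the factual answer."""
    emotion = detect_emotion(video_frame, audio_chunk)
    answer = query_knowledge_base(question)
    return EMPATHETIC_PREFIX.get(emotion, "") + answer

print(avatar_turn("How do I renew my license?"))
# -> "I'm sorry this has been difficult. Your license can be renewed from ..."
```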

‘Soul Machines Ltd.’ from New Zealand, a start-up of AI researchers, neuroscientists, psychologists and artists, launched – in cooperation with IBM and its Watson cloud service for language processing – its first conversational avatar in May this year. AVA – an acronym for automated virtual assistant – is a next-generation digital assistant which the company Autodesk Inc. is providing to support its customers in the application of Autodesk’s software. Autodesk hopes that AVA’s animated face and voice will engage customers at a deeper level than current conversational software can. “If a customer tells AVA, ‘I am having trouble,’ I want AVA to frown a bit and say, ‘I am sorry, let me see if I can help’,” says Rachael Rekart, who manages Autodesk’s digital support channels.

At first, AVA will respond purely to positive and negative signals. She will smile, frown and adjust her tone based on clues to the user’s emotional state. She is designed to process emotional cues from users’ tone of voice and facial expressions, and she will tell users that if they turn their camera on, she will be able to see them. Later she may be programmed to differentiate frustration from sadness. After that, she might be given arms, so she can express herself through hand gestures and the like. Autodesk aims to take it step by step, gauging AVA’s impact on customer satisfaction before making her more elaborate, Ms. Rekart says.

Digital representations of humans risk falling into what computer scientists call the uncanny valley, a creepy netherworld in which not-quite-humans are more unsettling than relatable. Autodesk wants to make sure AVA improves the user experience before giving her more humanlike characteristics. “We’re just sticking our toe in the water,” Ms. Rekart says. “We have a long way to go.”
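The first-stage behaviour described above, in which only the sign of the detected sentiment drives the avatar’s expression and tone, could be caricatured in a few lines. All names here are illustrative assumptions, not Soul Machines’ or Autodesk’s actual API.

```python
# Toy sketch: only the sign of a sentiment score drives expression and tone.

def choose_expression(sentiment_score: float) -> dict:
    """Map a sentiment score in [-1, 1] to expression and tone settings."""
    if sentiment_score < -0.2:  # negative signal
        return {"face": "frown", "tone": "soft",
                "preface": "I am sorry, let me see if I can help. "}
    if sentiment_score > 0.2:   # positive signal
        return {"face": "smile", "tone": "upbeat", "preface": ""}
    return {"face": "neutral", "tone": "even", "preface": ""}

print(choose_expression(-0.7))
# -> {'face': 'frown', 'tone': 'soft', 'preface': 'I am sorry, let me see if I can help. '}
```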

Future directions 

As affective-computing technology improves, it will eventually become impossible to distinguish between a video representation of a conversational avatar and a human being. While the avatar’s cognitive intelligence is provided by deep-learning algorithms, the emotional intelligence needed to digitally define empathetic behaviour still has to be developed. The intimate relationship between emotions and human intelligence seems to justify this effort. Both AI and the neural architecture of the brain deal with information, adding plausibility to the view that AI, as a theory, may encompass emotions and possibly consciousness as well.

Michael Graziano, an American scientist and novelist who is currently professor of Psychology and Neuroscience at Princeton University, has claimed that consciousness in AI may be an engineering problem. This implies that conscious awareness and the emotional aspects of empathy are possibilities for machines to address once we overcome the current technical limitations of ‘narrow AI’. One can conclude that the path towards Artificial General Intelligence (AGI) will raise the question of machine consciousness as conversational avatars eventually become an integral part of our social and business infrastructure. The interesting part of launching conversational avatars is not just their effectiveness in improving expert conversations, but also their role in fuelling the discussion about the place of homo sapiens on the road towards singularity.

