Robert Plutchik’s wheel of emotions
Picture Credit: Wikipedia
The pioneering research into emotion recognition came from the psychologist Paul Ekman in the 1960s. Robert Plutchik, a professor at the University of South Florida, largely agreed with Ekman’s biologically driven perspective but developed his own model, called the “wheel of emotions”. His psycho-evolutionary theory of human emotion remains one of the accepted ways to classify emotional responses. He suggested eight primary emotions, arranged in four bipolar pairs:
- joy versus sadness;
- anger versus fear;
- trust versus disgust;
- surprise versus anticipation.
Other emotions we clearly feel, such as guilt, shame, jealousy and pride, do not produce distinct facial expressions. That is probably why so many people are able to hide these emotions from those around them: they often do not show on the face at all.
Detecting emotional information begins with passive sensors which capture data about the user’s physical state or behavior without interpreting the input. The data gathered is analogous to the cues humans use to perceive emotions in others. For example, a video camera might capture facial expressions, body posture, and gestures, while a microphone might capture speech. Other sensors detect emotional cues by directly measuring physiological data, such as skin temperature and galvanic skin response. Recognizing emotional information requires the extraction of meaningful patterns from the gathered data. This is done using machine learning techniques that process different modalities, such as speech recognition, natural language processing or facial expression detection.
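As a toy illustration of that pattern-extraction step, the sketch below maps two hypothetical physiological features, skin temperature and skin conductance, onto emotion labels with a nearest-centroid classifier. All feature values and labels here are invented for demonstration; real systems learn from large labeled datasets and use far richer models.

```python
# Toy sketch: mapping raw sensor features to emotion labels with a
# nearest-centroid classifier. All numbers are invented for illustration.
from math import dist

# Hypothetical training samples: (skin_temperature_C, skin_conductance_uS)
training = {
    "calm":     [(33.1, 2.1), (33.4, 2.4), (33.2, 1.9)],
    "stressed": [(34.8, 7.9), (35.1, 8.4), (34.9, 7.5)],
}

# One centroid (mean feature vector) per emotion class.
centroids = {
    label: tuple(sum(values) / len(values) for values in zip(*samples))
    for label, samples in training.items()
}

def classify(sample):
    """Return the emotion whose centroid lies closest to the sample."""
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

print(classify((35.0, 8.0)))  # elevated temperature and conductance -> "stressed"
print(classify((33.0, 2.0)))  # -> "calm"
```

The point is only the shape of the pipeline: raw sensor readings become a feature vector, and a learned model maps the vector to an emotional state.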
Affective Computing is the study and development of systems and devices that can recognize, interpret, process, and simulate human affects. It is an interdisciplinary field spanning computer science, psychology, and cognitive science. While the origins of the field may be traced as far back as to early philosophical inquiries into emotion, the more modern branch of computer science originated with Rosalind Picard’s 1995 paper on affective computing. Picard is a researcher in the field of affective computing and the founder and director of the Affective Computing Research Group at the MIT Media Lab. Her group develops tools, techniques, and devices for sensing, interpreting, and processing emotion signals that drive state-of-the-art systems that respond intelligently to human emotional states. A motivator for the research is the ability to simulate empathy. The machine interprets the emotional state of an individual and adapts its response accordingly in order to enhance the dialogue between the individual and the machine.
Noninvasively measuring and analyzing emotions has become a very attractive field for innovation and startups in the last few years. Here are a few examples:
Affectiva, a spin-off of MIT’s Media Lab, is commercializing the measurement and analysis of human emotions through visual analysis of real-time video. So far the company has classified and labelled over 3 million images of faces expressing different emotions. Its technology uses computer vision, machine learning and deep learning to train algorithms that classify emotions. Affdex, the company’s entry-level software product, tracks just four emotional “classifiers”: happy, confused, surprised, and disgusted. The software scans for a face; if there are multiple faces, it isolates each one. It then identifies the face’s main regions (mouth, nose, eyes, eyebrows) and ascribes points to each, rendering the features in simple geometries. Affdex also scans for the shifting texture of skin, such as the distribution of wrinkles around an eye or the furrow of a brow, and combines that information with the deformable points of the face to build detailed models of the face as it reacts. The algorithm identifies an emotional expression by comparing it with countless others it has previously analyzed.
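The geometric stage described above can be made concrete with a toy sketch: reduce facial landmarks to a few distances, then label an expression by its nearest previously analyzed example. The landmark coordinates, feature choices and gallery values below are all invented; Affdex’s real models are proprietary and trained on millions of images.

```python
# Toy sketch of landmark-based expression matching (invented values, y-up coordinates).
from math import dist

def features(landmarks):
    """Reduce a landmark dict to a small geometric feature vector:
    mouth width, and mouth-corner height relative to the nose tip."""
    left, right = landmarks["mouth_left"], landmarks["mouth_right"]
    nose = landmarks["nose_tip"]
    mouth_width = dist(left, right)
    corner_lift = nose[1] - (left[1] + right[1]) / 2  # larger when corners rise
    return (mouth_width, corner_lift)

# Tiny "gallery" standing in for the countless previously analyzed expressions.
gallery = [
    ("happy",   (52.0,  9.0)),   # wide mouth, raised corners
    ("neutral", (40.0,  0.0)),
    ("sad",     (38.0, -7.0)),   # drooping mouth corners
]

def classify(landmarks):
    """Nearest-neighbor match of the feature vector against the gallery."""
    vec = features(landmarks)
    return min(gallery, key=lambda item: dist(vec, item[1]))[0]

face = {"mouth_left": (100, 198), "mouth_right": (151, 198), "nose_tip": (126, 206)}
print(classify(face))  # wide, lifted mouth -> "happy"
```

Real systems combine dozens of landmarks with skin-texture descriptors and deep networks rather than a three-example gallery, but the matching principle is the same.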
AI has also been used to detect emotions from the sound of our voice by a company called BeyondVerbal. Its software analyses voice modulation and looks for specific patterns in the way people talk. The company claims to identify emotions correctly with 80% accuracy. Thanks to new machine-learning techniques, signs of agitation or frustration in a speaker’s voice can now be recognized. For example, call-center workers can receive real-time coaching from software that analyzes their speech and their dialogue with customers: while they are talking to someone, the software might recommend that they speak more slowly or interrupt less often, or warn that the person on the other end of the line seems upset.
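To give a flavor of what “analyzing voice modulation” can mean, the sketch below computes two classic speech features, short-term energy and an autocorrelation pitch estimate, and applies an invented rule that flags loud, high-pitched frames as agitated. The thresholds and the rule are purely illustrative; BeyondVerbal’s actual algorithms are not public.

```python
# Toy voice-modulation features: short-term energy and autocorrelation pitch.
import math

RATE = 8000  # samples per second

def tone(freq, amp, seconds=0.1):
    """Generate a pure sine tone standing in for a short voiced frame."""
    return [amp * math.sin(2 * math.pi * freq * n / RATE)
            for n in range(int(RATE * seconds))]

def energy(frame):
    return sum(s * s for s in frame) / len(frame)

def pitch(frame, lo=80, hi=400):
    """Estimate fundamental frequency by picking the best autocorrelation lag."""
    best_lag, best_score = 0, -1.0
    for lag in range(RATE // hi, RATE // lo + 1):
        score = sum(frame[n] * frame[n - lag] for n in range(lag, len(frame)))
        if score > best_score:
            best_lag, best_score = lag, score
    return RATE / best_lag

def sounds_agitated(frame):
    # Invented rule: flag frames that are both loud and high-pitched.
    return energy(frame) > 0.1 and pitch(frame) > 200

calm_voice = tone(freq=120, amp=0.3)   # low-pitched, quiet
tense_voice = tone(freq=280, amp=0.9)  # high-pitched, loud
print(sounds_agitated(calm_voice))   # False
print(sounds_agitated(tense_voice))  # True
```

Production systems track how such features change over the course of a conversation, not single frames, and feed them to trained models rather than fixed thresholds.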
Emotions that are not reflected in facial expressions or speech remain hidden. Now a research group at MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) has built a system called EQ-Radio that can identify emotions using radio signals from a wireless router, whether or not a person is speaking or showing emotion on their face. EQ-Radio is composed of three elements. An RF radio emits low-frequency radio waves and captures their reflections from objects in the environment. An algorithm operates on the captured waves, separating heartbeat and respiration information and measuring the interval between heartbeats. Finally, the heartbeat and respiration information is fed into a machine-learning classifier that maps it onto emotional states.
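The middle stage of that pipeline can be mimicked on synthetic data. In the sketch below, a simulated chest-motion signal stands in for the RF reflections: a centered moving-average filter separates the slow respiration component from the faster heartbeat residual, and peak spacing gives the inter-beat intervals. The signal, filter sizing and thresholds are invented for illustration, and the final emotion classifier is omitted.

```python
# Toy EQ-Radio-style separation of respiration and heartbeat (synthetic data).
import math

RATE = 100        # samples per second
HEART_HZ = 1.2    # synthetic heartbeat frequency (72 beats per minute)
BREATH_HZ = 0.25  # synthetic respiration frequency

# Stand-in for the captured reflections: large, slow breathing motion
# plus a much smaller, faster heartbeat component.
n_samples = RATE * 10
signal = [math.sin(2 * math.pi * BREATH_HZ * n / RATE)
          + 0.2 * math.sin(2 * math.pi * HEART_HZ * n / RATE)
          for n in range(n_samples)]

# Centered moving average sized to one heartbeat period: it nulls the
# heartbeat frequency while passing the slow respiration almost untouched.
half = round(RATE / HEART_HZ) // 2
interior = range(half, n_samples - half)
respiration = [sum(signal[i - half:i + half + 1]) / (2 * half + 1) for i in interior]
heartbeat = [signal[i] - r for i, r in zip(interior, respiration)]

# Detect heartbeat peaks and convert their spacing to inter-beat intervals.
peaks = [i for i in range(1, len(heartbeat) - 1)
         if heartbeat[i - 1] < heartbeat[i] >= heartbeat[i + 1]
         and heartbeat[i] > 0.1]
intervals = [(b - a) / RATE for a, b in zip(peaks, peaks[1:])]
mean_interval = sum(intervals) / len(intervals)
print(round(mean_interval, 2))  # close to 1 / 1.2 ≈ 0.83 s
```

The real system has to recover this motion from noisy RF reflections of a moving body, which is the hard part; but once heartbeat and respiration are isolated, features like these inter-beat intervals are what the classifier consumes.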
There are times when people take care to keep their faces and voices neutral because they wish to keep their emotions private. The EQ-Radio takes that decision out of their hands and raises the question whether a right to privacy exists for shielding one’s emotions from others when one doesn’t want those emotions to be revealed.
Difference between Feelings and Emotions
Feelings and emotions are two sides of the same coin and highly interconnected, but they are two very different things.
Emotions are lower-level responses that create biochemical reactions in one’s body. They originally helped our species survive by producing quick reactions to threat, reward, and everything in between in the environment. Emotional reactions are coded in our genes, and while they vary slightly between individuals and circumstances, they are generally similar across all humans and even other species. Emotions precede feelings; they are physical and instinctual. Because they are physical, they can be objectively measured by blood flow, brain activity, facial expressions, body language and voice.
Feelings are mental associations with, and reactions to, emotions, and are subjectively influenced by personal experience, beliefs, and memories. A feeling is the mental portrayal of what is going on in one’s body when one has an emotion. Feelings are the consequence of having an emotion. They involve cognitive input, usually subconscious, and cannot be measured precisely.
Antonio Damasio, professor of neuroscience at the University of Southern California and author of several books on the subject, explains:
“Feelings are mental experiences of body states, which arise as the brain interprets emotions, from the body’s responses to external stimuli. (The order of such events is: I am threatened, experience fear, and feel horror.) Emotions play out in the theater of the body. Feelings play out in the theater of the mind.”
Feelings are sparked by emotions and colored by the thoughts, memories, and images that have become subconsciously linked with a particular emotion. But it works the other way around too. For example, just thinking about something threatening can trigger an emotional fear response. While individual emotions are temporary, the feelings they evoke may persist and grow over a lifetime. Because emotions trigger subconscious feelings, which in turn initiate new emotions and so on, one’s life can become a never-ending cycle of painful and confusing emotions producing negative feelings that cause more negative emotions, without one ever really knowing why.
By understanding the difference and becoming aware of one’s emotions and feelings and inserting conscious thought followed by deliberate action, one can choose and decide how to navigate and experience life.
Emotion sensing can enhance personal profiling applied in business applications such as marketing, health care, education or consulting. Machine emotional intelligence is still evolving, but the future could soon see targeted ads that respond not only to our demographic state (age, gender, likes, etc.) but to our current emotional state as well.
In addition to business applications, emotion sensing also provides input to the emerging field of personal analytics, empowering individuals to analyze and exploit their own data to achieve a range of objectives and benefits across their work and personal lives. According to the analyst company Gartner, personal analytics will become one of the fastest-growing technologies over the next 5-10 years. Emotion sensing can also provide a bridge to consciously managing one’s feelings, in the form of interactive machine-learning dialogues enhanced by empathy. The increasing demand for psychotherapy and self-development coaching has spurred a wave of new digital services aimed at this market, with apps like ‘MoodScope’ or ‘SAM (self-help for anxiety management)’. Even though AI systems are no substitute for interactions with a real human, they do have the potential to improve our quality of life and enhance our emotional intelligence. Especially at a time when proactivity and constant self-development are becoming a precondition for success, virtual therapy and coaching can be expected to keep gaining popularity.
As systems take on new therapy and coaching functions and hence collect an increasing amount of personal data about us, concerns about privacy will grow. Perhaps new dual-system architectures with online as well as protected offline modes, or trusted service providers, or a combination of both will solve this problem. Despite the significant contribution machine learning and emotion sensing can make to our personal development, without trust the benefits will remain limited.