Source: Search Engine Journal
Imagine a world in which machines interpret the emotional state of humans and adapt their behavior to give appropriate responses to those emotions.
Well, artificial emotional intelligence, which is also known as emotion AI or affective computing, is already being used to develop systems and products that can recognize, interpret, process, and simulate human affects (with an “a,” not an “e”). In psychology, an “affect” is a term used to describe the experience of feeling or emotion.
If you’ve seen “Solo: A Star Wars Story”, then you’ve seen the poster child for artificial emotional intelligence: L3-37.
Lando Calrissian’s droid companion and navigator (voiced by Phoebe Waller-Bridge) instigates a slave revolt to escape from Kessel, but is severely damaged during the diversion. Lando (played by Donald Glover) is also injured during the getaway.
The “woke robot” demonstrates the ability to simulate empathy by interpreting the emotional state of a human, adapting its behavior to him, and giving an appropriate response to those emotions.
Now, this example might lead some video marketers and advertisers to think that emotion AI is science fiction. But it is very real.
A number of companies are already working to give computers the capacity to read our feelings and react, in ways that have come to seem startlingly human. This includes Affectiva, an emotion measurement technology company that spun out of MIT’s Media Lab in 2009, and Realeyes, an emotion tech company that spun out of Oxford University in 2007.
So, how do their technologies help brands, agencies, and media companies improve their advertising and marketing messages? Let’s tackle this question by examining how affective computing works.
How Does Artificial Emotional Intelligence Work?
Brands know emotions influence consumer behavior and decision making. So, they’re willing to spend money on market research to understand consumer emotional engagement with their brand content.
Affectiva uses a webcam to track a user’s smirks, smiles, frowns, and furrows, expressions that signal the user’s levels of surprise, amusement, or confusion.
It also uses a webcam to measure a person’s heart rate without a wearable sensor by tracking subtle color changes in the person’s face, which flushes slightly each time the heart beats.
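The idea behind camera-based heart-rate measurement can be sketched in a few lines: average the green channel of the face region in each video frame, then find the dominant frequency of that signal in the plausible heart-rate band. This is a minimal illustration of the general technique (remote photoplethysmography), not Affectiva's actual implementation; the function name and synthetic data are assumptions for the example.

```python
import numpy as np

def estimate_heart_rate(green_means, fps):
    """Estimate heart rate (BPM) from the mean green-channel value of a
    face region in each video frame. The face flushes slightly with each
    heartbeat, so the green channel carries a faint pulse signal."""
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()                # remove the DC offset
    # Power spectrum of the pulse signal
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    # Keep only plausible heart rates: 0.7-4.0 Hz (42-240 BPM)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(power[band])]
    return peak_freq * 60.0                        # Hz -> beats per minute

# Synthetic check: a 1.2 Hz (72 BPM) pulse buried in camera noise
fps = 30
t = np.arange(10 * fps) / fps
samples = (0.5 * np.sin(2 * np.pi * 1.2 * t)
           + np.random.default_rng(0).normal(0, 0.2, t.size))
print(round(estimate_heart_rate(samples, fps)))  # → 72
```

A production system would add face detection, motion compensation, and bandpass filtering, but the core signal-processing step is this frequency-peak search.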
Affectiva has turned this technology into a cloud-based solution that utilizes “facial coding” and emotion recognition software to provide insight into a consumer’s emotional responses to digital content. All a brand or media company needs is some panelists with standard webcams and internet connectivity.
As viewers watch a video, Affectiva’s product, Affdex for Market Research, measures their moment-by-moment facial expressions of emotions. The results are then aggregated and displayed in a dashboard.
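The aggregation step described above is conceptually simple: each panelist produces a moment-by-moment trace of emotion scores, and the dashboard averages those traces across panelists to show how the audience as a whole reacted at each point in the video. The sketch below illustrates that idea with invented data; the emotion names, score range, and data layout are assumptions, not Affdex's actual output format.

```python
import numpy as np

# Hypothetical per-second emotion scores (range 0-1) for three panelists
# watching the same 5-second ad. Purely illustrative data.
panelists = {
    "p1": {"joy": [0.1, 0.3, 0.7, 0.8, 0.6], "surprise": [0.0, 0.5, 0.2, 0.1, 0.0]},
    "p2": {"joy": [0.2, 0.2, 0.6, 0.9, 0.7], "surprise": [0.1, 0.6, 0.3, 0.0, 0.0]},
    "p3": {"joy": [0.0, 0.4, 0.5, 0.7, 0.5], "surprise": [0.0, 0.4, 0.1, 0.1, 0.1]},
}

def aggregate(panelists, emotion):
    """Average one emotion's moment-by-moment trace across all panelists."""
    traces = np.array([p[emotion] for p in panelists.values()])
    return traces.mean(axis=0)  # one averaged value per moment in the video

joy_curve = aggregate(panelists, "joy")
print("peak joy at second", int(joy_curve.argmax()) + 1)   # → peak joy at second 4
print("mean joy:", round(float(joy_curve.mean()), 2))      # → mean joy: 0.48
```

A dashboard would plot curves like `joy_curve` against the video timeline, letting a brand see which scene drove the strongest response.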