Unlike Emoticons, Human Emotions Are Difficult to Interpret
by Ed Lyon
Humans communicate in many ways. Although speech is our primary mode, many non-verbal cues assist in getting one’s message across. Folded arms, crossing then uncrossing one’s legs, yawns, and eye rolling all augment verbal speech. A recent push to scientifically read and interpret people’s emotional states has led several companies to market products that claim to aid in that effort.
Northeastern University psychology professor Lisa Feldman Barrett has expressed credible doubt about this practice. “The topic of facial expressions of emotion – whether they’re universal, whether you can look at someone’s face and read emotion in their face – is a topic of great contention that scientists have been debating for at least 100 years,” she stated. Because of those long-running disagreements, Barrett pointed to a recent study on reading emotions conducted by five accomplished scientists for the Association for Psychological Science (APS).
This study’s stated purpose was for the scientists, of whom Barrett was one, to conduct “a systematic review of the evidence testing the common view” that emotion can be reliably determined from external facial movements.
She said each scientist “represented very different theoretical views. We came to the project with very different expectations of what the data would show, and our job was to see if we could find consensus in what the data shows and how best to interpret it. We were not convinced that we could, just because it’s such a contentious topic.”
True to Barrett’s concerns, this study took two years to complete, far longer than the months envisioned by the APS.
The researchers found no scientific support for the common assumption “that a person’s emotional state can be readily inferred from his or her facial movements.”
There are a multitude of reasons a person might scowl, for example. A scowl can signal confusion, concentration, flatulence and, of course, anger. The same facial expression is simply not a reliable sign of what a person is thinking or feeling.
People raised in one culture may respond to the same stimulus with an entirely different facial expression than people raised in another.
The study’s overall conclusion was that “facial configurations in question are not ‘fingerprints’ or diagnostic displays that reliably and specifically signal particular emotional states regardless of context, person and culture.”
Industrialized emotion-interpretation technology continues to thrive despite the APS study’s findings. This “robot surveillance” market is projected to reach $3.8 million by 2025.
Regarding such an industry, Barrett explained, “there is no automated emotion recognition. The best algorithms can encounter a face — full frontal, no occlusions, ideal lighting — and those algorithms are very good at detecting facial movements. But they’re not equipped to infer what those facial movements mean.”
Given that limitation, it is not hard to imagine a robot surveillance smart camera worn by a cop signaling hostile intent on a citizen’s face because the citizen is experiencing stomach distress or gas, with the ultimate result being yet another police shooting. When it comes to reading another person’s emotional state, a direct verbal question is still likely to elicit the most accurate answer.
Source: aclu.org