Faking it? Just hope there’s no computer around
A new study found that computer systems are able to spot real or faked expressions of pain more accurately than humans.
Published Friday, March 21, 2014 2:56PM EDT
While your Oscar-worthy grimace might fool your boss into believing you have a headache, new research suggests that computers aren’t nearly as gullible.
A joint study conducted by researchers at the University of Toronto and the University of California, San Diego found that computer systems are able to differentiate between real and faked expressions of pain more accurately than humans.
The study found that, without training, humans could distinguish real from faked expressions of pain in other humans no better than chance.
And even with training to help spot fakers, humans only had a 55 per cent accuracy rate. Computers, on the other hand, had an accuracy rate of 85 per cent.
“Humans can simulate facial expressions and fake emotions well enough to deceive most observers,” Kang Lee, one of the study’s co-authors and a professor at the University of Toronto’s Dr. Eric Jackman Institute of Child Study, said in a summary of the study on the school’s website. “The computer’s pattern-recognition abilities prove better at telling whether pain is real or faked.”
The study’s authors say that computer pattern-recognition technology has several practical applications, from helping with medical diagnosis to national security.
What gives it away?
Researchers found that the mouth most clearly betrays falsified expressions of pain. Fakers’ mouths open with less variation and far too often, they said.
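To make that signal concrete, here is a minimal sketch of how such a pattern could be checked in code. It assumes a per-frame sequence of mouth-opening measurements is already available; the threshold and cut-off values are purely illustrative and are not the ones used in the study.

```python
import statistics

def mouth_opening_features(openings):
    """Compute simple descriptive features from a sequence of
    per-frame mouth-opening measurements (arbitrary units, 0-1).

    Returns (variability, open_rate): the standard deviation of the
    measurements, and the fraction of frames in which the mouth is
    open beyond a threshold.
    """
    OPEN_THRESHOLD = 0.5  # hypothetical cut-off for "mouth open"
    variability = statistics.pstdev(openings)
    open_rate = sum(o > OPEN_THRESHOLD for o in openings) / len(openings)
    return variability, open_rate

def looks_faked(openings, var_cutoff=0.15, rate_cutoff=0.6):
    """Flag a sequence as possibly faked when it shows low variability
    and an unusually high proportion of open-mouth frames, mirroring
    the pattern the researchers describe. Cut-off values are
    illustrative assumptions, not figures from the study.
    """
    variability, open_rate = mouth_opening_features(openings)
    return variability < var_cutoff and open_rate > rate_cutoff
```

A steady, frequently open mouth would be flagged, while an irregular sequence would not; the actual system in the study learned such patterns from video with machine-learned classifiers rather than hand-set thresholds.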
The computer vision system used in the study could potentially help health-care workers determine if people are faking pain, researchers said.
But they note it could also be used to detect other deceptive actions in the world of homeland security, psychopathology, job screening, medicine and law.
“As with causes of pain, these scenarios also generate strong emotions, along with attempts to minimize, mask, and fake such emotions, which may involve ‘dual control’ of the face,” Marian Bartlett, the lead author of the study and research professor at UC San Diego’s Institute for Neural Computation, said in the summary.
“In addition, our computer-vision system can be applied to detect states in which the human face may provide important clues as to health, physiology, emotion, or thought, such as drivers’ expressions of sleepiness, students’ expressions of attention and comprehension of lectures, or responses to treatment of affective disorders.”