University at Buffalo

UB Today

A publication of the University at Buffalo Alumni Association

Fall 2012


Those Lying Eyes

Researchers explore whether machines can read the visual cues that give away human deceit

Story by Riley Mackenzie

Here’s Moe, the sad-sack bartender on “The Simpsons,” hooked up to a police polygraph. “I got a hot date tonight,” he says. The machine buzzes: It’s a lie. “A date.” (buzz) “Dinner with friends.” (buzz) “Dinner alone.” (buzz) “Watching TV alone.” (buzz) “All right! I’m gonna sit at home and ogle the ladies in the Victoria’s Secret catalog.” (buzz) “Sears catalog.” (ding!) “Now, would you unhook this already, please? I don’t deserve this kind of shabby treatment.” (buzz). Poor Moe. You can’t fool a machine—in the popular imagination, at least. The reality, for those studying how computers can catch human beings in a lie, is a whole lot trickier.

Now a new project by three UB researchers is showing promise for computer-aided lie detection. Analyzing videotapes from a previous study by Mark Frank, BA ’83, a Department of Communication professor, researchers in UB’s Center for Unified Biometrics and Sensors (CUBS) developed an algorithm that measured changes in the subjects’ eye movements. The video analysis software identified the liars in the group with 82.5 percent accuracy—far better than even experienced police investigators, who typically are about 65 percent accurate in ferreting out falsehood.

“The eyes have been an area of interest from time immemorial,” says Frank, who has consulted widely with law enforcement agencies on how to detect deception. “There are parts of the Hindu Vedas from 3,000 years ago that say, ‘Liars look away.’ But the conventional wisdom is not always true. For example, liars may learn to look you in the eye.”

But, he says, we have a much harder time controlling the signals our faces give off. These signals, Frank says, “leak” subtle cues to our emotions, in “microexpressions” that flash by in a fraction of a second. Human beings often miss those fleeting signals. Likewise, the eyes can generate different movement patterns when people are trying to manipulate others, as when lying, and that too is a subtle signal people often cannot detect. The study showed that, apparently, computers can.

 

A high-stakes experiment

The trio’s collaboration began with 40 videotaped interviews culled from a previous study by Frank on the ways group affiliations foster terrorism, research funded by the U.S. Department of Defense. The subjects were chosen because they had strong feelings about a political or ethical issue—the Palestinian-Israeli conflict, animal rights, abortion, even one’s political party.

One by one, the subjects were placed in a room and told that down the hall was an envelope containing a check made out to an organization they opposed. A die-hard Democrat, for example, would find a check payable to the Republican Party. Then they had to decide: They could walk down the hall and steal the check, thus keeping it out of the coffers of the hated organization, or they could leave it be.

After they took the check or decided not to, each subject then faced questions from an interrogator; the interrogators were mostly retired FBI agents who presented themselves as sympathetic to the “opposed” group. The conversation was mundane until, at a crucial juncture, the questioner lowered the boom: “Did you remove the check from the envelope?”

Maybe they did, maybe they didn’t. For the subjects, the goal was to lie successfully. If they managed that, they were led to believe, they could tear up the check to the hated organization; the $100 would instead go to the organization they favored, and the subject would get a $75 bonus. If they lied unsuccessfully, meaning they were caught in their deceit, the organization they opposed got the money. So the stakes were high; these were liars under pressure.

 

Mining for information

Enter Venu Govindaraju, PhD ’92 & MS ’88, a SUNY Distinguished Professor in the Department of Computer Science and Engineering, and founding director of CUBS; and Ifeoma Nwogu, PMCRT ’10 & PhD ’09, a CUBS research assistant professor. (Another researcher and co-author of the study, Nisha Bhaskaran, MS ’10, has left UB and is now a software developer in Los Angeles.) Together with Frank they obtained a National Science Foundation grant to pursue further work on these data. They took an optical disc containing excerpts from the 40 interviews—culled from about 130 total and chosen for diversity of age, sex and race—and wrote software that analyzed the subjects’ eye movements in excruciating detail. At issue: At the “gotcha” moment when the interviewer asked about the stolen check, did the subjects’ eyes begin to move in a different way?

It was technically tricky work. The researchers had to deal with, for example, reflections from some subjects’ eyeglasses and the data disaster that resulted when someone’s hair fell over one eye.

The algorithm looked at how often the person blinked and the direction of his or her gaze. The data were analyzed using Bayesian statistical techniques, which estimate the probability that two events (such as eye movement and lying) are related.

“The baselines have to be established to see what is normal behavior for this person,” Govindaraju says, “and when the questions are being asked and there is an incentive to lie, what changes are taking place. The analysis is probabilistic in nature, so it will make some errors.”
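To make that idea a little more concrete, here is a minimal sketch of a baseline-versus-critical-question comparison of the kind Govindaraju describes, written in Python. It is not the UB team’s actual algorithm: the two features, the Gaussian models, the assumed “shift” in behavior at the critical question and the prior probability are all illustrative assumptions standing in for the richer Bayesian analysis the researchers performed.

```python
# Toy sketch only -- NOT the UB researchers' algorithm. It compares a
# subject's eye-movement features at the critical question against that
# subject's own baseline and returns a posterior probability of a "hot spot."
from dataclasses import dataclass
from math import exp, pi, sqrt
from statistics import mean, stdev


@dataclass
class EyeFeatures:
    """Per-segment eye-movement measurements (illustrative feature choices)."""
    blinks_per_minute: float
    gaze_shifts_per_minute: float


def gaussian_pdf(x: float, mu: float, sigma: float) -> float:
    """Likelihood of x under a normal distribution with mean mu and std sigma."""
    return exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))


def posterior_hot_spot(baseline, critical, prior=0.5, shift_factor=1.5):
    """Posterior probability that behavior at the critical question is a
    'hot spot' (a shift from the subject's baseline) rather than normal.

    Normal model: features follow the baseline mean and spread.
    Hot-spot model: an ASSUMED shift of the feature means by shift_factor.
    """
    like_normal = like_shifted = 1.0
    for name in ("blinks_per_minute", "gaze_shifts_per_minute"):
        values = [getattr(seg, name) for seg in baseline]
        mu, sigma = mean(values), max(stdev(values), 1e-6)
        x = getattr(critical, name)
        like_normal *= gaussian_pdf(x, mu, sigma)
        like_shifted *= gaussian_pdf(x, mu * shift_factor, sigma)
    evidence = like_normal * (1 - prior) + like_shifted * prior
    return like_shifted * prior / evidence


if __name__ == "__main__":
    # Fabricated numbers, purely for illustration.
    baseline = [EyeFeatures(12 + i % 3, 8 + i % 2) for i in range(10)]
    critical = EyeFeatures(blinks_per_minute=20, gaze_shifts_per_minute=13)
    print(f"posterior probability of a hot spot: "
          f"{posterior_hot_spot(baseline, critical):.2f}")
```

Even in this toy version, the output is a probability rather than a verdict, which is why Govindaraju notes that the analysis “will make some errors.”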

Nevertheless, the researchers found remarkable success: Better than eight times in 10, when the subjects answered at the “gotcha” moment, the algorithm detected telltale shifts in the subject’s eye movement.

 

A note of caution

Frank cautions that the popular notion of a perfect “lie detector” is still a flight of fancy. Conventional systems measure heart and respiratory rates, blood pressure and perspiration, but do not directly measure lies. Researchers also have tried to use linguistic analysis, static images of facial expressions and even thermal imaging to detect deceit.

The UB researchers’ algorithm points to physiological patterns that indicate something is going on with the person. That something could be a lie or it could be any of a thousand other events or emotions. Maybe, for example, one research subject stole a check in another context, and the interrogator’s “gotcha” question has brought back memories of that crime. And nobody’s completely comfortable parrying questions from a guy with a badge.

“The problem,” Frank says, “is that there is no Pinocchio response to lying—there is no unique behavior that indicates a lie. Everything that co-occurs with a lie has been found to occur with other things. … What this technology is detecting at its core is not a lie. It reflects the underlying emotion or effortful thinking. What’s really happening is that we’re learning how to read people, how to detect ‘hot spots’ rather than lies. That can then be a pointer to further questions and areas of interest. It makes for more effective questioning.”

 

Learning from each other

Both sides say working at the intersection of behavioral theory and computer analysis has been a fruitful way to collaborate. “Cross-fertilization is important,” Frank says. “The big accomplishment is pairing two very diverse fields. A lot of computer science is done in a behavioral vacuum, but knowing where to look matters.”

Adds Govindaraju: “Computers are good at looking at lots and lots of data and analyzing it, so there’s this notion of discovering new things. We might discover nuggets of knowledge, and we can go back to the behavioral scientists and say, does this piece make sense and does this kind of correlation between the verbal behavior and the facial expression make sense?”

The study’s success has drawn international attention. Scientific American came calling, as did the BBC. The researchers presented their findings at the 2011 IEEE Conference on Automatic Face and Gesture Recognition and published their paper, “Lie to Me: Deceit Detection Via Online Behavioral Learning,” in the proceedings of that conference.

 

Widening the scope

The scientists are now looking at the full set of videotapes from Frank’s terrorism study and exploring the idea of expanding their algorithm to examine other facial cues: a scrunched forehead, a twitch of the lips, raised eyebrows. If the eyes are a window into the soul of a liar, might these other cues produce an even more accurate indicator? Nwogu also notes that for the algorithm to be fully useful in situations like police interrogations, it would have to produce its results instantly—something that would require further programming work.

And while no one is ready to roll this technique out as a commercial application just yet, Frank says machine analysis could be used outside of law enforcement as well. For example, he says, it might help persons with autism spectrum disorders, like Asperger’s syndrome, to recognize the social cues that make for smooth human interaction, or to discover early indications of schizophrenia.

Facial recognition technology “has the advantage of the human element,” Govindaraju says. “Faces are in public view, and most people can tell whether the facial images of two people are the same or different.” That’s not true, he says, for the other two major biometric instruments, fingerprint recognition and iris recognition.

And as Nwogu says, “When you have someone who really knows how to hide their emotions, it would be useful to have a detector that might help law enforcement deal with the occasional expert liars they face. … These changes in the face happen in a fraction of a second. We don’t want to miss them.”

Riley Mackenzie is a Buffalo freelance writer.
