Every day we recognize friends, family, and co-workers from afar — even before we can distinctly see a face. New research reveals that when facial features are difficult to make out, we readily use information about someone’s body to identify them — even when we don’t know we’re doing so.
“Psychologists and computer scientists have concentrated almost exclusively on the role of the face in person recognition,” explains lead researcher Allyson Rice of the University of Texas at Dallas. “But our results show that the body can also provide important and useful identity information for person recognition.”
The new findings are published in Psychological Science, a journal of the Association for Psychological Science.
In several experiments, the researchers had college-age participants look at images of two people side by side and identify whether the images showed the same person or different people. Some of the image pairs looked very similar even though they actually showed different people, while other image pairs looked very different even though the pictures showed the same person.
The image pairs were selected, based on how a computer face recognition system performed with them, so that the information provided by the subjects’ faces was ambiguous and not very helpful for determining identity. In several of the experiments, the researchers also edited the pictures, omitting the subjects’ bodies or faces, to determine which features were most important for successful identification.
Overall, participants were able to accurately discern whether the images showed the same person when they saw the complete images. And they were just as accurate with image pairs in which the faces were blocked out and only the bodies were shown.
But accuracy dropped off when participants saw images that included the subjects’ faces but not their bodies. Seeing the subjects’ bodies, it seemed, was what boosted participants’ ability to identify them.
Participants reported afterward that they had relied on the nose, face shape, ears, mouth, and eyes to make their identifications, even though their performance suggested otherwise.
“This left us with a paradox,” the researchers write. “The recognition data clearly indicated the use of body information for identification. However, the subjective ratings suggested that participants were unaware of how important the body was in their decision.”
To unravel the paradox, the researchers used eye-tracking equipment to determine where participants were actually looking while identifying the images, and the resulting data were clear: Participants spent relatively more time looking at the body when the face did not provide enough information to identify the subjects.
“Eye movements revealed a highly efficient and adaptive strategy for finding the most useful identity information in any given image of a person,” Rice explains.
By showing that humans don’t just rely on faces to identify others, this study opens up new avenues for developing and refining computer-based recognition systems.
“These systems are currently used in law enforcement and security settings, but they sometimes fail when the quality of the facial image is poor,” says Rice.
Rice, a recent graduate in psychology and criminology, conducted this new research with Prof. Alice O’Toole at The University of Texas at Dallas. The UT Dallas research team has been collaborating with researcher Jonathon Phillips and others at the National Institute of Standards and Technology (NIST) to examine how humans and computers stack up when it comes to recognizing faces and people. Every couple of years, NIST organizes an international competition for computer-based face recognition systems, introducing increasingly difficult problems.
“We will continue comparing humans to machines on these new challenges,” says O’Toole. “By looking at the way computer face recognition systems work, we often learn new and surprising things about the way humans recognize other people.”