New research provides evidence that psychopathic personality traits are associated with the creation and dissemination of deepfake pornography. The findings have been published in the journal Computers in Human Behavior.
Deepfakes, a portmanteau of “deep learning” and “fake,” are realistic fabricated images, audio, and videos generated using artificial intelligence software. A deepfake video of former President Barack Obama using an expletive to describe President Donald Trump went viral in 2018. More troublingly, the technology has been used to create fake pornographic content depicting real, non-consenting individuals.
“Although practitioners and lawmakers within the United Kingdom have recently made great strides to make behaviors associated with image-based sexual abuse illegal (e.g., ‘revenge pornography’ and upskirting), there remains more to be done for equally impactful and damaging behaviors, such as deepfake sexual media production,” explained study author Dean Fido, a senior lecturer in forensic psychology at the University of Derby.
“Deepfaking, in this instance, refers to the production of sexually explicit images using artificial intelligence to transpose one image onto a secondary source to give the impression that somebody is engaging in sexual behavior. Research in this area is key due to both the frequency that such images and videos are created, and the social (e.g., reputation), professional (e.g., termination of employment), and physical and mental health consequences (e.g., poor mental wellbeing and potentially suicide and self-harm) of becoming a victim of deepfaking.”
For their study, the researchers randomly assigned 290 U.K. participants to read one of four vignettes describing a deepfake incident in which an individual had created and shared a fake sexualized image of another person after being unable to engage them in a physical relationship. The victim was described as either a man or woman and as either a celebrity or an ordinary individual.
The researchers found that participants who scored higher on a measure of psychopathy were more likely to blame the victim for the incident, less likely to view the situation as harmful, and less likely to believe the incident was a criminal matter. More psychopathic individuals were also more willing to create and disseminate deepfakes themselves. Those who scored higher on a measure of belief in a just world were likewise more likely to blame the victim and more willing to disseminate deepfakes.
“Personality traits associated with the support of deviant online behavior and sexual offending more broadly, for example psychopathy, predicted more lenient judgements of offending as well as a greater proclivity to create and share images – suggesting that through further research, we might be able to better predict people who are more likely to engage in such behavior and intervene accordingly,” Fido explained.
The researchers also found that deepfake pornography depicting female victims was associated with greater perceptions of harm and criminality than cases with male victims. Women tended to view the deepfake incidents as more harmful and criminal than did men. Men tended to view deepfake pornography cases involving celebrity victims as being less harmful and less criminal than cases involving non-celebrity victims. Women, in contrast, did not differentiate between celebrity and non-celebrity victims.
In a second study, the researchers replicated their findings in a sample of 364 U.K. participants. They also sought to examine whether deepfake images solely generated for personal use would be perceived differently than images that were shared. For both celebrity and non-celebrity victims, participants perceived greater victim harm when the images were shared compared to when the images were only created for personal use.
“We consistently observed that more lenient judgements of scenarios using deepfaking involved victims who were celebrities and male, and when images were created for self-sexual gratification rather than being shared – suggesting that situational elements and victim demographics impact how they will be seen by the general population,” Fido told PsyPost.
The researchers also found that participants who were attracted to women (a group consisting mostly of heterosexual men, along with some homosexual women) reported being less willing to share deepfake images.
“What is also interesting is a tendency for men to not express a willingness to share deepfaked material,” explained co-author Craig Harper, a senior lecturer at Nottingham Trent University. “For us, this seems to indicate that men who would be willing to create deepfake pornography would do so not to embarrass or humiliate any potential victim, but instead would primarily use this material for their own sexual satisfaction. With this private use in mind, it is quite possible that some victims of deepfaking might never know that their images have been used in such a way.”
As with any study, the new research includes some limitations.
“There are two major caveats to this research,” Fido explained. “The first is that it sampled entirely participants from the United Kingdom. Although this was designed to ensure consistency of context, broader values, and legislation, it is important to replicate these findings internationally and to explore potential cultural elements which might contribute to variation in results.”
“Second, although we are confident in the contextual accuracy of the stories that we presented to our participants, these were not informed by individuals with lived experience of perpetrating or being victims of deepfaking. Future research would benefit from this consultancy to better inform our research strategies.”
Deepfakes are a relatively new phenomenon, and the researchers noted that there is still much to learn about the psychological variables involved.
“To date, there is extremely limited research into predictors of engaging in deepfake media production and dissemination,” Fido said. “Although we were able to generate data related to the hypothetical generation of such images, we are yet to understand exactly why somebody would want to generate and/or disseminate such media. Developing this knowledge is an important future step, which will also feed into advisory information and intervention programs to prevent deepfaking. As such, we welcome contact from victims, policy makers, and academic/industrial collaborators alike to better our understanding in this area.”
“There is still so much that we don’t know about this kind of behavior,” Harper added. “In particular, we have not been able to capture the victim’s voice in this work, and this is something that we’re keen to do in future work to both improve the quality of our ability to detect people at risk of engaging in deepfaking, but also to tell the stories of those who have lived through their images being used without their consent.”
The study, “Celebrity status, sex, and variation in psychopathy predicts judgements of and proclivity to generate and distribute deepfake pornography,” was authored by Dean Fido, Jaya Rao, and Craig A. Harper.