Heinous abuse enabled by AI continues as generative tools and websites that create and host deepfake pornography thrive amid a lack of legislative oversight in the U.S. and only piecemeal action from the platforms enabling these sites. In 2023, deepfake pornography made up 98% of all deepfake content online. Women make up 99% of its victims, and 74% of surveyed men, who together drove 34,836,914 monthly visits to the most popular deepfake pornography websites, said they felt no guilt about their actions.
While most of these individuals consider themselves blameless, the victims of deepfake pornography are speaking openly about the horrific trauma such abuse brings. In a recent interview, U.S. Representative Alexandria Ocasio-Cortez spoke out about her encounter with a synthetic pornographic image of herself on the social platform X, as well as the shock and resurfacing of past traumas the image caused. Stressing that deepfake pornography and its consequences for survivors are far from imaginary, Ocasio-Cortez stated that deepfakes “parallel the same exact intention of physical rape and sexual assault… Deepfakes are absolutely a way of digitizing violent humiliation against other people.”
Such humiliation and trauma are becoming commonplace as misogyny-forward spaces run amok, enabling and encouraging the spread of malicious pornographic deepfakes. Their owners and users remain unpunished and unopposed. The onslaught of deepfake pornography headlines dominates the news cycle, each story a fresh reminder of the devastating impact on victims, but we have yet to see a robust response. Interest in legislating against deepfakes among lawmakers spiked when disturbing pornographic images of Taylor Swift flooded social media. The halls of Congress are less stirred when reports of deepfake pornography concern women and girls without massive public platforms, like the students of MacArthur High School, who had no other recourse but to investigate their cases of deepfake abuse on their own.
Many of the women and girls targeted by these websites are searching for ways to fight back and reclaim control over their likenesses and lives. Over half of the people targeted by image-based abuse contemplate suicide. So far, legislators and social platforms have not done enough to help them. Despite increased efforts from some states, most of the U.S. still has no effective deepfake pornography law on the books, leaving women no way to seek meaningful help from the authorities. Deepfakes are a bipartisan matter that affects people regardless of political affiliation, and they must be tackled as such, with non-negotiable and immediate cooperation from members of government.
What Can Be Done
Some experts have advised women to avoid posting images of themselves on social media, advice that follows the long-held tradition of placing the burden on the abused rather than the abusers. With a single image of a person’s face, malicious actors can create a sixty-second pornographic video of the victim in less than thirty minutes. It is not only unreasonable but outrageous to expect victims to completely scrub their online identities, especially when life-ruining content can be created from one photo. Besides, this solution does nothing for women in the public sphere, who have even less control over the distribution of their likenesses.
Much like legislators, tech companies have been slow to act on deepfake pornography. As of March, more than 13,000 copyright complaints, encompassing almost 30,000 URLs, have been filed with Google under the Digital Millennium Copyright Act concerning content on a dozen of the most popular deepfake websites. While Google removes much of the reported content, experts question why search engines haven’t taken stronger measures against websites that openly exist to distribute non-consensual deepfake pornography, or haven’t removed them from the search ecosystem entirely. By continuing to drive traffic to these sites, companies are enabling a booming industry built around the most flagrant violations of consent.
The path toward obtaining justice for victims of deepfake pornography is clear. Legislators on both sides of the aisle must enact strict laws that punish the creation and distribution of non-consensual deepfake pornography, deterring those who engage in this vile behavior. Companies must accept responsibility for driving traffic to distributors and continue to find more powerful ways to scrub such content from online spaces. Digital platforms must integrate strong deepfake detection measures into their workflows so that harmful content can be flagged, labeled, and removed before it ruins the lives of countless women and girls.
We owe it to the most exploited members of our society to take every conceivable measure to effectively counter the epidemic of deepfake pornography before it grows beyond our reach.