Artificial intelligence-enabled deep fakes are typically associated with fake viral images of well-known personalities, such as Pope Francis in a puffer coat or Donald Trump under arrest. But experts say they are more widely used to generate non-consensual pornography that can destroy ordinary lives.
Women are a particular target of AI tools and apps that are widely available for free, require no technical expertise, and allow users to digitally strip clothing from their pictures or insert their faces into sexually explicit videos.
“The rise of AI-generated porn and deep fake porn normalizes the use of a woman’s image or likeness without her consent,” Sophie Maddocks, a researcher at the University of Pennsylvania tracking image-based sexual abuse, told AFP.
In a tearful video, an American Twitch streamer who goes by QTCinderella lamented the “constant exploitation and objectification” of women after she became the victim of deep fake porn. She added that she was harassed by people sending her copies of the deep fakes depicting her, according to the “Ahram Online” website.
The scandal erupted in January during a live stream by fellow streamer Brandon Ewing, who was caught looking at a website that contained deep-faked sexual images of several women, including QTCinderella.
The proliferation of online deep fakes underscores the threat of AI-enabled disinformation, which can damage reputations and lead to bullying or harassment.
While celebrities such as singer Taylor Swift and actress Emma Watson have been victims of deep fake porn, women not in the public eye are also targeted.
American and European media are filled with first-hand testimonies of women — from academics to activists — who were shocked to discover their faces in deep fake porn.
Some 96% of deep fake videos online are non-consensual pornography, and most depict women, according to a 2019 study by the Dutch AI company Sensity.
Among a new crop of text-to-art generators are free apps that can create “hyper-real AI girls”: avatars built from real photos and customized with prompts such as “dark skin” and “thigh strap.”
New technologies such as Stable Diffusion, an open-source AI model developed by Stability AI, have made it possible to conjure realistic images from text descriptions.
The tech advancements have given rise to what Will Duffield, a policy analyst at the Cato Institute, called an “expanding cottage industry” around AI-enhanced porn, with many deep fake creators taking paid requests to generate content featuring a person of the customer’s choice.
Last month, the FBI warned about “sextortion schemes,” in which fraudsters capture photos and videos from social media to create “sexually themed” deep fakes to extort money.
The victims, the FBI added, included minor children and non-consenting adults.