Stolen Instagram pics used in deepfake AI porn: What to know
LOS ANGELES - Southern California victims of deepfake pornography are warning that your social media photos can be stolen and manipulated to produce explicit, non-consensual content.
"I was in shock," said Regina Knoll. "How could people do this to someone they don't even know?"
Knoll has been victimized by deepfake pornography, as has Uldouz Wallace. Both women live in Los Angeles' San Fernando Valley.
"When I first found out, it was devastating," Wallace said.
An online perpetrator stole their images and, with the help of artificial intelligence, turned them into explicit photos. In Wallace's case, her images were also turned into pornographic videos.
"Anyone can just take your picture from social media and use any of these apps that are really popular right now and upload pornographic materials and add your face to it, so it could literally be anyone," Wallace said.
When Wallace searched online for deepfakes using her likeness, she was floored by how much material was out there. Just a few days ago, she found 2,000 links.
"Who has that much time to create all this non-consensual content unless they're profiting from it," laments Wallace.
Knoll is a model, and Wallace is an actress with 4.9 million followers on Instagram. For both women, the fallout from the deepfake material exploding online has been devastating.
"All of the brands dropped me that I worked with, all of the sponsorships, agents, managers, all of the people that were influencers and actresses that I collaborated with, they were doing everything to basically push me out of the way," Wallace explained.
Knoll lost jobs, too. "I feel like only victims know how really truly terrifying it is and how life-altering it is," she said.
The typical reaction for most victims is to hide in shame, which both women did at first.
"I thought about it, and I was like, you know, the worst has already happened. Everybody already called me every name in the book and thought the worst of me," said Wallace. Shortly after that pivotal moment, she created the non-profit Foundation RA to help other victims in similar situations.
There is currently no federal law specifically targeting AI-generated deepfake content, but Wallace hopes to change that.
"I helped start a bill called "The Protect Act," and the Protect Act will basically hold our online platforms accountable, and it will verify the age and consent of everyone depicted in the content." Senator Mike Lee of Utah first introduced The Protect Act in 2022.
Both women share their stories to encourage others to do the same. Wallace urges victims to speak up.
"Break the silence, share your story, share it with someone, remove the shame from it, and take back your power," Wallace said.