The Deepfake Porn Crisis Is Here
These laws, however common sense they may feel, will still face an uphill battle. In May the Elon Musk–owned platform X sued the state of Minnesota over its law banning the creation of deepfakes to influence an election, arguing that the law violated free speech. In August, Musk won a similar lawsuit against the state of California over its deepfake ban.
For Guistolise, this event has caused immeasurable pain. She’s lost trust in others. She’s afraid of how this may affect her future and her career. However, there is a “next” for her. She gets to go on being the sister and friend she’s always been. She’s excited to go to work tomorrow. She’s training her pitbull puppy, whom she appropriately named Olivia Benson, to give a high five. And despite it all, “I love humans,” she says, before pausing. “I guess I still do.”
Some Practical Steps to Take If You’re a Victim of Deepfakes
It’s impossible to measure the toll and reach of deepfakes, as apps allow users to create them in mere moments. However, according to the experts we spoke to for this piece, there are practical steps to take if you find out you’re a victim.
Call a loved one.
“I recommend that somebody calls a friend to help them with this process,” Martone says, noting that it can be difficult for people to view the images over and over alone.
If you don’t feel comfortable talking with a friend or loved one, you can call RAINN’s National Sexual Assault Hotline at 800-656-HOPE (4673), or chat with RAINN online.
If you do want to move forward with any potential legal action, Martone recommends having a friend or loved one document every instance of the deepfake so the evidence is easily available for lawyers or the police. (The one caveat is if the victim is underage: Do not screenshot any content. Instead, take your phone directly to your local police station so officers can catalog it.)
Alert the platform.
The next step is to alert any social platform where the deepfake may have been distributed. On Instagram, click on the image, then Report, then False Information, then Digitally Altered or Created.
Tell your school or work.
If you’re comfortable telling work or school, or think your deepfake may spread to these places, it’s a good idea to speak to either HR or an administrator (especially if you’re the parent of an underage child).
Deepen your media literacy.
Not all AI is bad. Take BitMind, for example, a company that specializes in “detecting AI-generated and synthetic media.” Its founder, Ken Jon Miyachi, tells Glamour that the program can identify both synthetic and semisynthetic media that may be hard for humans to detect, which can help people understand whether what they are seeing is real. Here are more expert tips on spotting misinformation.
Talk about it.
Meghan Cutter, the chief of victim services at RAINN, isn’t just looking to policy makers for change, but to all of us as well. “How do we as a society have conversations about sexual violence and different forms of sexual violence, and how we can make communities safer places for survivors to speak up and ask for help and identify that they might need support?” she says. “[We want to] create that awareness, so that when this does happen, people know this isn’t okay, and there’s something that I can do.”
Know that this isn’t your fault, and that it is violence.
As Cutter says, “Just because someone hasn’t actually physically touched you, or maybe the image is your face, but not your body, that doesn’t mean this isn’t a form of violence. This is a form of assault. It is a form of sexual violence. I think it’s really important to be explicit about that, to help survivors understand that there are options available to them and to have words for their experience.”