What Deepfake Technology Means for Women
[music]
Melissa Harris-Perry: You're listening to The Takeaway. I'm Melissa Harris-Perry. There are many reasons to love web-based innovation. Platforms like Zoom kept us connected in a pandemic. iPhoto lets us instantly share our pics with far-flung family. With Twitter, we find communities to express ourselves, sell products, and even watch sports from across the miles. One of my favorites is JibJab. JibJab is an app that allows you to upload a picture of yourself onto the body of a dancing elf or an adorable puppy. You can even put your head on the body of an animated James Brown.
[music]
Just point, click, hit send, and within moments, your best friend is rocking out to a birthday card featuring a mashup of you and the Godfather of Soul.
[music]
Easy face-swapping technology is great when it's silly, consensual, animated fun, but as many people know, there's a more sinister side: deepfakes. Deepfakes first emerged within pornography, and most remain pornographic in nature. Deepfakes use AI technology to take the faces of unsuspecting, non-consenting women and put them onto the bodies of people in pornographic videos.
These are not adorable holiday animations, and typically, it's not even obvious that they're fake. Up until now, deepfakes required significant coding skills, but a new app makes it possible for almost anyone to point, click, hit send, and within moments, your boss is watching what appears to be a genuine video of you engaged in explicit sexual acts. For more on this, we spoke with Rebecca Delfino, Law Professor at LMU Loyola Law School in Los Angeles, and Karen Hao, the Senior AI Editor at MIT Technology Review. Rebecca started by explaining a bit more about the complicated world of deepfake technology.
Rebecca Delfino: Deepfakes as we know them now emerged on the internet around the end of 2017, and it started on Reddit, through a Reddit user who developed the concept of mapping one person's face onto the body of another person. It started in the context of pornography.
Melissa Harris-Perry: It actually began with pornography.
Rebecca Delfino: It did. The individual who coined the term deepfake on Reddit was taking images of the faces of famous actresses and placing them on the bodies of women engaged in intimate and pornographic acts, and then posting them on the Reddit site that they had created.
Melissa Harris-Perry: Karen, I'm assuming that this is pretty clearly non-consensual porn?
Karen Hao: Absolutely. Consensual porn is when the person being portrayed in the pornography actually willingly participates and even knows that it's happening. In this particular case, the Reddit user absolutely did not in any way ask for the consent of these female celebrities, and many of the female celebrities felt quite traumatized when they discovered that their image was being manipulated in this way.
Melissa Harris-Perry: Karen, I think of myself not as a complete Luddite but as having a few tech skills. I can cut a little sound for a radio show here and there, but I don't think that I'd be able to make something like this that didn't look obviously fake. Has the technology evolved to a point where even someone who is a novice can create these deepfakes?
Karen Hao: Unfortunately, yes. Back in 2017, it was still a quite complicated process. There were essentially AI algorithms that were open-sourced online, and you needed to understand how to code in order to use them to do this kind of face swap and impose these women's faces onto porn stars' bodies. What we've seen over the last four years is that the technology has gotten easier and easier from a user experience perspective, packaged into consumer-type apps where you literally just have to push a button to do the same face-swapping.
Unfortunately, I just recently wrote an article about the latest service in this vein, which does exactly this. You upload a single photo, you push a couple of buttons, and immediately the image of the woman in the photo that you uploaded is transposed onto a porn star.
Melissa Harris-Perry: Rebecca, you said that deepfake technology emerged in the context of non-consensual porn, but is it pervasive? If someone is using a porn site, how likely is it that they might encounter something which is actually a deepfake?
Rebecca Delfino: It's actually quite likely now. The prevalence of deepfakes in the pornographic space has exploded. In fact, in the last two years, entities that have studied and tracked deepfakes have estimated that 96% of the deepfake videos online are pornographic and that 90% of them depict women. Just to give you a sense of those numbers, in July of 2019, there were about 15,000 deepfakes that involved pornography. A year later, there were 50,000, and the entities that are tracking this predict that the growth will continue by a factor of two every six months. It is a virtual explosion of deepfake pornographic videos on the internet.
Melissa Harris-Perry: All right, Karen, why should we care? It's not really these women, it's not really their bodies. If your face is out there as a celebrity, it could be out there in all kinds of different images. Why does it even matter?
Karen Hao: Right. There are some very severe consequences that come out of this kind of deepfake pornographic abuse. One is the psychological impact it has on these women. It's very traumatizing to see your image represented in such a different way from how you see yourself, and it exists on the internet basically forever. It ends up having economic and interpersonal impacts, the same as real revenge porn might. Every time you go to a job interview, you might have to confront and answer for this pornographic image of you that you never even knew existed.
Every time you go on a date, you have to answer to the person that you're dating. There have already been multiple cases of women who have been harassed with rape and death threats because the public doesn't know the videos aren't real, or who have lost their jobs. There was a teacher who lost her job after the school discovered these images. They thought the images were real, and even if they weren't real, the school thought it was inappropriate for a teacher with that kind of imagery online to be teaching students.
The repercussions are extremely real. This is a story that I heard from a UK government-funded revenge porn hotline, a nonprofit organization that helps women who are the victims of revenge porn. What they've seen over the last couple of years, and particularly during the pandemic, is an uptick in faked pornography, and specifically deepfaked revenge porn. They heard from this teacher, who called into the service saying, "I have no idea where these images came from or who's making these of me."
Many people don't even know that deepfakes exist. According to her, the school just didn't understand when she said, "This isn't me. This is fake." That concept is really hard to wrap your head around because the images tend to be so realistic now that it looks like it is you. I think that cognitive dissonance caused the school to be extremely concerned about continuing to employ that teacher. They were probably wondering what the kids might think if they discovered this, what parents might think if they discovered this, and they just didn't want to have to deal with the ensuing confusion.
Melissa Harris-Perry: Rebecca, what kind of legal recourse would someone have in a situation like that, whether it's a teacher or a celebrity, or even if someone just pulls your image off of your own social media site, like your Facebook page? What can you do about it?
Rebecca Delfino: Well, unfortunately, right now, in many places, there's very little that can be done aside from contacting the platforms and asking them to take the image down. As Karen suggested, even if you can get a media platform to take it down, it is still out there, and it can be copied and disseminated, so it's difficult for people to get a handle on it once it's been released.
In terms of the legality of it, currently, in the United States, we don't have a national criminal law that speaks to this particular problem. In fact, I wrote an article that came out in 2019 discussing the need for one. Some work has been done and some legislation has been proposed at the federal level, but it has not really emerged out of the various committees in Congress. That means folks have to look to the states where they live to see if there is recourse for them.
Currently, there's only one state in the nation, Virginia, that has a law criminalizing the distribution and creation of deepfakes. Virginia is one of the 46 states that actually has a revenge porn crime, and what they did is amend its definition to include these simulated deepfake pornographic images. Now, if you live in Virginia, of course, you can complain and perhaps obtain a prosecution.
Melissa Harris-Perry: Given that you're saying Virginia, which is not typically the most progressive or out in front on these kinds of things, I'm wondering if that's about politicians. Is that basically about DC folks wanting to have recourse if this happens to them?
Rebecca Delfino: That's an interesting question. I don't know the answer, but often, when it comes to laws protecting women and girls, politicians are motivated by something that is personal to them. I suspect that in Virginia, the motivation might have been that someone brought it to the attention of the individual legislators who proposed it and said, "Hey, this has happened to someone that we know." That tends to be where these laws get traction and attention. In addition, there may be some truth to what you suggested, that the political class close to DC is also concerned about the impacts of these crimes.
Melissa Harris-Perry: Karen, we've talked about this using the shorthand of women, and this happens to women. Has this mostly, or even exclusively, happened to women? Are there men who are also victimized by this, and, I almost hate to ask, children?
Karen Hao: Rebecca mentioned the stat earlier that 90% of non-consensual deepfake porn is made of women. Probably when they say women, they really mean females. Perhaps some might be 17, 16, 15. We don't know. There have certainly been known instances of this kind of technology being used on underage girls. We don't know the extent because it's hard; many of these victims are not known by name. It's just image forensics, looking at the images and trying to figure out how young they look.
It does, on rare occasions, also happen to men. The service that I discovered and reported on earlier this week also allows for face-swapping of men onto gay porn. If you think about the fact that being homosexual is criminalized in certain countries, in some by the death penalty, that could also have severe consequences for men. I do think that, by far, the brunt of the consequences is really felt by women and some underage girls.
Melissa Harris-Perry: Rebecca, I'm wondering if there are other kinds of policies at either the state or federal level, maybe mapping that Virginia policy, if that's sufficient, or are there other ways that federal and state policy can address this?
Rebecca Delfino: I think there are a number of things that can be done, and a lot of it is going to involve inviting in the private sector and the social media companies, those that control the space in which we live on the internet, with policies that encourage or incentivize them to police their sites and respond to takedown requests more swiftly. I know that Facebook and Google are working on developing greater responsiveness to these issues, but encouragement at the federal or state level would help as well. I know in the last year there has been legislation targeting sites such as Pornhub, making them responsible if they are posting certain pornographic material.
Melissa Harris-Perry: Rebecca Delfino is a Law Professor at LMU Loyola Law School in Los Angeles and Karen Hao is the senior AI Editor at MIT Technology Review. Thank you both for being here.
Karen Hao: Thank you, Melissa.
Rebecca Delfino: Thank you for having me.
Copyright © 2021 New York Public Radio. All rights reserved. Visit our website terms of use at www.wnyc.org for further information.
New York Public Radio transcripts are created on a rush deadline, often by contractors. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of New York Public Radio’s programming is the audio record.