Law Can’t Help When People Put Your Face on Porn

But people can use machine learning and AI to swap faces onto porn performers.

The earlier boys are exposed to pornography, the more likely they are to develop the misogynistic belief that women should be controlled by men, according to a new study presented at the American Psychological Association. (Getty)

There is a disturbing new trend sweeping the internet right now: using machine learning and AI to swap celebrities’ faces onto porn performers in X-rated footage. These fake celebrity porn edits are so seamless they could easily be mistaken for the real thing. According to Wired, early victims include Daisy Ridley, Gal Gadot, Scarlett Johansson and Taylor Swift. The trend has been brewing for a few months and even has its own subreddit. Someone has now created an app that drastically lowers the technical threshold would-be creators have to clear, so the practice is about to become much more prevalent.

The worst part of all of this? If you live in the United States and someone does this with your face, there is no law to protect you. Mary Anne Franks, who teaches First Amendment and technology law at the University of Miami Law School, told Wired, “I share your sense of doom. I think it is going to be that bad.” Franks wrote much of the existing U.S. legislation that criminalizes nonconsensual porn, and unfortunately, it won’t help here. The premise of that legislation is that nonconsensual porn is a privacy violation. But even though this kind of face swapping would be deeply, personally humiliating for anyone it happens to, you can’t sue someone for exposing the intimate details of your life “when it’s not your life they’re exposing.”
