Deepfake porn videos banned by Reddit, Twitter, Pornhub

By Lisa Vaas, Sophos

February 7, 2018

The home of deepfakes, videos with celebrities' or civilians' faces stitched in via artificial intelligence (AI), has kicked them out.

After keeping mum while the issue exploded over the past few weeks, on Wednesday Reddit banned deepfakes.

Or, rather, Reddit clarified that its existing ban on involuntary pornography, which features people whose permission has neither been sought nor granted, covers faked videos.

Up until Wednesday, Reddit had a single policy covering two kinds of rule-breaking content: involuntary pornography and sexual or suggestive content involving minors. It's split that single policy into two distinct policies, letting the involuntary pornography ban stand on its own and adding language that explicitly covers deepfakes:

Reddit prohibits the dissemination of images or video depicting any person in a state of nudity or engaged in any act of sexual conduct apparently created or posted without their permission, including depictions that have been faked.

The platforms that put up with the involuntary induction of people into becoming porn stars are steadily dwindling in number: both Twitter and the giant pornography site Pornhub have also banned deepfakes, likewise calling them nonconsensual porn that violates their terms of service (TOS).

From an email statement a Pornhub spokesperson sent to Motherboard:

We do not tolerate any nonconsensual content on the site and we remove all said content as soon as we are made aware of it. Nonconsensual content directly violates our TOS and consists of content such as revenge porn, deepfakes or anything published without a person's consent or permission.

Pornhub previously told Mashable that it's already taken down deepfakes flagged by users.

Corey Price, Pornhub's vice president, told Mashable that users have started to flag deepfakes and that the platform is taking them down as soon as it encounters the flags. Price encouraged anyone who finds nonconsensual porn to visit Pornhub's content removal page to lodge an official takedown request.

As for Twitter, a spokesperson said that the platform isn't just banning deepfakes; it's also going to suspend user accounts of those identified as original posters of nonconsensual porn or accounts dedicated to posting the content.

You may not post or share intimate photos or videos of someone that were produced or distributed without their consent.

We will suspend any account we identify as the original poster of intimate media that has been produced or distributed without the subject's consent. We will also suspend any account dedicated to posting this type of content.

Reddit, Twitter and Pornhub join earlier deepfake bans by chat service Discord and Gfycat, a former favorite for posting deepfakes from the Reddit crowd. Reddit is where the phenomenon first gained steam. Make that a LOT of steam: the r/deepfakes subreddit had over 90,000 subscribers as of Wednesday morning before it was taken down.

It's not only celebrities who can be cast as porn stars, of course, and it's not just porn that's being fabricated. We've seen deepfakes that cast Hitler as Argentine President Mauricio Macri, plus plenty of deepfakes featuring President Trump's face on Hillary Clinton's head.

As far as the legality of deepfakes goes, Charles Duan, associate director of tech and innovation policy at the R Street Institute think tank, told Motherboard that the videos infringe on the copyrights of both the porn performers whose bodies are used in the underlying videos and the celebrities whose faces, taken from interviews or copyrighted publicity photos, are glued onto those bodies. These groups could seek protection by filing a Digital Millennium Copyright Act (DMCA) report to initiate a takedown notice.

But he says itís the video makers whose work is being appropriated, not the celebrities, who would have the solid copyright claim:

The makers of the video would have a perfectly legitimate copyright property case against the uploader, and they would be able to take advantage of the DMCA to have the website take the vid down. Chances are, a similar practice would work as well for these sorts of videos… Not on behalf of the victim [whose face is being used] but the maker of the video. Which is the weird thing about that whole situation.

Reddit said that this is about making Reddit a "more welcoming environment for all users."

Earlier on Wednesday, before the community was taken down, r/deepfakes moderators had informed subscribers that deepfakes featuring the faces of minors would be banned without warning and that the posters would be reported to admins. The subreddit's rules had also included a ban on using the faces of non-celebrities.

Good! But too little, too late. Given that there was already a user-friendly app released that would generate deepfakes with a few button presses, the subreddit's rules didn't mean that non-celebrities' and children's faces wouldn't be used to generate the videos. They just wouldn't be tolerated on r/deepfakes… which itself is no longer tolerated on Reddit.

…or on Discord, Twitter, Pornhub, Gfycat, Imgur (which is also known to remove deepfakes, according to what was a running list on r/deepfakes), and… well, there will likely be more deepfake-banning services by the time this article is posted, at the rate this is going.

Is the genie back in the bottle? Doubtful! R/deepfakes members had already been discussing taking the whole kit and caboodle off Reddit et al. and setting up a distributed platform where they could happily AI-stitch away.

Reddit's move is likely just a bump in the road. This technology isn't going anywhere: it's here to stay. And that's not necessarily a bad thing.

The main problem here hasn't been the technology; it's the fact that the video creators are using it to produce explicit content without the permission of those involved.
