AI could easily make fake adult films by swapping faces with porn stars

The face of actress Natalie Portman has been used in deepfakes (Image: Reddit)

Computer scientists and experts in the field of machine learning broadly agree that artificial intelligence could be used against humanity in the years to come if it is not strictly and properly regulated. Unbeknownst to many, however, the darker side of AI is already at play in the form of a software program.

Recently, an AI-powered tool called FakeApp has been making the rounds on the web, enabling anyone, even the most ordinary person with no photo or video editing skills, to create a fake porn video. Even more alarming, the app allows users to seamlessly overlay a random person's face onto the body of a porn performer in the act.


A Reddit group named Deepfakes, with nearly 50,000 subscribers, has caught the attention of netizens as it has become ground zero for discussing and sharing fabricated adult films in which the faces of Emma Watson, Taylor Swift, Margot Robbie, Natalie Portman and many other celebrities are superimposed onto porn actors' bodies. The term "deepfakes" was coined after a Redditor of the same pseudonym started sharing highly convincing face-swapped porn videos in 2017.

Using FakeApp, a free desktop tool for creating deepfakes, users can replace the faces of porn actors with anyone's face, often without an obvious hint that the footage has been manipulated.

Deepfakes chat room shut down

Apart from Reddit, thousands of active users on Discord have been doing something similar, sharing tips and stories about how they faked obscene clips of ex-girlfriends, high school classmates and random people using photos publicly available on social media platforms such as Facebook and Instagram.

While the Deepfakes community on Reddit remains alive, the one on Discord has already been taken down by the platform after Motherboard ran a story exposing the activity.


Meanwhile, Discord told Business Insider: "Non-consensual pornography warrants an instant shut down on the servers whenever we identify it, as well as permanent ban on the users. We have investigated these servers and shut them down immediately."

How deepfakes are made

How do face-swapped porn videos come to life? A deepfakes video maker gathers enough images of an unsuspecting victim and then looks for a porn actress whose body matches the victim's face. Web-based tools like Porn World Doppelganger, FindPornFace and Porn Star by Face make that process much easier: the video maker simply uploads a sample photo, and the tools suggest the most closely matching porn star.

Once these tools drop a name, the video maker can find and download that performer's videos from adult websites, then turn to FakeApp to make the magic happen.

Regulations

In the meantime, it remains unclear how this activity can be addressed or regulated. It is not clearly covered by revenge-porn or defamation laws, and there are no known criminal statutes governing the use of fabricated images in porn videos.

At the moment, the responsibility rests heavily on platforms like Reddit and Discord to police this behaviour among deepfakes video makers. Reddit has not yet responded to media queries.

This article was first published on January 29, 2018