Filipino actress Maine Mendoza becomes victim of "deepfake" pornography; threatens to take legal action against those spreading the clip

Filipino actress Maine Mendoza has denied rumours of her involvement in a scandalous video that has been making the rounds online. Through her agency, she has threatened to take legal action against those spreading the clip on the internet and damaging her reputation.

Maine Mendoza. Photo: Maine Mendoza Instagram

The 25-year-old actress and TV personality was shocked when a netizen brought the video to her attention on Twitter. Responding to the post, she wrote, "Wtf???? Sobrang kamukha ko kinilabutan ako pero hindi ako to (Wtf? She looks so much like me that I got chills, but this is not me)! [sic]."

The video allegedly showed a woman engaged in an obscene act. The woman's face had been digitally replaced with Mendoza's, which caused confusion among fans.

In a press release, her agency, All Access to Artists, Inc., stated that the video is fake and was manipulated using "deepfake technology."


"Ms Mendoza is not involved in the said video and neither does she participate in the making of pornographic and other explicit materials. The public is warned not to post, share or otherwise circulate the video. We will not hesitate to take appropriate legal action against any person circulating the same. We intend to hold those individuals criminally and civilly liable for the damage caused to Ms Mendoza," the statement read.

It asked the public to be "discerning of the content" they consume online and to help the agency take down the voyeuristic and defamatory content. The agency said it is working closely with government agencies to permanently remove the content from the internet.

"Rest assured that we are taking all necessary steps to hold the individuals involved accountable and we are coordinating with the proper government agencies to remove this content online permanently," the agency further said in the statement.

Deepfake technology uses artificial intelligence to superimpose one person's face onto another person's body. Many Hollywood stars, including Kristen Bell and Ashton Kutcher, have been victims of this practice.
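For readers curious about the mechanics, early open-source deepfake tools commonly relied on an autoencoder setup: a shared encoder learns a generic representation of a face, and a separate decoder is trained for each person, so that a frame of one person can be decoded with the other person's decoder. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch; the layer sizes, names, and 64x64 input resolution are assumptions for illustration only, not the actual software used in this incident.

```python
# Illustrative sketch of the autoencoder-based face-swap idea behind "deepfakes".
# All shapes and layer sizes are assumptions; this is not any specific tool.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: compresses a 64x64 RGB face crop into a latent vector."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),    # 64 -> 32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),   # 32 -> 16
            nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),  # 16 -> 8
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Person-specific decoder: reconstructs a face crop from the latent vector."""
    def __init__(self, latent_dim: int = 256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 8 -> 16
            nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),   # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),    # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        x = self.fc(z).view(-1, 128, 8, 8)
        return self.net(x)

# Training (not shown) would fit encoder + decoder_a on faces of person A and
# encoder + decoder_b on faces of person B, typically with a reconstruction loss.
encoder = Encoder()
decoder_a = Decoder()  # would be trained on person A's face
decoder_b = Decoder()  # would be trained on person B's face

# The "swap": a frame of person B is encoded, then decoded with A's decoder,
# producing person A's face with B's pose and expression.
frame_of_person_b = torch.rand(1, 3, 64, 64)  # stand-in for a real face crop
with torch.no_grad():
    swapped_face = decoder_a(encoder(frame_of_person_b))
print(swapped_face.shape)  # torch.Size([1, 3, 64, 64])
```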

This article was first published on December 25, 2020