Facebook cracks down heavily on child nudity by removing millions of images with AI's help

Facebook logo (Reuters)

Facebook has taken plenty of flak from lawmakers and regulators, but it has now revealed statistics that shed light on its efforts to combat one of the internet's most serious problems. Child nudity has become an alarming concern with the rise of digitalisation, and Facebook says it is doing its bit to keep such content off its social media platform.

Facebook announced on Wednesday that it removed 8.7 million pieces of content related to child nudity and the sexual exploitation of children in the last quarter. According to the company, it did so with the help of artificial intelligence (AI) and machine learning, which allowed it to remove 99 percent of the offending content before anyone reported it.

"We have specially trained teams with backgrounds in law enforcement, online safety, analytics, and forensic investigations, which review content and report findings to NCMEC. In turn, NCMEC works with law enforcement agencies around the world to help victims, and we're helping the organization develop new software to help prioritize the reports it shares with law enforcement in order to address the most serious cases first," Facebook's global head of safety, Antigone Davis, said in a statement.

Facebook removes child nudity from its platform (Facebook Newsroom)

The National Center for Missing and Exploited Children (NCMEC) nevertheless expects to receive 16 million child porn tips from around the world this year from Facebook and other tech companies, up from the 10 million such tips it received last year.

Machine learning minimises the need for human intervention by sifting through the billions of pieces of content that users post each day. In the process, the automated systems sometimes wrongly block legitimate posts, but Facebook says it is willing to accept those errors for the greater good.

"We'd rather err on the side of caution with children," Davis said.

Users can appeal such wrongly blocked posts, which include family photos of lightly clothed children uploaded with "good intentions," Venture Beat reported. The tool does, however, make exceptions for art and history, such as the award-winning photo of a naked girl fleeing a napalm attack during the Vietnam War.

Facebook is looking to apply the same technology to its Instagram app. The world's largest social media platform will also join other tech companies such as Microsoft in building tools that help smaller firms take similar action.
