
Facebook removes 8.7m child nudity images in Q2


Facebook removed 8.7 million pieces of content containing child nudity from its platform in the past three months as part of its fight against child exploitation, the company said on Wednesday.

The top US social media network said it has been using artificial intelligence (AI) and machine-learning technology to prevent child exploitation, and that it has stepped up enforcement of its ban on photos that show minors in a sexualised context.

Antigone Davis, Facebook's Global Head of Safety, wrote in an official post that 99 per cent of those images were removed before anyone reported them for violating the company's policy.

She said Facebook has also removed accounts that promote child pornography, and has even taken action on non-sexual content, such as seemingly benign photos of children in the bath, to avoid the potential for abuse.

Davis disclosed that Facebook has been working hard to develop new technology over the past year to combat child exploitation and keep children safe on the platform, reports Xinhua.

"In addition to photo-matching technology, we're using AI and machine learning to proactively detect child nudity and previously unknown child exploitative content when it's uploaded," she said.

Facebook has in recent months been under mounting pressure from US federal regulators and lawmakers to keep its platform free of hateful, extremist and other illicit material.

It has pledged to join other industry partners, including Microsoft, next month to begin building tools that enable smaller companies to take similar action to prevent the grooming of children online.
