
Facebook should do more to help prevent violence in Myanmar: Report

Rohingya Muslim children, who crossed over from Myanmar into Bangladesh, wait squashed against each other to receive food handouts distributed to them at Thaingkhali refugee camp in Cox's Bazar, Bangladesh, in October 2017. Photo: AP

Facebook said Monday that an independent report it commissioned found the company hasn't always done enough to prevent its platform from being used to spread hate speech that has fuelled deadly violence in Myanmar.

The report, conducted by the non-profit Business for Social Responsibility, also offered Facebook recommendations for helping improve human rights in the country, including stricter enforcement of content policies and regular publishing of data related to human rights violations.

"The report concludes that, prior to this year, we weren't doing enough to help prevent our platform from being used to foment division and incite offline violence," Alex Warofka, Facebook product policy manager, wrote in a blog post.

"We agree that we can and should do more."

The report comes amid accusations that Myanmar's military has committed widespread genocide.

In March, UN human rights experts investigating violence in the country concluded that Facebook played a "determining role" in the crisis, in which hundreds of thousands of Rohingya Muslims have fled the country.

BSR recommended Facebook improve enforcement of its community standards, which describe what is and isn't allowed on the social network. Facebook said a key step is its nearly complete build-out of a dedicated team that combines an understanding of local Myanmar issues with policy and operations expertise.

Facebook said it's using the social-listening tool CrowdTangle to analyze potentially harmful content and understand how it spreads in Myanmar. The company is also using artificial intelligence to identify and prevent the spread of posts that contain graphic violence or dehumanizing comments.

The report also recommended preserving and sharing data that can help evaluate human rights violations, particularly data specific to the situation in Myanmar, so the international community can better assess the company's enforcement efforts.

"We are committed to working with and providing information to the relevant authorities as they investigate international human rights violations in Myanmar, and we are preserving data for this purpose," Warofka wrote, noting it took this approach with content and accounts associated with the Myanmar military it removed in August and October, according to UNB news agency.

Another recommendation is that Facebook establish a policy defining its approach to content moderation with respect to human rights, a suggestion Warofka said the company is "looking into."

The UN's top human rights officials recommended in August that Myanmar military leaders be prosecuted for genocide against Rohingya Muslims. More than 700,000 Rohingya Muslims have fled Myanmar's Rakhine state since rebel attacks sparked a military backlash in August 2017.

UN investigators have reportedly found numerous crimes committed against the minority in Myanmar, including gang rape, enslavement, the torching of villages and the killing of children. Roughly 10,000 people have reportedly been killed in the violence, and tens of thousands have fled the country.
