
Facebook removes faulty Burmese translation feature

A cellphone user looks at a Facebook page at a shop on Latha street, Yangon, Myanmar on August 8 last year — Reuters/File

Facebook has removed a feature that allowed users to translate Burmese posts and comments after a Reuters report showed the tool was producing bizarre results.

A Reuters investigation published on August 15 documented how Facebook was failing in its efforts to combat vitriolic Burmese-language posts about Myanmar's Rohingya Muslims. Some 700,000 Rohingya have fled from Myanmar into neighbouring Bangladesh over the past year amid a military crackdown and ethnic violence. In late August, United Nations investigators said Facebook had been "a useful instrument for those seeking to spread hate" against the Muslim minority group.

The Reuters article also showed that the translation feature was flawed. It cited an anti-Rohingya post that said in Burmese, "Kill all the kalars that you see in Myanmar; none of them should be left alive." Kalar is a pejorative for the Rohingya. Facebook had translated the post into English as "I shouldn't have a rainbow in Myanmar."

A spokeswoman for Facebook said the Burmese translation feature was “switched off” on August 28. She said the Reuters article and feedback from users “prompted us to do this.”

“We are working on ways to improve the quality of the translations and until then, we have switched off this feature in Myanmar,” the spokeswoman wrote in an email.

Facebook has had other problems interpreting Burmese, Myanmar’s main local language. In April, the California-based social-media company posted a Burmese translation of its internal “Community Standards” enforcement guidelines.

Many of the passages were botched. A sentence that in English stated "we take our role in keeping abuse off our service seriously" was translated into Burmese as "we take our role seriously by abusing our services."

The Reuters investigation found more than 1,000 examples of hate speech on Facebook, including calling the Rohingya and other Muslims dogs, maggots, and rapists, suggesting they be fed to pigs, and urging they be shot or exterminated. Facebook’s rules specifically prohibit attacking ethnic groups with “violent or dehumanising speech” or comparing them to animals.

Shortly after the article was published, Facebook issued a statement saying it had been “too slow to prevent misinformation and hate” in Myanmar and that it was taking action, including investing in artificial intelligence that can police posts that violate its rules.
