
Should do more to fight Myanmar violence: Facebook

People gather for refreshments at a teashop in Yangon on August 31, 2018; many hang out to chat and browse Facebook on their mobile phones. Baffled, hurt or indignant, many inside Myanmar are struggling to digest a week of opprobrium heaped on their country by the UN and even Facebook over the treatment of the Rohingya, a stateless Muslim group whose plight elicits little sympathy in the Buddhist-majority nation. Photo: AFP

Facebook Inc has said a human rights report it commissioned on its presence in Myanmar showed it had not done enough to prevent its social network from being used to incite violence.

The report by San Francisco-based nonprofit Business for Social Responsibility (BSR) recommended that Facebook more strictly enforce its content policies, increase engagement with both Myanmar officials and civil society groups and regularly release additional data about its progress in the country.

"The report concludes that, prior to this year, we weren't doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more," Alex Warofka, a Facebook product policy manager, said in a blog post.

According to the report, which Facebook released yesterday, BSR also warned that Facebook must be prepared to handle a likely onslaught of misinformation during Myanmar's 2020 elections, as well as new problems as use of its WhatsApp messaging service grows in the country.

READ: Reuters special report "Hatebook: Inside Facebook's Myanmar operation"

As the Reuters special report detailed, Facebook failed to promptly heed numerous warnings from organizations in Myanmar about social media posts fueling attacks on minority groups such as the Rohingya.

In August 2017, the military led a crackdown in Myanmar's Rakhine State in response to attacks by Rohingya insurgents, pushing more than 700,000 Muslims into neighboring Bangladesh, according to UN agencies.

In August, the company removed several Myanmar military officials from the platform to prevent the spread of "hate and misinformation," the first time it had banned a country's military or political leaders.

It also removed dozens of accounts for engaging in a campaign that "used seemingly independent news and opinion pages to covertly push the messages of the Myanmar military."

The move came hours after United Nations investigators said the army carried out mass killings and gang rapes of Muslim Rohingyas with "genocidal intent."

Facebook said it has begun correcting shortcomings.

Facebook said that it now has 99 Myanmar-language specialists reviewing potentially questionable content. In addition, it has expanded its use of automated tools to reduce the distribution of violent and dehumanizing posts while they undergo review.

In the third quarter, the company said it "took action" on about 64,000 pieces of content that violated its hate speech policies. About 63 percent were identified by automated software, up from 52 percent in the prior quarter.

Facebook has roughly 20 million users in Myanmar, according to BSR, which warned that the company faces several unresolved challenges there.

BSR said that locating staff in the country, for example, could deepen Facebook's understanding of how its services are used locally, but warned that such workers could be targeted by the military, which the UN has accused of ethnic cleansing of the Rohingyas.
