For years, an ethnic cleansing campaign by Myanmar’s Buddhist majority against the country’s Muslim Rohingya minority has torn the country apart. Facebook has faced scrutiny for its role in the spread of false information that incited violence. On Monday, it announced that it had banned several members of the Myanmar military, along with organisations named by the United Nations as complicit in the genocide.
Facebook’s action came within hours of the publication of a UN Human Rights Council report that named Myanmar’s commander-in-chief and five generals as complicit in the mass killings and gang rapes of Muslim Rohingya, carried out with “genocidal intent”. Throughout the report, the Human Rights Council cites specific Facebook posts containing false information and propaganda as having played a role in sparking the violence.
As far back as 2014, human rights activists were telling Facebook that its platform was being used to spread rumours and call for violence against the minority Rohingya population.
According to the UN report, hundreds of thousands of Rohingya have fled the country to live in refugee camps, many dying along the way due to injuries they sustained in attacks by mobs or the Myanmar military. Sexual violence has been carried out on a massive scale. At least 392 villages have been partially or completely destroyed. And at least 10,000 people have been murdered.
The UN is now calling for leaders in the Myanmar military — which has de facto control over the country’s government — to be tried for genocide through a tribunal or referral to the International Criminal Court in The Hague.
After the UN released its report, Facebook published a blog post stating that it is banning 20 individuals and organisations from Facebook in Myanmar. It said that it has not found a presence on Facebook or Instagram for some of the subjects of its ban, but it is “removing a total of 18 Facebook accounts, one Instagram account, and 52 Facebook Pages” that are followed by almost 12 million people.
It named Senior General Min Aung Hlaing, commander-in-chief of Myanmar’s armed forces, and the military’s Myawady television network as being subject to the ban.
Earlier this winter, Wired chronicled the tragedy in Myanmar and traced Facebook’s role in it as far back as 2014. From the beginning, Facebook’s failures have been attributed to inadequate moderation and an inability to understand Myanmar’s politics and culture.
After hundreds of people participated in riots and attacked a Muslim tea shop owner who was falsely accused of raping a Buddhist employee, Facebook’s director of policy for the Asia-Pacific region said the company’s immediate plan was to accelerate the translation of its user guidelines and code of conduct into Burmese. It took 14 months for that translation to be completed, and the violence has intensified since then.
Facebook has not been willing to say how many people it has working to moderate content in Myanmar or how many Burmese speakers it employs to conduct its content reviews. On July 18, it announced that it would begin removing misinformation that could lead to physical violence. In August, Reuters published an in-depth look at the company’s moderation failures and slow response.
In addition to citing many Facebook posts that were intended to spread false information and incite violence, the UN report specifically calls for an investigation into Facebook’s response. One passage reads:
The role of social media is significant. Facebook has been a useful instrument for those seeking to spread hate, in a context where for most users Facebook is the Internet. Although improved in recent months, Facebook’s response has been slow and ineffective. The extent to which Facebook posts and messages have led to real-world discrimination and violence must be independently and thoroughly examined. The Mission regrets that Facebook is unable to provide country-specific data about the spread of hate speech on its platform, which is imperative to assess the adequacy of its response.
In its post on Monday, Facebook acknowledged some of its failures, writing, “While we were too slow to act, we’re now making progress – with better technology to identify hate speech, improved reporting tools, and more people to review content.”
It also said that it faces a tough situation because “so many people there rely on Facebook for information — more so than in almost any other country given the nascent state of the news media and the recent rapid adoption of mobile phones”.
The acknowledgement that Facebook is a primary news platform for the people of Myanmar is something of a surprise coming from a social network that fights tooth and nail against any assertion that it’s a media company. A survey conducted in Myanmar last year by the International Republican Institute found that 38 per cent of respondents said they get their news from Facebook.
We’ve reached out to Facebook to ask how many people it estimates get their news from the platform, and to clarify whether that means people sharing links to news stories or publishing unvetted claims directly. We did not receive an immediate reply.
We also asked if Facebook intends to release a complete list of the individuals and organisations that are subject to today’s ban. Further, we asked how many accounts Facebook has removed in total after finding ties to violence in Myanmar.
It has become standard operating procedure for Facebook to acknowledge that it needs to “do better” when it comes to addressing abuses on its networks, but the company regularly seems incapable of taking action until its hand is forced by authorities.
In the US, it made changes last week to ad-targeting practices that facilitated housing discrimination. Though the company had been aware of the problem since at least October 2016, when ProPublica published an investigation into the discriminatory practices, it only took swift action after the Department of Housing and Urban Development filed a formal complaint against the company.