Facebook Admits Failure to Stop Incitement of Violence on Their Platform in Myanmar

Facebook, the social media company focused on “bringing people together,” seems to have another human rights crisis on its hands. Last Tuesday, the company published an independent assessment analyzing how its moderation efforts failed to stop the incitement of violence against the Rohingya population during the 2016 genocide in Myanmar.

“The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more,” wrote Facebook’s public policy manager Alex Warofka in a public statement. “We know we need to do more to ensure we are a force for good in Myanmar.”

Conducted by Business for Social Responsibility (BSR), a non-profit committed to human rights protections, the report found the site was “being used to foment division” that ultimately “resulted in offline violence.” It goes on to argue that Facebook’s new, still-unspecified changes to its content policies will prevent such events from recurring, according to statements cited by The Verge. The investigation into Facebook’s conduct began near the end of 2017, a few months after independent outlets began shining a light on the horrific conditions in Myanmar that big tech institutions had ignored and suppressed.

In November of last year, TrigTent reported that Facebook and YouTube were removing images documenting the ‘ethnic cleansing and torture’ carried out by the Myanmar government, erasing traces of these crimes without first reporting the evidence to human rights authorities.

“Three years of documentation, just gone, in a moment,” said Obayda Abo-Al Bara, manager of the Idlib Media Center in Syria, who described similar takedowns to The Intercept last year. Rohingya activist Mohammad Anwar told the outlet, “I did feel that Facebook was colluding with the Myanmar regime in the Rohingya genocide.”

For critics of the social media empire, such as The Intercept journalist Avi Asher-Schapiro, these blind removals were “at best, a destruction of evidence” and “at worst, complicity in the atrocities.” He writes: “First-hand accounts of extrajudicial killings, ethnic cleansing, and the targeting of civilians by armies can disappear with little warning, sometimes before investigators notice.

“When groups do realize potential evidence has been erased,” he continued, “recovering it can be a [nightmarish] ordeal. Facing a variety of pressures — to safeguard user privacy, neuter extremist propaganda, curb harassment and, most recently, combat the spread of so-called fake news — social media companies have over and over again chosen to ignore, and, at times, disrupt the work of human rights groups scrambling to build cases against war criminals.”

In September 2018, a 479-page United Nations report, summarized by The Guardian, concluded that Facebook issued a “slow and ineffective response” when its “standard reporting mechanism alerted the company to a post targeting a human rights defender for his alleged cooperation” with the UN. The post in question, which investigators found was shared over a thousand times, labeled the unnamed activist a “national traitor” whom commenters said should be “murdered in the street” because he is a Rohingya Muslim.

Facebook’s inability to address such genocidal extremism in Myanmar has been a well-covered catastrophe, prompting a coalition of activists from Myanmar, Syria, and six other countries to issue specific demands: the company should increase transparency and enforce standards capable of preventing extremism while still protecting free speech online. Those demands went unaddressed until last week.

The report suggests the company didn’t crack down on incitement in Myanmar because the crisis brought an estimated 20 million increase in overall user engagement. “There are deep-rooted and pervasive cultural beliefs in Myanmar that reinforce discrimination and which result in interfaith and communal conflict,” the report said, adding that Facebook is “being used to spread these opinions” by individuals as well as by organized groups that stand to gain politically.

Following the golden rule of capitalism, profits above morals, catering to the extremist crowd, whose activity was siphoned for advertising revenue and user growth, seems to have been simply good business for Facebook. Although Zuckerberg and company have agreed to publish more data on policy enforcement, these promises come with no mandate, no details on how regularly the reports will be published, and no word on how evidence of war crimes will be handled going forward.

Facebook isn’t a villain, of course. While these problems of unaccountability persist, Zuckerberg has announced a new team of 99 native Myanmar-language speakers dedicated specifically to addressing extremism. The Verge details how the group has already taken action on over 64,000 pieces of content violating the company’s policies on incitement of violence and its loosely defined “hate speech,” with Facebook claiming 63 percent of these posts were manually reviewed. The company still delegates a sizeable portion of its moderation to automated reviewers, but the move shows a partial commitment to localized oversight.

“There are a lot of people at Facebook who have known for a long time that the company should have done more to prevent the gross misuse of its platform in Myanmar,” said Matthew Smith of Fortify Rights, a non-profit human rights organization focusing on Southeast Asia, who spoke with The New York Times. “This assessment is encouraging and overdue, but the key to any assessment is implementation.”
