Rohingya refugees file $150 billion lawsuit against Facebook for alleged content moderation malpractices

Lumen Database Team
4 min read · Dec 13, 2021
“Facebook growth” by niallkennedy is licensed under CC BY-NC 2.0

On December 6, 2021, a refugee who fled Myanmar when she was sixteen filed a class action lawsuit against Facebook in a California Superior Court, alleging that the platform incited violence and facilitated genocide in Myanmar (formerly Burma). The suit was filed on her own behalf and on behalf of all Rohingya who fled Myanmar on or after June 1, 2012, and who now reside in the USA as refugees or asylum seekers. A coordinated action representing Rohingya refugees in the UK and Bangladesh is due in the United Kingdom, and a letter of notice to this effect was submitted to Facebook’s London office on the same day. The case comes two years after Facebook, in a statement, officially admitted that it hadn’t done enough to prevent its platform from “being used to foment division and incite offline violence in Myanmar.”

The Rohingya are an ethnic minority in Myanmar and make up the largest share of Muslims in the country. Even so, the government of Myanmar, a predominantly Buddhist country, denies them citizenship and even excluded them from the last census in 2014. For several decades, Rohingya Muslims have been persecuted there, and the country’s military has been accused of ethnic cleansing and genocide. In 2017, military troops and local mobs burned villages and killed civilians in what the Myanmar government labelled “clearance operations”, triggering a mass exodus. The violence led to the deaths of approximately 10,000 people, and over 750,000 fled the country seeking refuge.

Advocacy groups and various media outlets have reported or indicated that Facebook, which is used by over half of the country’s population, played an active role in exacerbating the violence in Myanmar by not moderating or removing hate speech or extremist content from its platform. The United Nations’ human rights investigators have since concluded that hate speech on Facebook played a ‘determining role’ in fomenting violence in Myanmar.

The petition notes that the country’s military tasked its soldiers with conducting ‘information combat’, directing thousands of them to create fake accounts to spread radicalized views against Rohingya Muslims and to duplicate other extremist content on the platform. Interestingly, it also points out that Facebook’s ‘Free Basics’ programme contributed to the genocide by reducing media plurality. It alleges that Facebook’s algorithm, which relies on engagement-based ranking, prioritizes hate speech and misinformation in order to increase user engagement. Further, it alleges, based on whistleblower Sophie Zhang’s revelations, that despite having the ability to detect and deactivate counterfeit accounts used by authoritarian politicians and regimes to generate ‘fake engagement’, Facebook devotes minimal resources to that task.

To gain more detailed insight into the actions, or lack thereof, on the part of Facebook during the exodus in Myanmar, it would also be useful to learn more about the kind of content takedown requests (possibly for hate speech) that Facebook was receiving during this time from the platform’s users in Myanmar, and the extent to which the social media giant was complying with such requests. However, at this time, Facebook does not share copies of the content removal notices it receives, whether from external parties or from its users, with any independent third parties, such as Lumen. On the other hand, Facebook also seems to have taken the initiative to realign its platform with international human rights standards by hiring a Director of Human Rights, Miranda Sissons, formerly of the International Centre for Transitional Justice and Human Rights Watch, and by launching a corporate human rights policy to build strategies for tackling harmful content in countries, such as Myanmar, that are prioritized as most ‘at risk of conflict’.

As the petition states, the “core of this complaint is the realization that Facebook was willing to trade the lives of Rohingya people for better market penetration in a small country in Southeast Asia.” On grounds of strict product liability and negligence for breach of duty of care, the suit seeks $150 billion in damages, plus additional punitive damages as determined by the court. In what seems like a response to the lawsuit, Meta, the newly renamed parent company of Facebook and Instagram, extended a prior clampdown on military content in Myanmar by announcing that it would ban military-controlled business accounts, pages and groups on Facebook.

While Section 230 of the Communications Decency Act of 1996 shields Facebook and all other online platforms from liability for third-party content in the US, the petition has invoked the laws of Myanmar as a ground for the legal proceedings. Should the court allow the suit to be heard under foreign law, it may open the floodgates for other suits alleging the culpability of social media platforms on varying grounds, placing the applicability of Section 230 into question with respect to any cause of action that arose, or content that was created, outside the United States of America.

In the past, Facebook has at least partially recognized its culpability and stated that it would work more effectively to tackle online propaganda, hate speech and misinformation in Myanmar. As a result, in 2018, it removed over 600 pages, groups and accounts from its platform. However, in October 2021, the company refused to cooperate with the International Court of Justice by turning over evidence in the proceedings brought against Myanmar by The Gambia over the alleged genocide of the Rohingya.

This is the latest in a line of questions raised about Facebook, its algorithm, and the dangerous effects that the content it prioritizes may have on user behavior, although none has previously concerned alleged crimes of this gravity. It remains to be seen how Facebook will respond to the allegations, some of which the platform has itself admitted to in the past, and whether it will take any corrective action to prevent similar situations from arising in the future.

About the Author: Shreya is a Research Fellow at Berkman Klein Center’s Lumen Project.
