In 2018, Meta (formerly Facebook) released a statement accepting the role it played in the genocide against the Rohingya Muslims of Myanmar: “The ethnic violence in Myanmar is horrific and we have been too slow to prevent misinformation and hate on Facebook.”[1] This article serves three purposes. Firstly, to recognise the role Facebook, a subsidiary of Meta, has played in fuelling violence in Myanmar through engagement-driven algorithms. Secondly, to analyse Meta’s failure to filter hate speech and tackle the issue despite warnings. Lastly, to identify the flaws in the legal system with regard to holding social media corporations accountable across jurisdictions. When it comes to platforms like Facebook, to what extent should this tool be regulated? If violence is propagated by such platforms, and a genocide has already been committed, what repercussions should exist? Who should be held accountable? I will highlight the limitations of the legal system that have enabled the devastation in Myanmar, and discuss the need to reassess how corporations like Meta can be held accountable for human rights violations.
Why Facebook is Implicated
‘When social media giants preoccupy themselves with their annual returns instead of focusing on building safe online spaces, peace, justice, and accountability suffer.’[2]
Business corporations like Meta have the goal of maximising profit, often at the expense of human rights. The way to do so is by driving up engagement, attracting users to content that elicits reactions.[3] This leads to the creation of echo chambers, which is the most significant way in which Facebook is implicated. By guiding users toward extreme material, the algorithm conditions users to seek content that is often violent. These echo chambers amplify divisive voices,[4] and promote hate speech in areas like Myanmar. What makes this algorithm dangerous is that so little is revealed about it: users are unaware of the sophisticated tool promoting the extreme content flooding their feeds.
When the duty of gatekeeping information passes from journalists and editors, bound by ethics, to tech corporations, focused on appeasing shareholders, combating hate speech becomes a lesser priority. While news organisations have their own agendas, their focus is still mainly on conveying news rather than simply making profit. Content producers now create ‘junk news’[5] that is ‘sensational, conspiratorial, extremist and inflammatory’[6] and reaches users without verification. This is the second way in which Meta is implicated. Since Meta is protected against liability for user content under US law, there is minimal accountability, and users are often fed false, inflammatory information. The consequences can be dire, as witnessed in the storming of the United States Capitol Building, prompted by false claims. As Maria Ressa, CEO of Rappler, stated, ‘social media has become a behaviour modification system,’[7] backed often by sensational and false news.
The last way in which Meta is implicated is through the double standards that can be observed when it supports majoritarian regimes. Facebook actively banned the Arakan Rohingya Salvation Army (ARSA) from its platform, and even removed posts praising the Rohingya insurgent group. However, accounts linked to the Myanmar military remained rampant until August 2018, after military campaigns in Rohingya villages had led to 800,000 Rohingya Muslims fleeing to Bangladesh. In 2018, four other ethnic armed organisations were restricted on Facebook, but the Tatmadaw (military) did not face the same removals. A human rights observer told The Guardian that Facebook was ‘tipping the scales’ toward the oppressive regime, giving a ‘big boost’ to the government.[8]
Genocide in Myanmar
For six decades, the military controlled Myanmar’s media outlets and controlled the narrative. In what was considered a victory, access to social media platforms was allowed after the political transition of 2011. In 2016, Meta launched its Free Basics program in Myanmar. This free access made the platform explode in popularity, making Facebook the equivalent of the Internet. As Yanghee Lee stated, ‘everything is done through Facebook.’ It became the main source of information and, after 2016, effectively the sole platform of information, as well as the main avenue used to fuel hatred. Islamophobic content was plastered across newsfeeds, and anti-Rohingya propaganda was published either by the Tatmadaw or by citizens who had absorbed decades of hate. Most prominent was the role played by fake accounts and bots bought by the military, spreading the genocidal messages of the regime. The message was loud and clear: Rohingya Muslims are outsiders seeking to destroy Buddhism, and they must be crushed. Hardline nationalist monks like the extremist Wirathu continued to publish posts which gained traction, with direct consequences: after a post gained attention, violence would spill over.[9] In 2014, when Wirathu used Facebook to post a claim that a Buddhist girl had been raped by Muslim men, a mob attacked the accused, and the subsequent violence resulted in a loss of lives.[10]
Local internet policy advocates informed Facebook of the hate speech spreading across newsfeeds, targeting the Rohingya Muslim minority. Tun Khin, president of the Burmese Rohingya Organisation UK, explained that Facebook failed to invest in content moderators who spoke local languages, or fact-checkers who were aware of the political situation and could close specific accounts. Major propagators of hate speech were negligently enabled.
In 2018, Darusman, chairman of the UN Fact-Finding Mission on Myanmar, confirmed Facebook’s direct contribution to the ‘dissension and conflict.’ He further claimed, ‘social media is Facebook, and Facebook is social media.’ With most news outlets overtaken by the military, Facebook’s role as a source of information was all-powerful. Hence, the responsibility to monitor it should have been far greater.
In 2018, Meta admitted that it played a role in the ‘horrific’ ethnic violence, and that it would do more. However, Meta has made no comment on how many of its content moderators can read Burmese or what tools are being employed to sift through hate speech, and no results of any experimental algorithm have been made public.[11] Moreover, whistleblower Frances Haugen came forward in 2021, claiming that the tools to combat hate speech were not fully developed despite warnings.
Meta’s Inaction
Meta promised to invest in creating a superior model for hate speech detection. It claimed to have built a team of Burmese speakers for content moderation. Further claims were made about banning military accounts, disrupting networks spreading misinformation, and investing in Burmese-language technology to reduce hate speech. It was maintained that a new algorithm promoting more ‘rational’ voices would be tested. Despite these claims, no regulatory method has been officially disclosed.
Looking for evidence of Meta’s continued negligence, I came across research carried out by Global Witness showing that Facebook’s ‘improved’ mechanisms were not effective.[12] To test the new measures, Global Witness submitted eight paid advertisements to Facebook, all of which contained hate speech. Much to my horror, all eight were approved for publication. This was after Meta had claimed to have put its new measures for Facebook in place, and promised to do better.
To gain greater insight, I contacted Global Witness to see the exact wording of the eight advertisements; I am grateful for their cooperation in providing English translations of the Burmese originals. They ranged from racial slurs referring to the Rohingya as ‘Kalar’ to direct orders to shoot and burn the ‘dirty’ people. Despite falling squarely within Facebook’s own definition of hate speech, these hateful messages were approved. Reading those advertisements cemented my view of Facebook’s participation as an enabler.
Global Witness sent me further research on Facebook’s ineffective hate speech detection systems. Shockingly, the investigations showcased similar cases in other parts of the world: Ethiopia, Kenya, and Brazil were all afflicted by the same problem. To prove the ineffectiveness of the new content moderation systems, Global Witness submitted hate speech advertisements in all these localities as well, and every one of them was approved. This is clear evidence that Meta is unable to fulfil its duty of removing hate speech, and the abuse of Facebook’s platform has dire consequences in the form of human rights violations and manipulation. Two questions arise from this discussion: who is to be held responsible, and how?
Accountability Gap: Steps to be Taken
There is no international legal framework through which corporations can be held accountable for their actions, especially in overseas operations. How should we hold corporations liable? This can be done either by holding the corporation liable as a whole, or by prosecuting individuals within its structure; both are challenging because limited liability structures shield owners from direct responsibility for business liabilities. The first school of thought encourages the creation of a new corporate criminal liability model, one in which the autonomy of corporations can be questioned, and which would not be restricted to individuals liable for corporate crimes. For this, the principle of ‘juridical entity participation’ can be applied to corporations. Hence, under the legal doctrine of respondeat superior, not only the employees but the corporation itself would be held responsible. This model of criminal liability is considered more effective than going after individuals who act on behalf of corporations.
The second viewpoint is that of broadening the already established model of ‘individual criminal liability’ at the ICC to include corporate accountability. However, corporate structures are complex, and social media companies operating overseas have multiple subsidiaries through which they operate. This intricate web of complexities allows parent companies like Meta to be further removed from crimes, making it difficult to hold them accountable for international crimes.
The third way is to give individual states the responsibility of managing corporate liability within their own domestic legal systems. Over 40 countries currently have legislation to this effect, yet domestic legal systems have approached it in differing ways. Most common law jurisdictions have recognised corporate criminal liability, while civil law jurisdictions have responded differently. In countries like Germany, criminal liability is less emphasised, and corporate wrongdoing is addressed through administrative law. France, on the other hand, has codified corporate liability in its criminal code. Other countries have used interpretation acts, applying legislation for war crimes to violations committed by corporations.[13] This allows them to charge corporations under pre-defined domestic legislation. This approach is broadening the trend toward greater accountability for international crimes by large corporations, and could be used to address violations committed by social media companies as well. However, domestic systems may not be able to tackle overseas disputes effectively, and are most significant in their complementary role alongside international models.
There are other ways to overcome the ‘accountability gap’ as well. For instance, the 2014 decision of the Appeals Panel in Al Jadeed S.A.L & Ms Khayat (STL-14-05)[14] before the Special Tribunal for Lebanon (STL) held that a corporation could be held criminally liable, setting a precedent by broadening the definition of ‘person’ to include corporations. It also affirmed the Tribunal’s inherent power and jurisdiction over contemptuous conduct by corporations.[15] The case established that a hybrid criminal tribunal can successfully prosecute a corporation.[16]
The international structure for human rights accountability is developing, and many multilateral treaties are incorporating criminal liability of corporations. As of 2018, there are 17 multilateral mechanisms related to corporate liability, including the United Nations Convention against Corruption, the United Nations Convention against Transnational Organised Crime, and the International Convention for the Suppression of the Financing of Terrorism. The BHR (Business and Human Rights) framework set out in the UN Guiding Principles is a key reference point, since it rests on three pillars: the state duty to a) Protect human rights, the responsibility of corporations like Meta to b) Respect them, and c) access to Remedy for victims.[17] If this framework is followed, human rights are protected, the local community is respected, and victims affected by business activities are given access to remedy. Most importantly, Meta has accepted this framework. The issue lies in the fact that, while states and corporations are urged to adhere to these instruments, they are non-binding and voluntary. State discretion in sanctioning corporations results in inaction. The BHR framework is not binding, and if it is breached, who prosecutes the corporation? To strengthen these existing frameworks, greater pressure from shareholders and consumers will be required.
While there are many possible routes to tackle the accountability gap, more evidence on corporations will need to be gathered, regardless of whether the whole corporation or individual people are held liable. This can be done by encouraging whistleblowers to provide information and by funding complex investigations. Secondly, expanding a version of the Responsible Corporate Officer doctrine (which allows the state to prosecute executives for corporate misconduct) to other parts of the law could be effective. By alleviating the need for extensive evidence of intent, it would make corporations easier to hold accountable.[18]
Although the effort to build these approaches is commendable, some seem more flawed than others. A stronger domestic legal system for holding corporations liable will not necessarily be beneficial, since many large multinational corporations have overseas subsidiaries. In that case, will the ‘domestic’ law be that of the country in which the subsidiary operates, or that of the parent company’s country? In the case of Facebook’s involvement in Myanmar, Meta is headquartered in the United States, and if US law applies, evasion is convenient under Section 230 of the Communications Decency Act (which provides online platforms immunity from civil liability for third-party content). This would mean that overseas subsidiaries could cause massive damage in other nations, yet the parent would not even be criminally liable under the laws of its home country. Similarly, multilateral treaties have less practical value due to the lack of enforcement. Without binding agreements, states can turn the other way and be influenced by powerful corporations not to impose sanctions.
The idea of a new corporate criminal liability model is promising, albeit requiring urgent development. Instead, if the ICC is broadened to entail corporations as well, it may well prove to be an effective legal framework that ensures future violations committed by corporations have consequences.
Legal Action Against Facebook
Attempts have been made to hold Facebook accountable for the atrocities in Myanmar. Rohingya refugees have sued Facebook in a class action lawsuit worth $150 billion in the United States.[19] The plaintiffs claim that Facebook allowed hate speech to perpetuate, drove user engagement by amplifying such content, and failed to act even when informed of the disastrous consequences. The case was filed in December 2021, yet little information about its proceedings is available, which is somewhat alarming and raises questions about what is being decided behind closed doors.
Nay San Lwin, co-founder of the advocacy group Free Rohingya Coalition, argued that Facebook profited from the Rohingya’s suffering, and that the survivors should be compensated for their losses.[20] However, the premise of this lawsuit is that Facebook should be held liable under Myanmar law. This is essential because, in the United States, platforms such as Facebook are protected by Section 230, under which the content users post cannot be used to hold the corporation liable. There is much speculation about the case; Eric Goldman, a professor of law at Santa Clara University, stated that, ‘based on precedents, this case should lose.’[21] However, this case may be just what creates a new precedent.
A second case related to the conflict is ongoing at the ICJ, in which The Gambia (backed by the Organisation of Islamic Cooperation) has brought claims to hold Myanmar accountable under the Genocide Convention. While Facebook is not being prosecuted in this case, The Gambia did request that Facebook reveal crucial information to prove genocidal intent. Facebook refused, stating that the request went against its principles and was too ‘vague.’[22]
Conclusion
Facebook’s first defence is obvious enough: it is just a platform for communication; it does not create hate speech. While this distinction is valid, Facebook is still not absolved. The civil lawsuits in the United States focus on product liability, opening up an alternative avenue for accountability for hate speech and the subsequent genocide. The core of the claims is that Facebook actively designed its platform to encourage violent content, and failed to address this despite many warnings. For this reason, these cases claim Facebook is liable for billions of dollars of harm.
Many facts point toward Facebook’s involvement in the genocide in Myanmar: the enablement of racial hatred and violence, the acceleration of that violence through the platform, the platform’s dominance crowding out other sources of information, and the failure to develop hate speech detection systems alongside a near-total lack of content moderators.
Despite this, who holds Facebook accountable? Can the ICJ be used as an avenue to deliver justice to the Rohingya scattered across Bangladesh’s slums, or is private litigation such as the lawsuit in the US the only way forward? How would damages be distributed among the people? Where should the displaced minority go?
It is clear that more effective legislation should be put into place to hold international corporations liable for crimes, and that the ICC should respond to this more efficiently. This can be done by broadening individual criminal liability and extending the definition of ‘person’ under the Rome Statute. Furthermore, creating international forums to hold corporations liable, such as the Special Tribunal for Lebanon, could pave the way for success. Making treaties more binding and less discretionary could also prove fruitful. Even the focus on domestic legal action against corporations is essential for complementary prosecution at the ICC.
Perhaps there is not much precedent for holding Facebook accountable, and there is no doubt that the international legal framework for corporate liability is severely lacking. However, the international legal arena is gradually advancing. There is hope for better.
References
[1] ‘Facebook Approves Adverts Containing Hate Speech Inciting Violence and Genocide Against the Rohingya’ (2022) 1(1) Global Witness <https://www.globalwitness.org/en/campaigns/digital-threats/rohingya-facebook-hate-speech/> accessed 14 October 2022
[2] Angshuman Choudhury, ‘How Facebook is Complicit in Myanmar’s Attacks on Minorities’ (2020) 1(1) The Diplomat <https://thediplomat.com/2020/08/how-facebook-is-complicit-in-myanmars-attacks-on-minorities/> accessed 16 October 2022
[3] Adrienne LaFrance, ‘The Largest Autocracy on Earth’ (2021) 1(1) The Atlantic <https://www.theatlantic.com/magazine/archive/2021/11/facebook-authoritarian-hostile-foreign-power/620168/> accessed 14 October 2022
[4] Dr Simone Bunse, ‘Social Media: A Tool for Peace or Conflict?’ (2021) 1(1) SIPRI
<https://www.sipri.org/commentary/topical-backgrounder/2021/social-media-tool-peace-or-conflict> accessed 15 October 2022
[5] Ibid
[6] Ibid
[7] Ibid
[8] Angshuman Choudhury, ‘How Facebook is Complicit in Myanmar’s Attacks on Minorities’ (2020) 1(1) The Diplomat <https://thediplomat.com/2020/08/how-facebook-is-complicit-in-myanmars-attacks-on-minorities/> accessed 16 October 2022
[9] Tom Miles, ‘U.N Investigations Cite Facebook Role in Myanmar Crisis’ (2018) 1(1) Reuters
<https://www.reuters.com/article/us-myanmar-rohingya-facebook-idUSKCN1GO2PN> accessed 15 October 2022
[10] Saira Asher, ‘Myanmar Coup: How Facebook Became the Digital Tea Shop’ (2021) 1(1) BBC
[11] Victoria Milko and Barbara, ‘Kill More: Facebook Fails to Detect Hate Against Rohingya’ (2022) 1(1) AP News <https://apnews.com/article/technology-business-bangladesh-myanmar-united-nations-f7d89e38c54f7bae464762fa23bd96b2> accessed 14 October 2022
[12] ‘Facebook Approves Adverts Containing Hate Speech Inciting Violence and Genocide Against the Rohingya’ (2022) 1(1) Global Witness <https://www.globalwitness.org/en/campaigns/digital-threats/rohingya-facebook-hate-speech/> accessed 14 October 2022
[13] Jaya Bordeleau-Cass, ‘The Accountability Gap: Holding Corporations Liable for International Crimes’ (2019) 3 PKI Global Justice Journal 65 <https://globaljustice.queenslaw.ca/news/the-accountability-gap-holding-corporations-liable-for-international-crimes> accessed 15 October 2022
[14] Al Jadeed S.A.L v Ms. Khayat (2014) STL-14-05
<https://www.stl-tsl.org/en/the-cases/contempt-cases/stl-14-05> accessed 1 November 2022
[15] Manuel Ventura, ‘The Prosecution of Corporations Before a Hybrid International Criminal Tribunal’ (2016) 2(1/2) 71
<https://www.jstor.org/stable/48581888> accessed 1 November 2022
[16] Ibid
[17] ‘Understanding BHR’ CEECA Resource Hub
<https://ceeca-bhr.org/eecacategory/understanding-bhr/>
[18] Vikramaditya Khanna, ‘Holding Corporations and Executives Accountable Depends on Our Legal System’ (2021) 1(1) ProMarket <https://www.promarket.org/2021/03/14/corporations-executives-accountability-wrongdoing-legal-system/> accessed 14 October 2022
[19] ‘Rohingya Refugees Sue Facebook for $US 150 Billion in California Court Over Persecution’ (2021) 1(1) ABC News <https://www.abc.net.au/news/2021-12-07/rohingya-muslims-lawsuit-facebook-persecution-genocide/100680238> accessed 15 October 2022
[20] Rina Chandran and Avi Asher-Shapiro, ‘Analysis: Rohingya Lawsuit Against Facebook a ‘Wake-up Call’ for Social Media’ (2021) Reuters <https://www.reuters.com/technology/rohingya-lawsuit-against-facebook-wake-up-call-social-media-2021-12-10/> accessed 14 October 2022
[21] Ibid
[22] Md. Rizwanul Islam, ‘The Gambia v Myanmar: An Analysis of the ICJ’s Decision on Jurisdiction under the Genocide Convention’ (2022) 1(1) ASIL <https://www.asil.org/insights/volume/26/issue/9> accessed 15 October 2022