Ever since Facebook, now Meta, admitted in 2018 that its platform may have been complicit in the genocide committed against Rohingya Muslims in Myanmar, it has been clear that social media’s impact transcends cyberspace and carries very tangible consequences, especially in facilitating, exacerbating and inciting violence during armed conflict.[1] Several conflicts across the world – India’s formal constitutional annexation of the disputed Kashmir region in 2019, ongoing political unrest in Iran, the rise and decline of ISIS in Syria, the war between Azerbaijan and Armenia, the conflict between Israel and Palestine and, most recently, Ukraine and Russia – have found a foothold on social media platforms, which are used to raise awareness of situations on the ground, appeal for humanitarian aid, or spread outright propaganda and disinformation.
In May 2020, in response to its growing global user base and reach, Meta constituted the Oversight Board to increase legitimacy and transparency by including subject-matter experts (academics, lawmakers, human rights activists and other stakeholders) to weigh in on increasingly difficult decisions about what content should be allowed to stay on the platform and what should be removed. It would be too reductive to say that these decisions merely come down to the “letter of the law” – in this case, Meta’s Community Guidelines and Policies for its individual platforms.
Sometimes content may be infringing but, because of its ability to serve the greater public good – through its “newsworthiness” or by bringing to light “gross human rights atrocities” – it may stay on the platform. Other times, content may seem benign or implicit but can snowball into a massive humanitarian crisis. For example, part of Meta’s failure in taking down inflammatory content against the Rohingya Muslims stemmed from the linguistic complexity of posts containing words like “kalar” – both a slur for Muslims in the region and the local word for “chickpeas”.[2] The Oversight Board is designed to deliberate on these nuances and to recommend that Meta alter, correct, and add to its content policies in accordance with unique circumstances.
To date, the Oversight Board has received fourteen unique cases pertaining to situations of armed conflict and civil unrest.[3] Recent examples include considerations regarding an uploaded picture of an individual killed in Ukraine,[4] a video depicting graphic violence against a civilian in Sudan[5] and a post calling for violence in Ethiopia.[6] Of these cases, the Board has just released its decision on the case regarding violence in Ethiopia – henceforth referred to as the Tigray Communication Affairs Bureau case. This decision, together with its predecessor case on alleged crimes in Raya Kobo (2021), will be the focus of this article.
Ethiopia’s Civil War & the Oversight Board Cases on Alleged Crimes in Raya Kobo and The Tigray Communication Affairs Bureau
From November 2020 until November 2022, Ethiopia was embroiled in a civil war concentrated primarily in the Tigray region (hence its common designation, the Tigray War). The warring factions were the Ethiopian government led by Prime Minister Abiy Ahmed (with support from Eritrean forces) and the Tigray People’s Liberation Front (TPLF), a former rebel movement that ruled Ethiopia for nearly thirty years.[7] Tensions between the TPLF and the federal government had been rising since Mr. Abiy came to power in 2018[8] and reached a breaking point when the TPLF held regional elections in defiance of Mr. Abiy, who had postponed elections throughout Ethiopia. In anticipation of military mobilization by the government, Tigrayan forces launched a preemptive strike, and the situation devolved into open violence, with the Prime Minister declaring military operations against the TPLF.
The crisis in Ethiopia would become one of the biggest humanitarian crises of the last decade.[9] Since the beginning of the conflict there have been reports of mass atrocities and war crimes committed by both sides. According to reports by Amnesty International,[10] sexual violence, forced starvation, ethnic cleansing and the killing of civilians were widespread. It is estimated that the conflict in Tigray has resulted in over 500,000 casualties: 50,000 to 100,000 victims of direct killings, 150,000 to 200,000 starvation deaths, and more than 100,000 additional deaths caused by a lack of health care.[11]
Both the TPLF and the federal government (and their supporters) have taken to social media, specifically Facebook and Twitter, to dominate the discourse around the war, each side painting the other as the ultimate aggressor. According to the Media Manipulation Casebook, a general trend in posts related to the conflict in Ethiopia is the Tigray side trying to raise awareness of the situation and the government side refuting the information in those posts. In its research, the Casebook categorizes this information war as “a complex case that interacts with the geopolitics of the Horn of Africa, historical trauma, activism, hate speech, misinformation, platform manipulation, and propaganda, all in the midst of an ongoing civil conflict.”[12]
In the Alleged Crimes in Raya Kobo case (2021),[13] a Facebook user in Ethiopia made a post in Amharic levying allegations of war crimes against the Tigray People’s Liberation Front. It claimed that TPLF forces, with help from Tigrayan civilians, were raping women and children and looting civilian property in the Raya Kobo region. According to the poster, they had received this information from a resident of Raya Kobo. Meta’s automatic language detection system initially flagged the post, and Amharic-language content moderators determined on two separate occasions that it violated Meta’s Hate Speech Community Standard. When the user appealed to the Oversight Board, Meta restored the content on the grounds that it did not make discriminatory claims along ethnic lines. In its final decision, the Oversight Board found this restoration to be “lacking in detail and incorrect”. According to the Board, the post violated Facebook’s Violence and Incitement Community Standard, which prohibits “misinformation and unverifiable rumors that can contribute to the risk of imminent violence and harm”, and the Board upheld Meta’s original decision to take the post down. Referring to its previous judgment in the Armenians in Azerbaijan case[14] and citing what happened in Myanmar, the Board noted that “in situations of armed conflict in particular, the risk of hateful, dehumanizing expressions accumulating and spreading on a platform, leading to offline action affecting the right to security of person and potentially life, is especially pronounced”.
In October 2022, the Oversight Board published its decision regarding a post made by the Tigray Communication Affairs Bureau.[15] The post was made on November 5, 2021 by a page claiming in its description to be the official page of the Tigray Regional State Communications Bureau (TCAB). Written in Amharic and viewed more than 300,000 times, the post called out the “losses suffered by the Federal National Defense Forces under the leadership of Prime Minister Abiy Ahmed in the armed conflict with TPLF”. It also urged the national army to “turn its gun towards the fascist Abiy Ahmed group” and stated that “the fate of the armed forces will be death” if they did not surrender to the Tigray. After the post was reported by users and flagged by Meta’s automated systems, a review by two Amharic speakers determined that it did not violate Meta’s policies, and it was allowed to stay on Facebook. However, an expert review through Meta’s Integrity Product Operations Center (IPOC) for Ethiopia,[16] mindful of the Board’s previous decision in the Raya Kobo case, determined that the post did violate Meta’s incitement policies and should be taken down. When the case was referred to the Board, it upheld Meta’s decision to remove the content, agreeing that the post violated Meta’s Community Standard on Violence and Incitement, which prohibits “statements of intent to commit high-severity violence”.[17]
Implications of the Oversight Board Decisions
The Oversight Board’s deliberation over the aforementioned cases on Ethiopia exposed a fundamental tension between exercising ideals of free speech on social media and preserving human safety and dignity, especially where speech may exacerbate violence during armed conflict.
Independent Journalism and Reporting
In the Raya Kobo case, this tension was showcased when the original poster stated in their initial appeal that the information they had shared was “true” and that Tigrayan forces, with assistance from some civilians, were killing, looting and raping civilians in Raya Kobo.[18] They claimed the post was an attempt to spread awareness of the issue and to protect their community. When the Board rejected this appeal on the grounds that the claims in the post were unsubstantiated – they lacked specific details, and the poster was not based in the Raya Kobo region – it effectively redefined which sources and perspectives are “valid” enough to be considered newsworthy. Small independent journalists, eyewitnesses and ordinary citizens experiencing hostilities first hand – anyone unaffiliated with established news agencies – will suffer the most on this account.
Given the overall difficulty for journalists to cover news and investigate human rights violations during armed conflict, these voices on the ground are among the most important sources of reporting. The decision also has potentially grave implications for civilians speaking about human rights atrocities in other contexts of political instability, whose “unverifiable” claims may be taken down because of the potential for violence. That the poster was not based in Raya Kobo was itself one of the grounds for rejecting the post – a serious issue in instances where locals cannot post on social media because of internet restrictions and government blocking of social media during a period of conflict or unrest.
While this author ultimately agrees with the Board’s decision to take down the post because of its potentially inflammatory nature, one cannot diminish the importance of independent on-the-ground or citizen reporting in raising awareness of the myriad ways in which hostilities unfold during an armed conflict, or of people’s ability to disseminate information about these situations by word of mouth. In its decision, the Oversight Board emphasized its conscious choice to prioritize curtailing content that might incite “near-term” ethnic violence over that content’s potential news value.
Evidence of War Crimes
In the Tigray Communication Affairs Bureau case, the Board deliberated on the opposite scenario, i.e., whether taking down a post made by a purportedly official account was prejudicial to the interest of newsworthiness. Did the post violate Facebook’s Violence and Incitement Community Standard, or should it fall under Meta’s “newsworthy allowance” because it involved “official governmental speech that could be considered newsworthy”?[19] The Board determined that no such allowance could be given, citing the nature of the threat, the influential status of the speaker, the rapidly escalating situation in Ethiopia and the fact that the post did not “include sufficient information with strong public interest value”.[20]
On the positive side, this decision takes strides toward combating state-sponsored disinformation and raises important questions about the application of international human rights law and international humanitarian law to the use of social media as a propaganda machine during conflict. On the other hand, social media is also an important resource for the preservation of evidence of war crimes. Posts containing threats by a party to a conflict, in conjunction with independent research documenting that party’s atrocities, may be cited as evidence before an international tribunal. The case highlights the need for social media companies like Meta, Google and Twitter to maintain some form of secure database where evidence of war crimes can be preserved without being allowed to proliferate widely on the platform.
Transparency and International Double Standard
In its recommendations for the Tigray Communication Affairs Bureau case, the Oversight Board emphasized the need for increased transparency around the implementation of Meta’s Crisis Policy Protocol, especially with regard to the situation in Ethiopia. Doing so aligns with the international human rights obligations enshrined in Article 19 of the ICCPR, under which any restriction on freedom of expression must satisfy the criteria of legality, legitimate aim, necessity and proportionality. To that end, it is urgent for Meta to codify its crisis response in a public Crisis Policy Protocol available to users in its Transparency Center. Codification would also help prevent a double standard in content moderation, one made more obscure and variable by the high stakes of armed conflict. For example, in November 2022 the Oversight Board ruled[21] to restore a post that likened the Russian army in Ukraine to Nazis while quoting a poem that said “Kill the fascist! Kill him! Kill him!”. It simultaneously restored a picture of a dead body (no wounds visible) shot in Bucha, Ukraine. Despite the ongoing conflict between Russia and Ukraine, the Board construed the text of the poem as a “neutral reference[s] to a potential outcome”.[22] Would such a post be allowed to stay on the platform if it came from Ethiopia? Why were the kinds of factors that rendered one post an “unverifiable claim” that could incite near-term harm, and another a “threat of imminent physical harm”, in the case of Ethiopia declared “neutral” in the case of Ukraine?
Meta has been accused on numerous occasions of implementing its content moderation policies in a biased or preferential manner, especially in its allowances for graphic imagery from the Russia-Ukraine war. Meta responded to the invasion by rapidly updating protocols designed to broaden and protect the online speech of Ukrainians, specifically allowing graphic images of civilians killed by the Russian military to remain on Instagram and Facebook. Other crises, such as the Tigray War or the plight of victims of the Israel-Palestine conflict, have received no such considerations.[23] In fact, research showed that Meta’s over-moderation of Arabic-language content inadvertently suppressed Palestinian posts related to the conflict.[24]
The impact of Meta’s platforms on shaping the discourse around a conflict is undeniable. As a result, it is imperative for Meta and the Oversight Board to push for greater transparency in their content moderation policies and to apply a uniform standard to the kinds of otherwise-violating content they allow on their platforms in the interest of newsworthiness or the greater public interest. Meta must not become embroiled in inter-state politics, and it must hold all users, regardless of geographical location or ideological affiliation, to the same standards of safety, dignity and the ability to exercise their freedom of speech online.
Conclusion
As the use of social media grows ever more pervasive globally, it becomes imperative to continue studying its impact, especially in the context of wars and conflict. While the aforementioned cases offer a fascinating case study of both the downsides of unfettered free speech and the decision-making involved in determining whether a post can have disastrous repercussions, they only scratch the surface of the issues that can arise from technology allowing any type of content to be transmitted instantly across the world. For example, the cases discussed do not cover the use of social media as a means of intimidation; deadly disinformation emanating from government bodies even when it contains no death threats; state-requested censorship of legitimate human rights concerns on the grounds that such speech is inflammatory; the posting of videos and pictures of prisoners of war; and other issues that can also fall within the ambit of international humanitarian law. In recognition of the complex ways in which these issues can manifest, more social media platforms are turning towards the Oversight Board model. Twitter, for example, announced in October 2022 that it would form its own “content moderation council” to deliberate on difficult issues around freedom of expression and its implications for other human rights concerns.[25] While these independent oversight models are a step in the right direction, they still have a long way to go in applying human rights principles in a consistent, transparent and unbiased manner. Whether they will succeed in doing so without any vested powers of implementation remains to be seen.
References
[1] Milmo D, “Rohingya Sue Facebook for £150bn over Myanmar Genocide” (The Guardian, December 6, 2021) https://www.theguardian.com/technology/2021/dec/06/rohingya-sue-facebook-myanmar-genocide-us-uk-legal-action-social-media-violence
[2] “The Country Where Facebook Posts Whipped up Hate” (BBC News, September 12, 2018) https://www.bbc.com/news/blogs-trending-45449938
[3] “Oversight Board Cases” (Meta, Transparency Center) https://transparency.fb.com/en-gb/oversight/oversight-board-cases/
[4] “Oversight Board Selects a Case Regarding a Photo of an Individual Killed in Ukraine during the Russian Invasion with Text Containing an Excerpt from a World War II Era Poem” (Meta, Transparency Center) https://transparency.fb.com/en-gb/oversight/oversight-board-cases/photo-with-world-war-II-era-poem/
[5] “Graphic Video Depicting a Civilian Victim of Violence in Sudan” (Meta, Transparency Center) https://transparency.fb.com/en-gb/oversight/oversight-board-cases/graphic-video
[6] “Oversight Board Selects a Case Regarding a Post Calling for Violence in Ethiopia” (Meta, Transparency Center) https://transparency.fb.com/en-gb/oversight/oversight-board-cases/violence-in-ethiopia
[7] Walsh D and Dahir AL, “Why Is Ethiopia at War with Itself?” (The New York Times, November 5, 2020) https://www.nytimes.com/article/ethiopia-tigray-conflict-explained.html
[8] Ibid.
[9] “Humanitarian Emergencies around the World: USA FOR UNHCR” (Humanitarian Emergencies Around the World | USA for UNHCR) https://www.unrefugees.org/emergencies/#:~:text=After%20over%20a%20decade%20of,people%20who%20are%20internally%20displaced
[10] “Ethiopia: Investigation Reveals Evidence That Scores of Civilians Were Killed in Massacre in Tigray State” (Amnesty International, April 6, 2022) https://www.amnesty.org/en/latest/news/2020/11/ethiopia-investigation-reveals-evidence-that-scores-of-civilians-were-killed-in-massacre-in-tigray-state/ and “Ethiopia: Troops and Militia Rape, Abduct Women and Girls in Tigray Conflict – New Report” (Amnesty International, November 1, 2021) https://www.amnesty.org/en/latest/news/2021/08/ethiopia-troops-and-militia-rape-abduct-women-and-girls-in-tigray-conflict-new-report/
[11] York G, “Tigray War Has Seen up to Half a Million Dead from Violence and Starvation, Say Researchers” (The Globe and Mail, March 16, 2022) https://www.theglobeandmail.com/world/article-tigray-war-has-seen-up-to-half-a-million-dead-from-violence-and/
[12] Scott L, “How Social Media Became a Battleground in the Tigray Conflict” (VOA, October 19, 2021) https://www.voanews.com/a/how-social-media-became-a-battleground-in-the-tigray-conflict-/6272834.htm
[13] “Meta Oversight Board – Alleged Crimes in Raya Kobo Decision (2021-014-FB-UA)” (Oversight Board) https://www.oversightboard.com/decision/FB-MP4ZC4CC
[14] “Meta Oversight Board – Armenians in Azerbaijan Decision (2020-003-FB-UA)” (Oversight Board) https://www.oversightboard.com/decision/FB-QBJDASCV
[15] “Meta Oversight Board – Tigray Communication Affairs Bureau Decision (2022-006-FB-MR)” (Oversight Board) https://www.oversightboard.com/decision/FB-E1154YLY
[16] IPOCs are teams Meta convenes to improve content moderation in high-risk situations.
[17]“Meta Oversight Board – Tigray Communication Affairs Bureau Decision (2022-006-FB-MR)” (Oversight Board) https://www.oversightboard.com/decision/FB-E1154YLY
[18] “Meta Oversight Board – Alleged Crimes in Raya Kobo Decision (2021-014-FB-UA)” (Oversight Board) https://www.oversightboard.com/decision/FB-MP4ZC4CC
[19] “Meta Submission to the UN Special Rapporteur on Freedom of Expression on Challenges and Approaches to Misinformation, Disinformation and Propaganda in Times of Conflict.” (OHCHR, 2022) https://www.ohchr.org/sites/default/files/documents/issues/expression/cfis/conflict/2022-10-07/submission-disinformation-and-freedom-of-expression-during-armed-conflict-UNGA77-cso-derechos-digitales.pdf
[20] “Oversight Board Case of Tigray Communication Affairs Bureau” (Global Freedom of Expression, August 19, 2021) https://globalfreedomofexpression.columbia.edu/cases/oversight-board-case-of-tigray-communication-affairs-bureau/
[21] “Oversight Board Overturns Meta’s Original Decision in ‘Russian Poem’ Case” (Oversight Board) https://www.oversightboard.com/news/485045113689422-oversight-board-overturns-meta-s-original-decision-in-russian-poem-case/
[22] Ibid.
[23] Biddle S and Speri A, “Facebook Tells Moderators to Allow Graphic Images of Russian Airstrikes but Censors Israeli Attacks” (The Intercept, August 27, 2022) https://theintercept.com/2022/08/27/facebook-instagram-meta-russia-ukraine-war/
[24] Wong Q, “Facebook Parent Meta Impacted Palestinians’ Human Rights, Report Says” (CNET, September 27, 2022) https://www.cnet.com/news/social-media/facebook-parent-meta-impacted-palestinians-human-rights-report-says/
[25] Elon Musk tweet on forming a “content moderation council” (Twitter, October 28, 2022) https://twitter.com/elonmusk/status/1586059953311137792