
Background 

In 2011, Myanmar emerged from decades of military rule, but religious violence has hindered the country’s transition to democracy. Violent clashes between ethnic Rakhine (Buddhists) and Rohingya (Muslims) have killed hundreds of people and displaced hundreds of thousands more. As of 2012, Myanmar was the least connected country on earth, with only 1% of its population using the internet; junta rule had kept the country isolated from the world. The situation changed in 2013, when a quasi-civilian regime oversaw the deregulation of the telecommunications industry. As the price of a SIM card dropped from $200 to as little as $2, more and more people bought them; by 2016, nearly 50% of the population had access to mobile internet. Many new users came to see Facebook as an all-in-one gateway to the internet, and the app spread rapidly.

In 2015, six of the ten web pages in Myanmar getting the most engagement on Facebook belonged to legitimate media sources; however, after Facebook opened Instant Articles, a program that lets publishers monetize their content, to international publishers, legitimate media in Myanmar accounted for zero of those pages by 2018. This rapid degradation of the digital environment occurred against the backdrop of the crackdown on the Rohingya and anti-Muslim propaganda. The sheer volume of hate speech and disinformation added fuel to the fire of ethnic and religious hatred.

An investigation by United Nations human rights investigators concluded in 2018 that Facebook had played a determining role in the genocide against Rohingya Muslims. A few months later, Facebook acknowledged that it had not done enough to prevent the platform from being used to incite violence in the real world. Moreover, a whistleblower, Frances Haugen, provided Congress with internal documents showing that Facebook’s flawed algorithms amplified inflammatory content and that the company failed to moderate hate speech and misinformation outside the US and Europe.

Hate speech in Myanmar continues to thrive on Facebook

Years after being scrutinized for fomenting violence against Rohingya Muslims, Facebook has still not succeeded in moderating its content. While the company pledged to step up its efforts and develop effective tools and strategies to deal with the issue, violations persist. In the aftermath of the military coup in Myanmar on 1st February 2021, hostile actors have exploited the platform to accelerate gruesome human rights atrocities within the country.

While using Facebook, it is not hard to find posts intended to incite harm against the Rohingya, including threats of rape or murder. Some posts refer to Muslims as dogs, maggots, or rapists, and suggest that they be fed to pigs or exterminated. Some pages also feature crudely pornographic or anti-Muslim content, including photographs and videos. On 24th October, a user who supported military rule uploaded a 2.5-minute video calling for the violent suppression of the opposition; it attracted more than 55,000 views. The Burmese man in the video declared himself the ‘god of death’ for all Rohingya Muslims while staring into the camera. Another user posted the address of a military defector online, along with photos of his wife. Several other posts show Muslims blindfolded and dragged down dirt roads, with captions reading ‘Don’t catch them alive.’ In extreme cases, users suggested that the Rohingya should suffer the same fate as the Jews under Hitler. These are only a few examples out of thousands of such posts, comments, photos, and videos, almost all of them written in Burmese.

Facebook’s rules forbid attacks on particular ethnic groups, dehumanizing speech, and comparisons of people to animals; the company also has a strict policy against graphic content. Yet despite claiming for years to devote significant resources to hate speech in Myanmar, as late as 2015 Facebook employed only two people who could read or speak Burmese. Before that, most of the people reviewing Burmese content were English speakers; in Manila, the original site of the outsourced moderation operation known as Project Honey Badger, there were no reviewers who spoke Burmese. Even today, the platform relies heavily on users to report hateful content because its automated systems struggle to parse the Burmese language.

Rohingya refugees sue Facebook for $150 billion

On 6th December 2021, Rohingya refugees in the US and UK sued Facebook for $150 billion, claiming that the social media giant failed to curb hate speech against the Rohingya Muslim minority. The complaint, lodged in a California court, asserts that Facebook promotes misinformation and extremism that translates into real-life violence. The law firms Edelson PC and Fields PLLC are backing the Rohingya community’s case and have initiated a trans-Atlantic legal campaign against Facebook.

The group is demanding $150 billion in compensation, arguing that Facebook bears responsibility for fostering a safe environment on its platform. The British law firm wrote a letter to Facebook accusing the company of the following:

  • Its algorithms have amplified and promoted posts containing hate speech;
  • The company’s failure to invest in reviewers and moderators familiar with the political situation in Myanmar;
  • Its failure to remove graphic content and to delete profiles or pages that support violence against Muslims;
  • Its failure to act in a timely manner and take appropriate steps, despite repeated warnings from media and charity organizations.

In the letter, addressed to Facebook’s London office, the clients state their intention to bring proceedings against the social media giant in the High Court over actions that caused fatal harm to Rohingya Muslims. The British law firm said its clients wish to remain anonymous, mainly out of fear of reprisal.

Has Facebook lost the war on hate speech in Myanmar? – A legal perspective

The American suit against Facebook was filed first; however, the case faces several hurdles before it can reach trial, let alone secure a favorable ruling. For a suit to be certified as a ‘class action’ by a judge, the plaintiffs must have suffered a ‘common’ problem. In the case of Myanmar, the experiences of the people who could potentially form a ‘class’ are likely to vary widely. It is difficult to imagine evidence, shared across the class, that would establish that Facebook’s actions harmed each individual class member.

The case may also be legally tricky. For example, Facebook has asserted that it is shielded from liability for content posted by internet users under a provision of US internet law known as Section 230, which states that online platforms are not accountable for material posted by third parties. The complaint, however, seeks to apply Myanmar law to the claims if Section 230 is raised as a defense.

Although US courts can apply foreign law to cases in which the harm caused by a company occurred abroad, some legal experts say they do not recall any successful precedent for foreign law being invoked in a lawsuit against a social media company that could claim protection under Section 230. Commenting on the attempt to invoke Myanmar law, legal scholar Anupam Chander said that while the approach was not inappropriate, it was unlikely to succeed, because it would be atypical for Congress to have foreclosed a particular action under US law while permitting it to proceed under foreign law.

Mark Zuckerberg’s response to the situation

Facebook CEO Mark Zuckerberg released a 1,300-word Facebook post criticizing former Facebook employee Frances Haugen’s testimony that the company placed profit over people’s safety, permitting its algorithms to cause harm in the pursuit of growth. Zuckerberg argued that a false picture of the company was being painted and its reputation tarnished, asserting that most of the claims put forward did not make any sense.

However, the company has acknowledged that it did not do enough to prevent Facebook from being used as a platform to incite bloodshed and violence; Zuckerberg later issued a public apology and promised to strengthen moderation efforts. Facebook executives have also hired more people with language and subject-matter expertise for countries like Myanmar, including 12 Burmese-speaking moderators.

Most recently, Facebook expanded its ban on content associated with Myanmar’s military rule; the ban now extends to profiles, pages, and groups representing military-controlled businesses. Facebook had previously banned advertising from such businesses as part of its effort to combat the use of fake accounts engaged in what is commonly known as ‘information warfare.’

The entire fiasco suggests that the damage done by popular social media platforms has real-world consequences. Although Facebook is reviewing its policies, one weakness in its response is that repeat offenders can easily return to the platform under a new identity after being banned. Tackling a problem of this scale will require a broader and more comprehensive response on Facebook’s part.

Written by: Olivium Staff

