
Facebook has continued to fail Myanmar. Now its people have to pay the price.

An analysis of a decade of Facebook’s willful negligence in Myanmar and its consequences.

With over half of Myanmar’s population using Facebook, the platform is synonymous with the internet for many residents in the country, and many users consider the posts and pages on their news feed to be reliable news sources. But Facebook has time and again failed to take appropriate measures to fully eradicate hateful content and disinformation found in these posts, actively harming users and non-users in the country alike. 

Approximately 88% of Myanmar’s population practices Buddhism, with only 4% following Islam. Rakhine State is home to the Rohingya minority, the country’s largest Muslim population. Many in the country falsely believe Rohingya Muslims are “illegal immigrants” from Bangladesh, and religious tensions -- fueled by anti-Muslim sentiment on Facebook -- have remained high in recent years. The nation’s military has exacerbated these tensions to incite violence and justify its control of the country -- also using Facebook as a vehicle for fanning the flames. 

On February 1, Myanmar’s military seized full control of the country’s government and detained civilian leader Aung San Suu Kyi in a move which has been widely recognized as a “coup.” (Suu Kyi herself had been widely criticized for amplifying hate speech against the Rohingyas while also allowing mass ethnic cleansing to occur.) According to a Reuters investigation, in the weeks leading up to the takeover, dozens of Facebook accounts and pages alleged election fraud and called for military intervention. These posts were widely circulated before Facebook intervened. 

In 2017, the company faced significant criticism after it neglected to take action against incitement and hate speech during a spike in violence against the Rohingya. Facebook's failure to properly address these issues in a timely manner -- despite forewarning -- was the culmination of a larger pattern of negligence over the company’s decadelong presence in the country. 

Facebook’s lack of accountability in the region has resulted in extreme violence and countless deaths. Yet it continues to quell its critics with empty promises to “do better” in the future rather than investing resources in hiring additional moderators and properly identifying -- and addressing -- hate speech on the platform. 

Below, Media Matters has compiled a timeline of Facebook’s presence in Myanmar, cataloguing its many failures to protect users and its contributions to destabilizing the country. 

  • 2010 - 2012

    • Facebook entered Myanmar in 2010, around the time the internet restrictions implemented by the country's military were being lifted. The platform originally let users access its mobile app without incurring data charges, which drove its rapid rise in popularity within just a few years. 
    • In 2011, the military junta gave up power, allowing the country to return to a form of civilian rule for the first time since 1962. This facilitated the liberalization of Myanmar’s telecommunications sector and consequently initiated the start of Facebook’s widespread reach in the country.
    • Myanmar declared a state of emergency on June 10, 2012, following clashes between Buddhists and Muslims that resulted in the deaths of 17 people. Even before Facebook became widely accessible in the country, New Mandala, a forum for analysis on Southeast Asia, noted an “abundance of hate posts on Burmese language Facebook nodes,” and there were fears that the platform could exacerbate the country's existing tensions and lead to mass violence.
  • 2013

    • In 2013, Facebook became more widely accessible in Myanmar when the government further liberalized the telecommunication sector. This lowered the price of SIM cards from thousands of dollars in 2011 to as little as $3. Since other big platforms -- such as Google -- did not yet have the capability to support Burmese script, Facebook’s popularity skyrocketed. An editorial in the New Light of Myanmar, a state-run newspaper, claimed, “It is said in Myanmar that a person without a Facebook identity is like a person without a home address.”

      Thet Swei Win, the director of Synergy, a group working to promote social harmony in Myanmar, claimed that since the population had such limited access to the internet prior to Facebook’s introduction, the public was more susceptible to misinformation and propaganda. “We have no internet literacy,” he told BBC Trending in 2018. “We have no proper education on how to use the internet, how to filter the news, how to use the internet effectively. We did not have that kind of knowledge.”

    • In a November 2013 meeting, Aela Callan, a foreign correspondent on a fellowship from Stanford University, warned Elliot Schrage, the vice president of global communications for Facebook, about the prevalence of fake user pages and prolific hate speech on the platform. However, as Wired reported, the company’s excitement over the potential of the Myanmar market seemingly overrode these concerns. Emails show that Schrage never connected Callan with anyone at Facebook who could address these problems. Instead, Callan was connected with two Facebook officials who worked for Internet.org, a controversial initiative to bring basic internet access to developing countries with no additional data charges. Although one of the two officials worked with “civil-society organizations to assist the company in coping with hate speech,” Facebook had just one Burmese speaker at the time to review Burmese-language content flagged as questionable -- limiting the efficacy of any moderation efforts.
  • 2014

    • In the summer of 2014, a Buddhist woman was bribed to falsely report that two of her Muslim co-workers had raped her in Mandalay, Myanmar. Although this allegation was fabricated, it gained thousands of interactions after the infamous Buddhist monk Ashin Wirathu published it on his Facebook page. Wirathu, the self-proclaimed “Burmese bin Laden” and spiritual leader of the Ma Ba Tha, an ultranationalist Buddhist movement, is known for his prominent anti-Muslim rhetoric and has instigated several instances of hate speech against the Rohingya Muslim population. As the accusation spread, Buddhist mobs began swarming the streets and circling the local mosque and Muslim-owned businesses -- including the shop where the accused men worked. Cries of “Kill the Kalars” (“kalar” is a derogatory term for Muslims) reportedly rang through the streets. Within 48 hours, as Reuters reported, “one Buddhist and one Muslim man were killed ... dozens were injured, Muslim shops ransacked, and a mosque badly burned.” Officials feared the violence would spread beyond Mandalay, prompting them to implement a curfew on both the city and its surrounding towns. 
    • Amid the chaos, Zaw Htay, a senior official in President Thein Sein’s office, asked Chris Tun, a longtime member of the country’s technology community who was involved with a U.S.-based lobbying group, to try to contact any representative from Facebook to see if anything could be done to mitigate the spread of misinformation and the escalation of tensions. Tun couldn’t reach any company officials despite several attempts, leading the president’s office to temporarily block Facebook in the Mandalay area. That decision was seemingly effective in stopping the clashes by limiting the spread of misinformation. But within a few hours of the block, Tun found he had received numerous emails from Facebook officials asking about the site being inaccessible.
    • On July 20, 2014, two weeks after the Mandalay riots, members of the country’s technology scene convened to discuss what led to the clashes and how to prevent something similar from happening in the future. Most questions were directed to Facebook’s director of policy for the Asia-Pacific region, Mia Garlick. According to Wired, Garlick said “that in response to the violence the company planned to speed up translation of the sites’ user guidelines and code of conduct into Burmese.” Despite this promise, the Burmese-language community standards would not launch until September 2015, 14 months after the meeting. 
    • Between March and December 2014, Matt Schissler, a doctoral student at the University of Michigan, spoke with several Facebook officials regarding the platform’s role in spreading hate speech and misinformation within Myanmar. He and other human rights activists discussed the use of fake accounts and provided specific examples of incitement, including a Facebook page called “We will genocide all of the Muslims and feed them to the dogs.” According to Reuters, though that page was removed, plenty of other dangerous content frequently slipped past the company’s so-called “multi-step reporting system for problematic content.”
  • 2015

    • At the start of 2015, with approximately 7.3 million users on the platform in the country, Facebook had only two Burmese-speaking reviewers dedicated to problematic posts. Prior to that, most of the people reviewing content from Myanmar spoke English rather than Burmese. Even by the end of the year, Facebook still had only four Burmese speakers reviewing the content.
    • In March 2015, University of Michigan’s Matt Schissler spoke to Facebook officials at the company’s California headquarters about anti-Muslim violence in Myanmar and how the platform was being exploited to foment hatred. According to him, more than a dozen officials attended his talk. Two months later, entrepreneur David Madden gave a presentation at the same headquarters on the same subject. As he told Reuters, “The whole point of this presentation was really just to sound the alarm, to show very vividly the context in which Facebook was operating, and already the evidence of how it was being misused.” (Madden’s work in Myanmar would later become integral to identifying and removing hateful rhetoric on the platform.)
    • “In early 2015, in collaboration with local tech civil society groups,” Wired reported, “Facebook made a set of digital stickers available on the messenger platform. The stickers were part of the broader ‘Panzagar’ campaign, ‘flower speech’ in English, launched by activists, including former political prisoners, to counter hate speech and promote online inclusion.” Though the project was deemed fairly popular, many claimed it was equivalent to putting a Band-Aid on a bullet wound; Facebook still refused to properly address the underlying problems on its platform and seemingly focused on expanding services with few safeguards and little to no regard for the impact the platform had on people’s lives. “People gave [Facebook] a lot of credit for that, but it seemed to be the smallest gesture to be made,” a U.S. tech company official who had worked in Myanmar told Wired. “People died, but now you can use this digital sticker.”
  • 2016

    • In 2016, several studies were released detailing how significant Facebook’s presence in Myanmar had become. According to LIRNEasia, a think tank conducting research on the digital sector, Myanmar appeared to have more Facebook users than at least four major South Asian countries, including India and Pakistan. Reuters reported that Facebook’s user base in the country jumped from 7.3 million in 2015 to 11 million in 2016. Another 2016 study stated that many people in Myanmar considered the platform to be the only entry point for online information. Despite the power Facebook held in the country, efforts to ensure the platform actually served as a reliable news source and spread accurate information were entirely unsuccessful.
    • An exhaustive February 2016 study by C4ADS, a nonprofit organization focused on data-driven analysis of global conflict and transnational security issues, detailed specific patterns of hate speech displayed on social media platforms within Myanmar -- Facebook included. The 62-page report identified and analyzed the public content of 100 Facebook accounts, primarily consisting of Ma Ba Tha monks, government officials, and politicians. Researchers found that these accounts amplified false rhetoric about the Muslim minority in Myanmar, oftentimes boosting dangerous narratives.

      In one example, researchers detailed how the concept of interfaith marriage, which is widely frowned upon by the Ma Ba Tha monks, was weaponized by these users on Facebook. A widely circulated Facebook post showed an image of a beheaded woman -- apparently a victim of cartel violence in Mexico -- with a caption stating that “if you marry a Muslim man, this is what happens to you,” in order to foment fear. This is just one of hundreds of examples included in the study, explicitly displaying Facebook’s lack of accountability regarding the content on its platform. 

    • Despite prior warnings, Facebook’s investments in automated translations of Burmese into English were insufficient at the time, and hate speech and incendiary content (like the example above) often slipped past automated content detection: “A company official acknowledged to Reuters that its systems have difficulty interpreting Burmese script because of the way the fonts are often rendered on computer screens, making it difficult to identify racial slurs and other hate speech.” In one example, Reuters identified a post that said in Burmese, “Kill all the kalars [Muslims] that you see in Myanmar; none of them should be left alive.” It was auto-translated to English by Facebook to say, “I shouldn’t have a rainbow in Myanmar.”
    • In October 2016, following an attack on a security post along the border with Bangladesh that left nine officers dead, the Myanmar army began a monthslong crackdown on “unarmed Muslim civilians, causing more than one thousand civilian deaths and driving tens of thousands more to flee their homes in search of safety.” Military leaders took to Facebook to circulate hateful disinformation against Rohingya Muslims to further justify their actions to their followers.
  • 2017

    • In late January 2017, Ko Ni, one of the country’s most prominent lawyers and an adviser to Aung San Suu Kyi, was assassinated by a gunman who was suspected to have been hired by former military officers. Ko Ni was Muslim and best known for his vocal support for constitutional reforms within Myanmar, “including rewriting the constitution and granting full citizenship to ethnic minorities including Rohingya Muslims.” His daughter later told Reuters that in the months leading up to his death, Ko Ni had been repeatedly threatened over his work. Ohn Maung, a well-known legal adviser who had been friends with Ko Ni for years, said, “He was targeted by a certain group of people on Facebook who just didn’t like him because of his religion and the role he was playing in government.”
    • By 2017, Myanmar had 15 million users on Facebook, roughly double the platform’s 2015 total. However, as the number of real users grew, so too did the number of fake accounts on the site. The Toda Peace Institute, a nonpartisan group focused on policy-oriented peace research, reported that government officials, politicians, and influencers used both their personal and fake accounts to spread disinformation on Facebook to advance their own political agendas and stoke nationalism within Myanmar. Even though several human rights and civil society groups kept warning the company of the impending danger of this pattern of usage, Facebook still chose not to take preemptive action.
    • The military’s intelligence arm used Facebook to spread false rumors targeted at both Buddhist and Muslim groups, claiming that an attack by one against the other was imminent. Buddhists were warned that Muslims would carry out “jihad attacks,” while Muslim groups were told that nationalist monks were organizing anti-Muslim protests. These “warnings” spread on Facebook Messenger “via widely followed accounts masquerading as news sites and celebrity fan pages,” according to The New York Times. Researchers observing these tactics said the goal was to generate widespread fear and a sense of vulnerability that only military protection could resolve. According to the Times, “Facebook said it had found evidence that the messages were being intentionally spread by inauthentic accounts and took some down at the time. It did not investigate any link to the military at that point.”
    • In August 2017, as a conflict between Myanmar security forces and Rohingya militias escalated, tens of thousands of Rohingya Muslims were displaced, attempting to find refuge across the border in Bangladesh. Bangladeshi aid workers claimed that within 24 hours, 70,000 Rohingya -- almost 10% of the entire Rohingya population -- had crossed the border to escape the violence. Many Rohingya provided harrowing testimonies of rapes and executions carried out by Myanmar security forces.

      As the army engaged in the “ethnic cleansing” of the Rohingya, Facebook ordered moderators to delete any content “by or praising” the Arakan Rohingya Salvation Army (Arsa), a Rohingya insurgent group. The platform designated it a “dangerous organization,” which it defined, according to The Guardian, as a group engaged in “terrorism, organized violence or crime, mass murder, or organized hate.” Myanmar’s military, which had engaged in a “textbook example of ethnic cleansing” according to a senior United Nations official, meanwhile operated a verified Facebook page with 2.6 million followers. Several other government pages accused of stoking “anti-Rohingya fervor,” including posts by Aung San Suu Kyi, were also not held accountable for their role in inciting human rights abuses. Facebook’s acknowledgement that it banned Arsa came amid criticism from observers, one of whom told The Guardian that the company was trying to “suppress freedom [of] expression and dissent by colluding with the genocidaires in [the] Myanmar regime.”

  • 2018

    • Seven months after the deadly August 2017 crackdown, the United Nations revealed that, while investigating the possibility of a genocide in Myanmar, it had come to the conclusion that Facebook played an integral role in spreading hate speech. Marzuki Darusman, the chairperson of the U.N. Independent International Fact-Finding Mission on Myanmar, told reporters that social media has “substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public. Hate speech is certainly of course a part of that. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media.” Only a month earlier, Facebook had finally banned Ma Ba Tha leader Ashin Wirathu’s account after allowing it to spread and fuel hateful rhetoric for years with impunity.
    • In April 2018, Facebook CEO Mark Zuckerberg spoke with Vox, perplexingly crediting his company for the steps it took to detect and quash hate speech in Myanmar. Civil society groups, such as David Madden’s Phandeeyar, were quick to call him out on this mischaracterization. In an open letter, Phandeeyar and a number of other civil society groups criticized Zuckerberg for claiming their hard work as Facebook’s efforts, detailing the blatant disregard Facebook had shown for improving the platform’s operations within the country.
    • Zuckerberg apologized to the organizations in a response letter and claimed the platform was focused on using “artificial intelligence to help us better identify abusive, hateful or false content even before it is flagged by our community.” The groups were quick to respond, calling Facebook’s handling of hate speech in Myanmar “grossly insufficient” and noting that Zuckerberg’s response actually reinforced their belief that Facebook was “not doing anywhere near as much as they should and could do to prevent the spread of hatred in Myanmar.”
    • In an August 16, 2018 statement, Facebook admitted it was “too slow” in addressing hate speech in Myanmar after an extensive Reuters investigation revealed how the company had failed to properly flag violent posts, comments, images, and videos attacking the Rohingya and other Muslims. At this point, nearly 700,000 Rohingya had fled their homes due to the 2017 crackdown.
    • Shortly after Facebook released its statement, the United Nations came out with a report calling for Myanmar generals to be tried for genocide and citing Facebook for its role in inciting violence and hatred in the country. Investigators for the U.N.’s fact-finding mission on Myanmar criticized Facebook for letting the platform be used as a tool for incitement and fear for months, if not years, including by the government of Myanmar’s de facto civilian leader Aung San Suu Kyi. The report claimed that the government's response to hate speech had been “inadequate” and that it “contributed to the commission of atrocity crimes” by failing to shield the Rohingya minority from war crimes.
    • On August 28, 2018, one day after Reuters reported on the damning U.N. report, Facebook announced that it had removed several military officials’ accounts. These accounts had massive followings, which had grown over the span of several years and had been integral to the spread of misinformation and hate speech against the Rohingya. This process continued for months, as Facebook kept identifying and removing hundreds of fake accounts created by military personnel. Facebook’s head of cybersecurity policy, Nathaniel Gleicher, said the platform had found “clear and deliberate attempts to covertly spread propaganda that were directly linked to the Myanmar military.”
    • In the fall of 2018, Facebook commissioned an independent assessment of its impact in Myanmar. According to the platform, “The report concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more.”
  • 2019

    • Facebook baffled local civil society organizations in 2019 when it banned four separatist groups from the site. This decision drew criticism for a multitude of reasons, including claims that these removals were tantamount to “tipping the scales” of online discourse toward the military and providing a “big boost for the government.” Facebook announced that it had designated four ethnic groups -- the Arakan Army, the Myanmar National Democratic Alliance Army, the Kachin Independence Army, and the Ta’ang National Liberation Army -- as “dangerous organizations.” This categorization (and the removals) raised questions about the platform’s decision-making process. Myat Thu, a research adviser for the Myanmar Tech Accountability Network, told The Guardian, “This is a perfect example of why Facebook should have Myanmar [Burmese] speaking staff. A blogpost in English does not constitute adequate information.”

      Many were also concerned by the blatant laziness of the decision: Instead of removing specific content that violated its terms of service, the platform chose to remove whole pages. Aaron L. Connelly, a Southeast Asian politics and foreign policy expert at the International Institute for Strategic Studies, told CNN, “It’s not clear to me that Facebook has any underlying principles by which it is judging [the] legitimacy [of these groups].”

    • In June 2019, Foreign Policy reported that “within days” of the platform imposing the ban on the four separatist groups, “it was clear Facebook had failed.” According to the magazine, “In the months since the ban, dozens of newly created pages that appear to support the insurgents, and the Arakan Army in particular, continue to share unambiguous propaganda.”
  • 2020

    • In 2020, the West African nation of Gambia began compiling evidence to hold Myanmar’s military accountable on charges of genocide against the Rohingya population at the International Court of Justice. In June, Gambia filed an application in U.S. federal court seeking information from Facebook that could further aid this investigation. Specifically, the application called for documents and communications from hundreds of pages and accounts, including some belonging to Myanmar military officials, that had been deleted from the platform but preserved by the company. However, Facebook refused to cooperate and opposed Gambia’s application, claiming that the request was “extraordinarily broad,” “unduly intrusive or burdensome,” and failed to “identify accounts with sufficient specificity.” On the contrary, Gambia had named “17 officials, two military units and dozens of pages and accounts.”
    • Myanmar held only its second democratic election in the country’s recent history on November 8, 2020. In the leadup, many were concerned about the widespread reach of online misinformation. In response, Facebook began piloting a new, more hands-on approach to dealing with political misinformation and hate speech prior to Election Day. This included “limiting users’ ability to reshare old pictures without context -- a common misinformation tactic -- and working with local partners to verify political parties’ pages and fact-check stories.” Most importantly, Facebook expanded its community standards to cover anything that could “suppress the vote or damage the integrity of the electoral process.” This was the first time the platform had ever created country-specific community standards. Following these changes, there was a noticeable difference in the circulation of information on the platform, and many civil society groups warily applauded the platform for finally taking sufficient action. “We need to give them credit where it’s due. They did pretty well,” Victoire Rio, an adviser to the Myanmar Tech Accountability Network, said. “But if anything, it shows that when they want to, they can.”
  • 2021

    • In January 2021, Facebook’s Oversight Board -- a supposedly independent body that evaluates decisions by the platform to remove certain kinds of content -- reinstated an anti-Muslim post from a user in Myanmar showing images of a dead Muslim child with the text that the board translated as: “Those male Muslims have something wrong in their mindset.” American national civil rights organization Muslim Advocates was quick to call this out, shaming the platform for amplifying anti-Muslim rhetoric:

      Facebook’s Oversight Board bent over backwards to excuse hate in Myanmar—a country where Facebook has been complicit in a genocide against Muslims. It’s impossible to square Mark Zuckerberg’s claim that Facebook does not profit from hate with the board’s decision to protect a post showing images of a dead Muslim child with a caption stating that ‘Muslims have something wrong in their mindset.’ It is clear that the Oversight Board is here to launder responsibility for Zuckerberg and Sheryl Sandberg. Instead of taking meaningful action to curb dangerous hate speech on the platform, Facebook punted responsibility to a third party board that used laughable technicalities to protect anti-Muslim hate content that contributes to genocide.

    • On February 1, Myanmar’s armed forces announced that power had been handed over to Commander-in-Chief Min Aung Hlaing. The military detained Aung San Suu Kyi, President Win Myint, and several other leaders of the National League for Democracy in an act that has been widely recognized as a “coup d’état.” According to the BBC, the timing of the coup is easily explained by the fact that the first session of parliament was due to start, “which would have enshrined the election result by approving the next government.” On February 3, at the behest of the military, local telecom firms temporarily blocked Facebook as a means of stopping users and civil society groups from sharing information and organizing against the coup.
    • On February 4, Reuters published an investigation that uncovered dozens of pages and accounts alleging election fraud on Facebook -- the reason given by the army for seizing power. Despite Facebook receiving praise for its actions during Myanmar’s election, Reuters found a number of posts dating back to October that contained misinformation about the election. According to the review, these allegations continued into late January, and “in the 48 hours before the coup, many of the pages called for military intervention.” In the aftermath of the coup, Reuters found that posts justifying the takeover and denouncing Suu Kyi also began circulating on the platform. Although Facebook claims it is treating Myanmar as an emergency and is attempting to reduce the distribution of “content in Myanmar that likely violates our hate speech and incitement policies … as well as content that explicitly praises or supports the coup,” this is likely too little, too late.
    • On February 21, a day after two protesters were shot and killed by police at a demonstration against the coup, Facebook removed the main page of the Myanmar military for violating its standards prohibiting “the incitement of violence,” Reuters reported. “In line with our global policies, we’ve removed the Tatmadaw True News Information Team Page from Facebook for repeated violations of our Community Standards prohibiting incitement of violence and coordinating harm,” a Facebook spokesperson said in a statement to Reuters. (The Myanmar army is formally known as Tatmadaw.) On February 24, Facebook released a statement officially banning the military and “military-controlled state and media entities from Facebook and Instagram, as well as ads from military-linked commercial entities.”