Myanmar’s social media genocide

Rohingyas take part in a "Genocide Remembrance Day" rally at a refugee camp in Ukhia, Bangladesh, on Aug. 25, 2022. (AFP)

Social media is a powerful tool, for both good and ill. In Myanmar, social media — and Facebook in particular — has been put to cruel and dangerous use.

In Myanmar, as in other countries, racists and bigots took to Facebook to spread their ideas in the years before the most recent episode of genocide against the Rohingya Muslims. Celebrity Buddhist monks, most notably Ashin Wirathu, used the platform to spread hatred and lies about the Rohingya.

Wirathu posted incendiary racist sermons to his millions of followers, in which he called the Rohingya “Bengalis” in a bid to convince his fellow Burmese that they were foreign interlopers rather than fellow citizens; claimed that the Rohingya were responsible for spates of crimes across the country; and demanded that they be driven out of the country, whatever that required.

So virulent was his rhetoric that Facebook eventually, in 2018, blocked Wirathu’s page for violating its terms of service relating to the promotion of violence. This came after widespread condemnation of Facebook, including a research paper from two Australian academics, who claimed that the site was “helping fuel a genocide against the Rohingya people.”

Wirathu was not alone. Before the most recent episode of genocide began in 2017, when hundreds of thousands of Rohingya were forced from their homes, murdered in large numbers and driven out of the country, content denouncing the Rohingya as foreign invaders, serial criminals and terrorists was common online. This was notably true on Facebook, a service that, for many people in Myanmar, was their first gateway to the internet.

Researchers allege that low digital literacy among newly online Burmese users meant that wild claims and deranged rhetoric traveled further in Myanmar than they would have in other, more skeptical markets.

The Myanmar Internet Project, a collective of scholars and activists, documented the beginnings of an explicitly anti-Rohingya campaign. Facebook, the project’s report notes, “was instrumental to the emergence of a mass Buddhist nationalist movement, which grew from 2012 to 2015 to encompass hundreds of thousands of members across the country and came to be known as Ma Ba Tha (Patriotic Association of Myanmar).”

It added: “Ma Ba Tha made extensive use of Facebook, leveraging the platform to build hundreds of local chapters, recruit members, fundraise, organize protests and events and run campaigns.”

Those campaigns were not intended to promote civic peace. The Australian academics described a cycle, which they said began with “explicitly racist political cartoons, falsified images and staged news reports.” Think about that — explicit fake news demonizing a minority.

The academics continued: “This content goes viral, normalizing hate speech and shaping public perception. Violence against Rohingya people is increasingly welcomed, and then celebrated online.”

After that came violence, facilitated by hateful rhetoric and lies that spread like wildfire through dry brush.

The situation was already bad, but it was made worse by what Frances Haugen, a former Facebook employee, claimed in 2021 was the company's known "neglect" of languages other than English. She claimed that Facebook assigned fewer moderators to these markets because they were less profitable than the English-language world and that, as a result, violent and hateful content went largely unpoliced, even though many millions of citizens of Myanmar were online and being influenced by these campaigns.

Hateful rhetoric and demands for violence and genocide, written in Burmese, were not moderated as stringently as they would have been had they been written in English.

These things have a way of spreading even in heavily moderated markets. In Myanmar, researchers allege, violent rhetoric spread like a plague, with government sanction.

When the armed forces began clearing Rohingya villages and killing their inhabitants, they did so on the back of years of vicious rumors, snide insinuations and outright lies. As the genocide was ongoing, the Facebook pages of government officials, including Aung San Suu Kyi, who was then state counsellor, decried what Suu Kyi’s page called “fake rape.” They claimed, against the evidence, that the Rohingya women who alleged that they had been subjected to sexual violence were lying — and they used Facebook to do so.

This is a challenge the big social media companies must not shirk. Meta, as Facebook's parent company is now known, banned Wirathu years ago and has announced a suite of changes to how it moderates the platform in Myanmar, including a ban on pages linked to the Myanmar military.

But all of this comes after the fact. The Rohingya have long been demonized. They have already been expelled. New content moderation cannot give them their homes back. It is vital that Meta and others learn the lessons of the Rohingya genocide. It must not happen again.

And meanwhile, those in political office around the world must use every means at their disposal to bring those who ordered and committed the genocide of the Rohingya to justice.

Dr. Azeem Ibrahim is the director of special initiatives at the Newlines Institute for Strategy and Policy in Washington, D.C., and author of "The Rohingyas: Inside Myanmar's Genocide" (Hurst, 2017). Twitter: @AzeemIbrahim

Disclaimer: Views expressed by writers in this section are their own and do not necessarily reflect Arab News' point of view