Facebook owner has moral obligation to compensate Rohingya
The Rohingya crisis, characterized by widespread violence, displacement and human rights abuses, has been one of the most severe humanitarian emergencies of our time. As the world becomes increasingly interconnected through social media platforms, it is crucial to hold technology companies accountable for their role in amplifying hate speech and misinformation and in facilitating violence against vulnerable communities. In this context, Meta, the owner of Facebook, should take responsibility and provide compensation to the Rohingya people for the harm caused by the misuse of its platform.
Meta’s platform was instrumental in the spread of hate speech and incitement to violence against the Rohingya population in Myanmar. During the peak of the crisis in 2017, Facebook was used as a tool to disseminate anti-Rohingya propaganda, fueling hatred and enabling the coordination of attacks against Rohingya communities. The company’s algorithms and recommendation systems exacerbated the problem by amplifying divisive and inflammatory content, contributing to the escalation of violence and the displacement of hundreds of thousands of Rohingya people.
The consequences of this misuse of technology have been devastating. The UN has referred to the Rohingya crisis as a textbook example of ethnic cleansing, with reports of mass killings, sexual violence and the destruction of entire villages. More than 700,000 Rohingya have been forced to flee to neighboring Bangladesh, where they live in overcrowded refugee camps, deprived of their basic rights and dignity. Many Rohingya continue to suffer from trauma and struggle to rebuild their lives.
Meta, as the owner of the platform that enabled this harm, has a moral obligation to take action. The company has made public commitments to addressing hate speech and misinformation on its platform, but mere policy changes and increased moderation are not enough. Meta must go beyond lip service and take concrete steps to provide compensation to the Rohingya people.
There are also significant legal implications. As I have previously argued in Arab News, Gambia’s unprecedented action at the International Court of Justice was long overdue. However, it is widely understood that establishing genocide accountability poses a formidable challenge, as the plaintiff must prove genocidal intent. The burden of proving intent is commonly regarded as one of the most difficult standards to meet in legal proceedings.
The Gambian legal team has taken a logical step by asking a US court to compel Facebook to provide data pertaining to the key Myanmar army officials responsible for ordering the “clearance operations” against the Rohingya. Among these officials is Gen. Min Aung Hlaing, the commander-in-chief of Myanmar’s armed forces and now the country’s de facto ruler. This strategic move aims to obtain crucial evidence to support the case against those involved in the atrocities committed against the Rohingya, which could in turn lead to some form of compensation.
Most recently, joining the chorus of voices supporting compensation was Pat de Brun, head of big tech accountability and deputy director of Amnesty Tech. De Brun strongly emphasized the urgent need for Meta to act ahead of its annual shareholder meeting, at which a series of shareholder resolutions challenging the company’s business practices was to be addressed. De Brun said in a statement: “It is way beyond time that Meta fulfilled its responsibilities and provided an effective remedy to the Rohingya people of Myanmar. It is reprehensible that Meta still refuses to repair the harms it contributed to despite the overwhelming evidence that the company played a key role in 2017’s ethnic cleansing.”
Compensation would serve several important purposes. First and foremost, it would acknowledge the harm inflicted upon the Rohingya community and demonstrate a commitment to accountability. Compensation would also contribute to the material and psychological recovery of the survivors, enabling them to rebuild their lives and communities. Furthermore, it would send a strong message to other technology companies about the consequences of neglecting their responsibilities in preventing and mitigating the negative impacts of their platforms.
The argument against Meta paying compensation often centers on the notion that the company is merely a neutral platform, not directly responsible for the content shared by its users. However, this argument fails to recognize the active role that Meta plays in shaping the user experience through algorithms and content curation. By designing systems that prioritize engagement and maximize user attention, Meta bears a significant responsibility for the consequences of its platform’s impact on vulnerable communities.
Moreover, Meta’s responsibility goes beyond legal obligations. While it may not be directly liable under existing laws, it cannot escape its ethical duty to address the harm to which it has contributed. Companies like Meta wield immense power and influence over public discourse and they must be held accountable for the consequences of their actions.
Meta has the financial means to provide compensation to the Rohingya people. As one of the wealthiest technology companies in the world, it has a moral obligation to allocate a portion of its resources to repair the damage caused. This compensation should be directed toward initiatives that support the welfare, rehabilitation and empowerment of the Rohingya community, including education, healthcare, infrastructure development and livelihood opportunities.
Meta, as the owner of the platform that facilitated the spread of hate speech and violence against the Rohingya people, should accept responsibility and provide compensation for the harm caused. Compensation would acknowledge the harm inflicted, support the recovery of survivors and demonstrate Meta’s commitment to accountability. It is time for Meta and other technology companies to recognize the impact they have on vulnerable communities and take meaningful steps to rectify the damage caused by their platforms. By doing so, they can contribute to a more responsible and ethical technology ecosystem that prioritizes human rights and the well-being of all.
• Dr. Azeem Ibrahim is director of special initiatives at the New Lines Institute for Strategy and Policy in Washington, DC, and the author of “The Rohingyas: Inside Myanmar’s Genocide” (Hurst, 2017).