Rohingya refugees filed a $150 billion lawsuit against Facebook over its failure to curb misinformation and hate speech on its platform, which “amounted to a substantial cause, and eventual perpetuation of, the Rohingya genocide” in Burma, also known as Myanmar.
Lawyers in the United Kingdom and the United States launched legal campaigns against Facebook’s parent company, Meta, for the social media giant’s role in facilitating violence against the persecuted Muslim ethnic group in Burma.
“Facebook has options for moderating its algorithms’ tendency to promote hate speech and misinformation, but it rejects those options because the production of more engaging content takes precedence,” the court document reads.
The same document noted that Facebook arrived in Burma around 2011 and arranged for millions of Burmese to access the Internet for the first time. But it claimed that “Facebook did nothing” to warn users about the dangers of misinformation and fake accounts on its systems, tactics the Burmese military used to generate hate speech against the Rohingya.
“Human rights and civil society groups have collected thousands of examples of Facebook posts likening the Rohingya to animals, calling for Rohingya to be killed, describing the Rohingya as foreign invaders, and falsely accusing Rohingya of heinous crimes,” it stated.
The law firms organizing the lawsuits noted that the U.K. legal claim would be on behalf of the Rohingya community living anywhere outside of the United States, while a separate U.S. claim would be on behalf of those residing in the U.S.
The claimants have accused Facebook of using algorithms that amplified hate speech against the Rohingya people on its platform and of failing to invest sufficiently in content moderators who spoke the local language or understood the political situation in Burma.
They claimed that the platform failed to take down posts inciting violence against the Rohingya people and to remove accounts used to propagate hate speech or incite violence.
In 2018, Facebook officials acknowledged that the company hadn’t done enough to limit the spread of posts fueling violence against the Rohingya. Earlier this year, following a military coup and the bloodshed that accompanied it, the company pledged to curb the spread of misinformation in the country.
Facebook whistleblower Frances Haugen suggested that, to prevent the viral spread of content and misinformation that she said could fuel repressive actions in countries such as Burma, Congress could make changes to Section 230 of the Communications Decency Act, which protects online platforms from being held responsible for content posted by third parties.