The Rohingya, a vulnerable and persecuted minority in Myanmar, have sued social media giant Facebook for $150bn, accusing the company of failing to check the spread of hate speech that led to violence against the community.
The complaint, lodged in a California court, said the algorithms that power the US-based company promote disinformation and extreme thought that translates into real-world violence.
“Facebook is like a robot programmed with a singular mission: to grow,” the court document states.
“The undeniable reality is that Facebook’s growth, fueled by hate, division, and misinformation, has left hundreds of thousands of devastated Rohingya lives in its wake.”
The mainly Muslim group faces widespread discrimination in Myanmar, where they are despised as interlopers despite having lived in the country for generations.
A military-backed campaign that the United Nations said amounted to genocide saw hundreds of thousands of Rohingya driven across the border into Bangladesh in 2017, where they have been living in sprawling refugee camps ever since.
Many others remain in Myanmar, where they are denied citizenship and are subject to communal violence, as well as official discrimination from the military, which seized power in a coup in February 2021.
In 2018, UN human rights investigators also said the use of Facebook had played a key role in spreading hate speech that fueled the violence. A Reuters investigation that year, cited in the US complaint, found more than 1,000 examples of posts, comments and images attacking the Rohingya and other Muslims on Facebook.
Facebook To Blame
Rights groups have long charged that the social media giant is not doing enough to prevent the spread of disinformation and misinformation online.
Critics say even when alerted to hate speech on its platform, the company fails to act.
They charge that the social media giant allows falsehoods to proliferate, affecting the lives of minorities and skewing elections in democracies such as the United States, where unfounded charges of fraud circulate and intensify among like-minded friends.
This year, a huge leak of internal documents by a company insider prompted reports that Facebook, whose parent company is now called Meta, knew its sites could harm some of its billions of users – but that executives chose growth over safety.
Whistleblower Frances Haugen told the US Congress in October that Facebook is “fanning ethnic violence” in some countries.
US law largely shields internet companies from liability over content posted by their users, under Section 230 of the Communications Decency Act. The Rohingya lawsuit, anticipating this defence, argues that where applicable, the law of Myanmar – which has no such protections – should prevail in the case.
Facebook has been under pressure in the US and Europe to clamp down on false information, particularly over elections and the COVID-19 pandemic.
(Inputs from Al Jazeera)