In a significant legal victory for Meta Platforms, a federal appellate court has ruled that the company cannot be sued over claims that it facilitated the genocide against the Rohingya people in Myanmar through its platform, Facebook. The ruling, issued by a three-judge panel of the Ninth Circuit Court of Appeals, relied on Section 230 of the Communications Decency Act, which provides online platforms with immunity over user-generated content.
The lawsuit was filed in 2022 by two anonymous Rohingya plaintiffs seeking $150 billion in damages, who claimed that the platform amplified hate speech that fueled real-world violence against their community. Judge Ryan Nelson noted in the ruling, "Plaintiffs believe that Facebook's design, paired with darker aspects of human nature, caused real-world harm... but Section 230 bars their claims."
The Rohingya, a stateless Muslim ethnic minority in Myanmar, have faced systematic violence and ethnic cleansing since 2017, leading to thousands of deaths and mass displacement. The court emphasized that the allegations against Meta targeted the content of third-party posts rather than the platform's own conduct. It reasoned that recommendations made by Facebook's algorithms are inherently linked to publishing conduct and therefore fall within the protective scope of Section 230.
The plaintiffs pointed to the lack of effective content moderation in the Burmese language, which hindered users from reporting harmful content, and criticized Facebook's algorithmic promotion of posts for boosting the visibility of toxic content. Some posts the plaintiffs viewed expressed explicit violent intentions, exemplifying the dangerous narratives circulating on the platform.
A federal judge had initially dismissed the case in 2024 on the grounds that the statute of limitations had expired, but the Ninth Circuit affirmed the dismissal on different grounds, following precedent set by previous rulings on Section 230. While two judges on the panel expressed concerns about the expansive liability protections afforded to internet companies, they ultimately adhered to existing case law.
Meta's legal team hailed the decision as a reaffirmation of essential protections for online platforms. However, with ongoing discussions regarding the adequacy of Section 230 in the context of contemporary content moderation and algorithm use, legal experts suggest that the implications of this ruling may lead to further debate about accountability for social media companies.
The plaintiffs' representative indicated that they are considering further action, including potential requests for an en banc review by the full Ninth Circuit. This case illustrates the complex challenges surrounding social media governance, particularly concerning hate speech and its tangible impacts on vulnerable communities worldwide.
Note: This article was published on BanxChange.com.

