Meta, the parent company of Facebook, Instagram, and WhatsApp, has unveiled significant changes to its content moderation strategy, rolling back some of the measures it introduced in recent years to combat the spread of political and health misinformation.
In a blog post titled “More Speech, Fewer Mistakes,” Joel Kaplan, Meta’s newly appointed Chief Global Affairs Officer, outlined a sweeping new approach to content moderation, one that prioritizes free expression while streamlining enforcement.
One major change involves ending the third-party fact-checking program in favor of a Community Notes model, modeled on the system used by X (formerly Twitter), in which users collectively add context to potentially misleading posts. Additionally, restrictions on “mainstream discourse topics” will be lifted, with enforcement efforts narrowing to address only “illegal and high-severity violations,” reducing what Meta characterizes as unnecessary interventions.
Lastly, users will gain greater control over their political content through personalization features, fostering feeds that reflect individual preferences while allowing for a more diverse range of opinions. These shifts signal Meta’s evolving strategy to balance open dialogue with responsible platform management.
These changes represent a departure from Meta’s earlier stance, which included strict moderation policies introduced in response to criticism over its role in spreading misinformation during elections, the COVID-19 pandemic, and other controversies. Measures like the creation of the Oversight Board, increased content moderation, and tools to alert users about misleading or harmful content had been implemented to address public and political concerns.
Critics, however, were divided on these efforts: some felt the measures were insufficient, others argued they led to over-moderation and errors, and many claimed the policies were politically biased.
In recent months, cracks in Meta’s commitment to these policies have become evident. Nick Clegg, Meta’s outgoing policy chief, recently admitted in an interview that the company may have “overdone” moderation. The Oversight Board, initially envisioned as a robust accountability mechanism, has struggled to deliver on its promise.
Kaplan’s blog post emphasized Meta’s renewed focus on free expression:
“Meta’s platforms are built to be places where people can express themselves freely. That can be messy. On platforms where billions of people can have a voice, all the good, bad, and ugly is on display. But that’s free expression.”
The timing of these changes is notable, coinciding with the U.S. political shift under an incoming presidential administration. Donald Trump and his supporters have championed a broad interpretation of free speech, a stance that aligns with Meta’s updated policies. Notably, Meta previously banned Trump from its platforms, a decision that drew widespread attention and criticism.
Meta has also undergone leadership changes. CEO Mark Zuckerberg has expressed interest in collaborating with the incoming administration. Joel Kaplan, a prominent Republican and longtime Meta executive, has replaced Nick Clegg as head of public affairs. Additionally, the company recently appointed three new board members, including UFC head Dana White, a supporter of President Trump.
The Oversight Board welcomed the changes, stating it looks forward to collaborating with Meta to refine its approach to free speech in 2025. The Board also acknowledged Clegg’s contributions and expressed optimism about Kaplan’s leadership in shaping Meta’s evolving policies.