Meta, the parent company of Facebook, Instagram, and WhatsApp, has announced the end of its third-party fact-checking program. The system, previously used to combat misinformation, will be replaced by Community Notes, a crowd-sourced moderation tool modeled on a similar feature on X (formerly Twitter).
Meta’s CEO, Mark Zuckerberg, framed the shift as part of a strategy to refocus on free expression while continuing to address severe violations such as child exploitation and terrorism. “We’re going back to our roots, reducing mistakes, and simplifying our policies,” Zuckerberg said in a video announcement.
This policy change follows the controversial 2024 U.S. presidential election, which saw Donald Trump return to office. While Zuckerberg criticized the fact-checking system for its perceived biases, experts warn that the new approach could have dire consequences in regions like Nigeria.
Shirly Ewang, a senior specialist at public strategy firm Garfield, expressed deep concerns about the potential fallout of Meta’s policy shift in Nigeria, a country already grappling with the effects of misinformation.
During Nigeria’s 2023 general elections, false claims circulated widely on social media, fueling religious and ethnic tensions. The African Digital Democracy Observatory reported that misinformation was systematically used to manipulate public opinion and advance specific political agendas.
Fake images and misleading narratives have also exacerbated conflicts such as Nigeria’s farmer-herder clashes. According to FactCheckHub, doctored content often frames localized incidents as ethnic or religious violence, inciting retaliatory attacks.
A notable example occurred in 2021 when Nigerian politician Femi Fani-Kayode shared an image of a man holding a burned child, claiming it depicted violence by Fulani herders in Nigeria. The photo was later debunked as originating from the Southern Cameroon crisis, highlighting the dangerous role misinformation plays in escalating tensions.
Ewang expressed skepticism about Meta’s new Community Notes feature, arguing that it lacks the immediacy needed to combat misinformation effectively. “By the time Community Notes catch up, misinformation will have already traveled far and wide,” she noted. While the system encourages transparency, Ewang emphasized that it merely facilitates discussion rather than preventing the spread of false information.
Nigeria, home to more than 51 million Facebook users and 12.6 million Instagram users, is highly vulnerable to the ripple effects of unchecked misinformation. Ewang urged citizens to take responsibility for verifying information before sharing it and called on governments to partner with civil society organizations to educate the public on identifying fake news.
She also advocated for regulatory measures to hold tech companies like Meta accountable for the content shared on their platforms. “African governments must establish clear guidelines to ensure our online spaces remain safe,” Ewang stressed.
Meta’s policy changes come at a critical juncture for many African countries, where democratic processes are fragile. Without robust content moderation, platforms risk becoming breeding grounds for disinformation, potentially undermining social cohesion and stability.
As Meta rolls out its new policies, the stakes for regions like Nigeria are particularly high. Experts warn that unchecked misinformation could deepen existing divides and further destabilize communities already strained by religious and ethnic tensions.