Meta’s decision to replace its longstanding fact-checking model with community notes on Instagram, Facebook, and Threads will significantly impact content moderation companies, including PesaCheck and business outsourcing firms. It could mean job losses for hundreds of contractors in Kenya, Nigeria, Egypt, and South Africa, and could weaken the region’s capacity to combat misinformation, at least three civil rights activists, managers at business outsourcing firms, and media professionals told TechCabal on Wednesday.
The decision to drop fact-checking is also a huge blow to Africa, where misinformation thrives. In Kenya, WhatsApp, Facebook, and Instagram have millions of users who now face unchecked exposure to manipulative content. Governments across the continent have weaponized disinformation for over a decade, a problem that could soon escalate into a crisis, Emmanuel Chenze, the COO at African Uncensored, told TechCabal.
“Tanzania has elections this year, Uganda next year, and Kenya is coming right after that in 2027,” Chenze said.
“Think of the mess that social media was in 2017 when we didn’t have any of the fact-checking initiatives that exist now. Those Real Raila videos, remember them? We had Cambridge Analytica here running all manner of psyops, and there was no machinery to counter them.”
That mess included Cambridge Analytica’s psyops, doctored videos like “Real Raila,” and the algorithmic amplification of propaganda. Fact-checking initiatives helped counter it, Chenze said.
The decision to drop the current moderation model was announced on January 7 and was in response to claims that the fact-checking program, launched in 2016, “too often became a tool to censor.” The shift could result in job losses for African content moderators who monitor Meta’s platforms for harmful content. It may also translate into financial losses for fact-checking firms.
“We’ve seen this approach work on X—where they empower their community to decide when posts are potentially misleading and need more context,” Meta, which has over 3 billion social media users across its platforms, said on Wednesday. “People across a diverse range of perspectives decide what sort of context is helpful for other users to see.”
Adopting community notes will impact Meta’s financial relationships with content moderation partners. While the company has worked with third-party fact-checkers to address misinformation, it has not disclosed the financial details of these partnerships.
“There are other implications as well, like fact-checking organizations, especially in Africa, losing a major source of funding and not being able to do the work they have been doing these last few years,” Chenze added. His organisation, African Uncensored, has been involved in fact-checking in Kenya.
Firms like PesaCheck may struggle to sustain operations, which could limit their ability to address harmful content and safeguard public discourse. Meta was PesaCheck’s largest funder in 2023, contributing 43% of its budget, followed by TikTok at 18%. In 2022, Meta contributed 53.4% of PesaCheck’s total funding.
PesaCheck did not respond to requests for comment on this story.
Meta did not immediately respond to a request for comment.
The social media giant also funds fact-checkers like the U.K.’s Full Fact, which received $461,000 in 2023, making Meta a key contributor. Meta has over 100 fact-checking partnerships globally, meaning its spending likely totals tens of millions of dollars annually.
Fact-checkers employ trained moderators who are key to addressing nuanced issues. Those moderators risk losing their jobs as Meta pivots to a user-driven system, a Nairobi-based media company owner, who asked not to be named so he could speak freely, told TechCabal.
Without that support, fact-checkers’ content will depend on organic reach or paid promotion, which Meta heavily filters for anything political. It’s a grim outlook for a region already battling manipulation, Chenze argued.
“There’s also the priority ranking the algorithm gave content from these organisations and their platforms,” Chenze stated. “That’s now gone, and they’ll either have to rely on organic reach or pay up Meta to be promoted. And even for such promotions, the content will be subjected to Meta’s restrictions.”
Although the community notes system will first launch in the U.S., Meta’s content moderation record in Africa has been mixed, drawing multiple legal actions. Over 180 ex-moderators claim they flagged violent content for little pay and without psychological support from their employers.
Despite severing ties with business outsourcing firms Sama and Majorel in Kenya in 2023—both of which confirmed they were exiting content moderation—Meta relies on them for AI labeling. PesaCheck and Africa Check, both non-profits with offices in Kenya, fact-check information published online and on social media platforms.
Before halting content moderation, Sama disclosed the practice accounted for less than 4% of its business.
Sama now specialises in AI data labelling for tech giants like Microsoft, Meta, and Walmart, work that helps social media companies flag harmful content online.
Sama and Majorel have been criticised over worker treatment and compensation. A group of 184 ex-Sama content moderators sued the outsourcing firm for unfair dismissal, claiming it failed to protect them from the psychological toll of flagging violent content.
Kenya is pursuing a law to hold outsourcing firms liable for employee claims.
Meta has also been sued over content moderation lapses that allegedly fueled ethnic violence in Ethiopia. The petitioners, represented by Mercy Mutemi of Nzili and Sumbi Advocates, want Meta barred from recommending harmful content and a $1.6 billion compensation fund for victims.
The shift to community notes mirrors the approach X adopted, and appears more like a political overture to the incoming Trump administration than a strategic policy change.
Meta’s changes are limited to the U.S., leaving the European Union’s stricter regulatory environment untouched. Under the 2023 Digital Services Act, platforms like Facebook must address illegal content or risk fines of up to 6% of global revenue.
The European Commission, which has been investigating X’s community notes system since late 2023, said it is closely monitoring Meta’s compliance. Meta plans to phase in community notes in the U.S. over the coming months and has promised updates to the system throughout the year.