Meta Ditches Fact-Checkers, Embraces Community Moderation
Meta, the parent company of Facebook and Instagram, has announced a significant shift in its content moderation strategy. The company is replacing independent fact-checkers with a user-driven system similar to X’s “community notes.” The move, according to CEO Mark Zuckerberg, aims to reduce political bias and reaffirm Meta’s commitment to free expression.
In a video accompanying a blog post on Tuesday, Zuckerberg argued that third-party fact-checkers had become “too politically biased” and that it was “time to return to our roots of open dialogue.”
This change comes as Meta seeks to improve its relationship with U.S. President-elect Donald Trump, who has previously criticized the platform’s moderation policies as being biased against conservative voices.
Trump Praises the Move
Responding to the announcement, Trump lauded Zuckerberg’s decision, calling it a step in the right direction. “Meta has come a long way,” Trump said during a press conference, adding that Zuckerberg may have been influenced by past criticisms from his team. When asked if the decision was a response to previous threats, Trump quipped, “Probably.”
Joel Kaplan, a prominent Republican and Meta’s new global affairs chief, echoed this sentiment in a blog post. Kaplan, who is replacing Sir Nick Clegg, stated that while the fact-checking program was “well-intentioned,” it often led to “unnecessary censorship.”
Campaigners Voice Concerns
Not everyone is on board with the shift. Advocacy groups and anti-hate speech campaigners have expressed concern that the change could lead to a rise in disinformation. Ava Lee from Global Witness described the move as “a dangerous attempt to appease the incoming administration at the expense of accountability.” She added, “This isn’t about free speech; it’s about avoiding responsibility for the spread of harmful content.”
Meta’s current fact-checking system, introduced in 2016, flags potentially misleading posts for review by independent organizations. Posts deemed inaccurate are labeled with additional context or pushed lower in users’ feeds. This system will now be replaced, starting in the U.S., with community notes, modeled on the similar feature on X (formerly Twitter).
Zuckerberg Acknowledges Risks
Zuckerberg acknowledged the trade-offs in his video, admitting that the new system might fail to catch some problematic content. “We’ll catch less bad stuff,” he said, “but we’ll also reduce the number of innocent posts and accounts mistakenly removed.”
While Meta says it has no immediate plans to end fact-checking partnerships in the UK or EU, critics argue the U.S.-only rollout reflects a shift in the company’s political priorities.
Preparing for Trump’s Inauguration
The timing of the announcement coincides with preparations for Trump’s inauguration on January 20. Meta reportedly informed Trump’s team of the changes ahead of the public announcement. The company has also made gestures to align itself with the new administration, including donating $1 million to Trump’s inauguration fund.
Additionally, the appointment of Dana White, president of the Ultimate Fighting Championship and a close Trump ally, to Meta’s board of directors signals a political recalibration for the tech giant.
A Broader Trend in Tech
Experts say Meta’s decision reflects a broader trend among tech companies toward loosening content moderation policies. Kate Klonick, a law professor at St. John’s University, noted, “We’re seeing a shift back toward prioritizing free speech, especially since Musk’s acquisition of X.”
While this shift may align with U.S. political trends, it runs counter to recent regulations in the UK and EU, which require tech companies to take greater responsibility for content on their platforms.
For now, Meta’s strategy marks a radical departure from its previous stance, prioritizing user-led moderation in the name of free expression, even as concerns about disinformation and hate speech persist.