Meta to End Fact-Checking Program as Trump’s Second Term Approaches
In a major policy shift, Meta CEO Mark Zuckerberg announced on January 7, 2025, that the company will shut down its third-party fact-checking program and transition to a user-driven moderation system, similar to the model used by Elon Musk’s X (formerly Twitter). The decision comes just weeks before President-elect Donald Trump’s second inauguration, underscoring the deepening entanglement of technology, politics, and free speech.
The End of Meta’s Fact-Checking Efforts
Meta’s fact-checking program, launched in 2016 to address misinformation during the U.S. presidential election, had long been considered a cornerstone of its efforts to combat false information. The program involved nearly 100 organizations working in over 60 languages globally, including U.S.-based partners such as PolitiFact and FactCheck.org. However, Zuckerberg now claims the program has become “too politically biased” and has “damaged trust more than it has built,” particularly in the U.S.
The decision to end the program was abrupt, leaving many of Meta’s fact-checking partners unprepared. Organizations such as Lead Stories and Check Your Fact, which depended heavily on Meta’s funding, now face significant challenges in adapting to the loss of support. Critics argue that this move undermines efforts to fight misinformation, as independent fact-checkers were essential in verifying and contextualizing questionable content.
A New Approach: Community Notes
To replace the fact-checking program, Meta will introduce a “Community Notes” system, inspired by X’s model, which allows users to flag and annotate posts they believe are inaccurate. This crowd-sourced moderation system requires notes to gain broad agreement from users with diverse perspectives before they are displayed publicly.
While Zuckerberg presents this shift as a step toward reducing bias and promoting free expression, critics warn that such systems are vulnerable to manipulation and often fail to address misinformation effectively. For example, X’s Community Notes has been criticized for allowing harmful content, including medical misinformation and hate speech, to spread unchecked.
Political Realignment and the Move to Texas
Meta’s decision is widely seen as an effort to align with the incoming Trump administration. In recent weeks, Zuckerberg has made several overtures to Trump, including a $1 million donation to his inaugural fund and a private meeting at Mar-a-Lago, where he reportedly gifted the president-elect a pair of Meta’s smart glasses.
Meta has also appointed prominent Trump allies to key positions, including UFC CEO Dana White to its board and Republican strategist Joel Kaplan as head of global policy. Kaplan, a former deputy chief of staff to George W. Bush, has been vocal about framing Meta’s new policies as a return to free speech principles.
Additionally, Meta is relocating its trust and safety teams from California to Texas, a move Zuckerberg claims will reduce perceived bias and align with the political climate of the incoming administration.
Implications for Misinformation and Public Discourse
Meta’s decision to dismantle its fact-checking program has raised concerns among experts and advocates for digital accountability. Angie Drobnic Holan, director of the International Fact-Checking Network, expressed concern that the move could lead to fewer fact-checking reports and a resurgence of hoaxes and conspiracy theories.
Critics argue that Meta’s shift reflects a broader trend of tech companies prioritizing political expediency over the public good. By dismantling its fact-checking infrastructure, Meta risks fueling the spread of misinformation, especially in an era where false narratives can have profound societal and political consequences.
Conclusion
Meta’s decision to end its fact-checking program and adopt a user-driven moderation model marks a dramatic shift in its content governance strategy. While the move may appeal to conservative critics and align the company with the incoming Trump administration, it raises significant questions about the future of misinformation on social media. As Meta navigates this new political landscape, the balance between free speech and accountability remains unsettled, leaving users to contend with the consequences of a controversial experiment.