In recent years, Meta has taken steps to address misinformation on its social media platforms, evolving from professional factcheckers and automated filters toward crowdsourced solutions. The company announced it would end its third-party factchecker program in the United States, opting instead for a crowdsourced system called Community Notes. Community Notes, which originated as Birdwatch on Twitter (now X), lets users flag potentially misleading posts and draft contextual notes; a note is published only after a ranking algorithm determines that raters from diverse perspectives agree it is helpful.
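To make that consensus mechanism concrete, below is a minimal sketch of bridging-based note ranking in the spirit of the published Community Notes approach: a matrix factorization in which a note is surfaced only when its intercept, the approval not explained by viewpoint alignment, is high. The data, hyperparameters, and publication threshold are illustrative assumptions, not Meta's or X's actual implementation.

```python
import numpy as np

# Sketch of bridging-based consensus scoring, loosely modeled on the published
# Community Notes ranking idea. All data, names, and thresholds here are
# illustrative assumptions.
#
# ratings[u, n] = 1 if rater u found note n helpful, 0 if unhelpful, nan if unrated.
ratings = np.array([
    [1.0, 0.0, 1.0],     # rater 0 (leans one way)
    [1.0, 0.0, np.nan],  # rater 1 (leans the same way)
    [1.0, 1.0, 0.0],     # rater 2 (leans the other way)
    [1.0, np.nan, 0.0],  # rater 3 (leans the other way)
])

n_raters, n_notes = ratings.shape
rng = np.random.default_rng(0)

# Model: rating ~ mu + rater_bias[u] + note_bias[n] + rater_factor[u] * note_factor[n].
# The factor term soaks up viewpoint-aligned agreement, so a note earns a high
# intercept (note_bias) only through approval that crosses viewpoints.
mu = 0.0
rater_bias = np.zeros(n_raters)
note_bias = np.zeros(n_notes)
rater_factor = rng.normal(0, 0.1, n_raters)
note_factor = rng.normal(0, 0.1, n_notes)

lr, reg = 0.05, 0.03
observed = [(u, n) for u in range(n_raters) for n in range(n_notes)
            if not np.isnan(ratings[u, n])]

for _ in range(2000):  # plain SGD over the observed ratings
    for u, n in observed:
        pred = mu + rater_bias[u] + note_bias[n] + rater_factor[u] * note_factor[n]
        err = ratings[u, n] - pred
        mu += lr * err
        rater_bias[u] += lr * (err - reg * rater_bias[u])
        note_bias[n] += lr * (err - reg * note_bias[n])
        rf, nf = rater_factor[u], note_factor[n]
        rater_factor[u] += lr * (err * nf - reg * rf)
        note_factor[n] += lr * (err * rf - reg * nf)

# Hypothetical publication cutoff on the intercept; the real system uses a
# different scale and additional checks.
HELPFUL_THRESHOLD = 0.3
for n in range(n_notes):
    status = "show note" if note_bias[n] > HELPFUL_THRESHOLD else "needs more ratings"
    print(f"note {n}: intercept={note_bias[n]:+.2f} -> {status}")
```

In this toy data, note 0 is rated helpful by raters on both sides and earns a high intercept, while notes 1 and 2 split along viewpoint lines, so the factor term absorbs their approval and their intercepts stay low.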
Research suggests that crowdsourced approaches like Community Notes can reduce the spread of misinformation and even prompt users to retract misleading posts. These systems harness the 'wisdom of crowds', drawing on the collective expertise and perspectives of millions of users. Meta's move to this model reflects a broader shift in content moderation strategy, one that emphasizes scalability and responsiveness while acknowledging the limits of purely algorithmic or purely human approaches. Content moderation is complex enough, however, that no single method suffices: automated systems, crowdsourcing, and professional factchecking work best in combination.
Earlier crowdsourced defenses, such as collaborative spam filtering for email, demonstrate the potential of this model. Large language models take an analogous graduated approach, responding to potentially harmful queries by refusing outright or attaching disclaimers, which points toward tiered content moderation more broadly (see the sketch below). Professional factcheckers offer thorough analysis but cannot scale to the volume a global platform generates. Notably, studies indicate that Community Notes and professional factchecking complement each other, reaching different types of content and amplifying factual corrections. The ongoing challenge for platforms like Meta is to refine these systems iteratively, avoid overreliance on any one method, and stay flexible as definitions of truth and consensus evolve. The article underscores that successful moderation requires continuous adaptation and a blend of diverse strategies, rather than a retreat from the responsibility altogether.
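To illustrate the blended, tiered strategy the article advocates, here is a small sketch of how the layers might be composed: cheap automation first, crowd consensus second, and scarce professional review reserved for high-reach unresolved cases. Every tier name, threshold, and field is hypothetical, for exposition only; none describes Meta's actual pipeline.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Action(Enum):
    REMOVE = auto()          # clear policy violation, handled automatically
    SHOW_WITH_NOTE = auto()  # crowd consensus attaches context
    ESCALATE = auto()        # high-stakes and unresolved: professional factcheck
    ALLOW = auto()           # no intervention

@dataclass
class Post:
    spam_score: float        # from an automated classifier, 0..1 (hypothetical)
    note_consensus: float    # crowd "helpfulness" intercept for the best note
    reach: int               # current audience size

def moderate(post: Post) -> Action:
    # Tier 1: cheap automation catches unambiguous abuse at full scale.
    if post.spam_score > 0.95:
        return Action.REMOVE
    # Tier 2: crowdsourced notes add context when diverse raters agree.
    if post.note_consensus > 0.3:
        return Action.SHOW_WITH_NOTE
    # Tier 3: professional factcheckers review only viral, unresolved cases.
    if post.reach > 100_000:
        return Action.ESCALATE
    return Action.ALLOW

print(moderate(Post(spam_score=0.2, note_consensus=0.55, reach=5_000)))
# -> Action.SHOW_WITH_NOTE
```

The ordering reflects cost: each tier handles only what the cheaper tier above it could not, so scarce professional attention is spent where it adds the most.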