In January 2025, Meta announced that it would discontinue its third-party fact-checking program and instead move to a community notes model based on user input. The change, coinciding with President Trump's inauguration, raises many questions about the careful balance between freedom of speech, censorship, and accessibility of reliable information in an increasingly digital era.
Previously, Meta’s fact-checking program relied on professional journalistic organizations and other qualified individuals to monitor its platforms, examining posts gaining traction, investigating the truthfulness of claims, and either adding a warning or attaching additional context to particularly contentious posts. The platforms now operate on a community notes model in which users report and review one another’s posts. If users who typically disagree find a note helpful, it is published as a label on the post without affecting its visibility or reach, according to Meta. The program broadly echoes the Community Notes system that X uses, representing a wider shift away from third-party moderation and toward a larger share of user input.
In a statement, Meta wrote that the fact-checkers were “too politically biased and…destroyed more trust than they created.” Supporters of Meta’s decision went even further, arguing that the fact-checkers were composed almost entirely of left-leaning academics who harbored implicit biases and faced little oversight when they made mistakes. Regardless, the rollback of third-party fact-checking means a future of community-based moderation.
The fact-checking change also reduced the use of automatic content filters, especially those relating to “immigration and gender.” Users may now express their views on those topics with little platform oversight. These changes to content moderation are accompanied by shifts in Meta’s community standards. For example, users can now refer to “women as household objects or property” or claim that others have a “mental illness or abnormality when based on gender or sexual orientation, given political and religious discourse about transgenderism and homosexuality.”
Meta’s policy change sets an important precedent in the uncharted world of digital citizenship. Worries that the shift will harm marginalized groups are not without merit, as communities without a large enough voice or online presence could be singled out. In the presence of larger, more vocal groups, especially under the community notes voting method, correcting misinformed, biased, or extreme narratives may prove difficult, and certain minority groups may be unfairly targeted.
Telegram as an Unlikely Parallel
Meta’s decision to move away from content moderation, and the potential for tussles with the European Union, invites comparisons to the popular broadcasting platform Telegram and its hands-off content moderation policies. While Meta’s Instagram and Facebook serve distinctly different functions from Telegram, Telegram’s trajectory in the face of the European Union’s Digital Services Act (DSA) and other censorship laws could paint a picture of Meta’s future.
Telegram primarily functions as a social media app with over 950 million monthly users, differing from Facebook and Instagram in its large broadcasting channels and group chats that accommodate up to 200,000 people, compared to WhatsApp’s 1,024-user maximum. Furthermore, Telegram’s group chats are not end-to-end encrypted by default, meaning that Telegram can access the information in them. By comparison, WhatsApp chats are end-to-end encrypted by default, and Meta has rolled out default end-to-end encryption for personal chats on Facebook Messenger as well. Telegram does offer an encrypted “Secret Chat” feature, but it is not enabled by default and exists only for one-on-one conversations.
Telegram’s failure to appropriately moderate and monitor the app has posed multiple challenges to governmental authorities. In 2022, the German government fined Telegram 5.125 million euros (approximately US$5.5 million) for failing to provide a clear path to report illegal content on the app and to name a German entity to receive official communication, both violations of German law. The following year, Brazil temporarily suspended Telegram after the app’s managers refused to surrender the data of neo-Nazi users on the platform allegedly linked to a November 2022 school shooting that killed five people.
Telegram’s altercations with governmental authorities reached a peak when the app made headlines in August 2024: its Chief Executive Officer, Pavel Durov, was arrested in Paris on charges that Telegram was being used for illegal activity such as drug trafficking and the distribution of child sexual abuse images. Authorities argued that Telegram had refused to give up information when investigators requested it.
Durov is no stranger to government interference with his apps: he previously founded VKontakte, Russia’s largest social network, which came under government pressure after the 2011 and 2012 Russian pro-democracy protests. Authorities demanded that VKontakte remove pro-democracy groups on the app and hand over the data of users who supported the protests.
While Durov’s resistance to government intervention isn’t novel, his response encapsulates the distinct clash between the European Union’s DSA and Telegram’s previous promise to its users that private chats were immune to government inquiries. His 2024 Paris arrest was followed by careful changes to Telegram’s Terms of Service and Frequently Asked Questions. Notably, Telegram removed the phrasing that all chats were immune to government intervention and hastened to emphasize that Telegram is less a messaging app and more a social media app.
The changes also placed heightened emphasis on the app’s reporting program and clarified that Telegram would hand over the IP addresses and phone numbers of users who violate its rules in response to “valid legal requests,” and that it would disclose such data sharing with law enforcement in quarterly transparency reports. Telegram’s experience invites questions about the future of Meta’s user base and how users’ behavior will change without continued oversight.
Meta to Follow in Telegram’s Footsteps
The last year has marked similar interactions with the European Union for Meta, raising questions about whether Meta will follow in Telegram’s footsteps on content moderation. Meta was already under investigation by the European Commission as of May 2024 under the DSA’s rules on the protection of minors, and its shift away from fact-checkers and censors has invited further scrutiny. Meta also received an April 2024 request for information after failing to provide adequate election monitoring tools.
Meta’s decision to take down its election monitoring tools in August invited another investigation into its failure to provide equal access to the public during a widespread election season.
Ursula von der Leyen, President of the European Commission, noted Meta’s responsibility, writing in a statement that “big digital platforms must live up to their obligations to put enough resources into this.” The European Union’s decision to pursue Meta’s non-compliance marks an important shift in the enforcement of EU moderation laws, especially after AI Forensics, a European non-profit specializing in algorithm investigations, released a report showing that Meta had allowed a flood of pro-Russia propaganda ads to proliferate on its platforms.
While Meta repeatedly threatened action against SDA, a Russian IT firm with ties to the Kremlin, it still made US$338,000 between August 2023 and October 2024 from ads run by SDA, even after governmental authorities sanctioned the firm. The campaigns continued as the firm posted thousands of ads, and Meta’s flawed oversight allowed the infringement to persist.
The European Commission is more stringent about content moderation than the United States, and its growing demands for compliance may strain the tech leader’s presence in the European Union. While the EU’s investigations into Meta do not yet match Telegram’s clashes with foreign governments, which have brought millions of dollars in fines and a high-profile arrest, Telegram serves as a possible preview of Meta’s trajectory in the wake of its decision to stop fact-checking.
Facebook’s Spotted Past
Telegram and Meta have distinct approaches to censorship and online content moderation. Yet their futures are increasingly intertwined, given Meta’s move away from fact-checkers and growing European Union efforts to hold apps to a specific system of digital obligations. Their role as crucial communication platforms for millions of users also raises questions about whether apps and their creators are obligated to step in when their products are not being used as intended. While the ethics of free speech and censorship will remain contentious, online moderation adds another facet to an already heated debate, creating new layers to sift through.
There is already a clear connection between online interactions and real-life actions, as demonstrated by the 2018 divisions between the Muslim minority and Buddhist majority in Sri Lanka. There, Facebook served as the platform for anti-Muslim sentiment, stoking a burgeoning flame of anger and animosity as misinformation and rumors circulated.
The divide culminated in a video in which a Muslim man, unaware of a false Facebook rumor that police had seized 23,000 sterilization pills from a Muslim pharmacist the day before, confusedly agreed that there were “sterilization pills” in a meal. Upon hearing his answer, a mob immediately beat him, destroyed his restaurant, and burned a nearby mosque. Hateful sentiment quickly spread as groups used Facebook to coordinate attacks on local buildings and Muslim-owned businesses. Sri Lanka promptly declared a state of emergency, with the telecommunications minister, Harin Fernando, arguing that the “whole country could have been burning in hours.”
This example, borne out of posts and groups on Meta’s platforms, only begins to show the fraught relationship between speech on unregulated platforms and the manifestation of that speech in dangerous, tangible actions. The dynamic between minority groups and the general public also hangs in the balance as Meta moves toward a future without fact-checking.
Meta’s Tangled Future
In 2020, the US Federal Trade Commission sued Meta, arguing that the company bought Instagram and WhatsApp to secure a monopoly over the social media industry. Judge James Boasberg rejected Meta’s attempt to dismiss the case, instead setting the trial for April 14, 2025. The case poses a larger threat to Meta’s future as a single entity, embroiling it in litigation and clouding the outlook for its trio of social media giants.
Given European authorities’ previous anger towards social media giant Telegram, Meta’s future in the European Union and worldwide could face numerous challenges following its move away from fact-checkers, both from legal authorities and from the general atmosphere that reduced moderation will inevitably create. Meta is placing significant trust in its users, essentially adopting a vigilante method in which neighbors are expected to hold one another accountable. However, Meta ignores the fact that those neighbors might not have one another’s best interests in mind.