Menlo Park, USA: Mark Zuckerberg, the founder of Meta, has announced major changes to the company’s content moderation policies, which will see a reduction in censorship and a greater emphasis on free speech across platforms like Facebook, Instagram, and Threads.
In a video message, Zuckerberg revealed that Meta would eliminate fact-checkers and replace them with a community notes system similar to the one used by X, the platform owned by Elon Musk, where users add context to posts.
Zuckerberg criticized Meta’s current fact-checkers, stating that they have been “too politically biased” and have undermined more trust than they have built. To address this, the company’s content moderation teams will be relocated from California to Texas, where there is “less concern about the bias of our teams.”
This shift means Meta’s systems will likely “catch less bad stuff,” but it is expected to reduce the number of posts and accounts censored in error.
The new approach will also ease restrictions on topics like immigration and gender, where existing rules are seen as “out of touch with mainstream discourse.” Additionally, Meta plans to work with incoming U.S. President Donald Trump to oppose global efforts that aim to censor American companies.
This includes the growing number of laws in Europe that institutionalize censorship and the secret courts in Latin American countries that issue takedown orders for content.
These changes are framed as a return to prioritizing free speech, echoing the argument Zuckerberg made at Georgetown University in 2019. He described the recent U.S. presidential election as a tipping point that reinforced the importance of speech over regulation.
Meta’s content moderation will now focus only on illegal and high-severity violations, leaving users responsible for reporting lower-severity violations. This shift is intended to “dramatically reduce the amount of censorship” across Meta’s platforms, aligning the company with the global push for more freedom of expression.
He acknowledged the trade-off involved in dialling back content restrictions: less harmful content would be flagged, but fewer posts would be mistakenly removed.
In response to the announcement, Meta’s Oversight Board, which includes former Danish Prime Minister Helle Thorning-Schmidt, welcomed the revisions to Meta’s approach to fact-checking.
They emphasized the importance of involving external voices in content moderation decisions and expressed interest in working with Meta to refine the new system. The Oversight Board also acknowledged Nick Clegg’s leadership in establishing the board and welcomed the appointment of Joel Kaplan as his replacement.
The changes have already attracted the attention of global regulators, including the UK’s Department for Science, Innovation, and Technology, which stated it would monitor Meta’s actions closely.
In particular, the UK’s Online Safety Act, which requires companies to remove illegal content and protect children from harm, will be a key area of focus. The UK government has urged social media platforms to tackle misinformation and disinformation effectively.