San Francisco: Meta allegedly shut down internal research after discovering evidence that Facebook and Instagram were harming users’ mental health, according to newly unredacted filings in a lawsuit brought by US school districts against major social media companies.
Documents reveal that in 2020, during an internal study known as Project Mercury, Meta researchers partnered with survey firm Nielsen to measure the effects of users temporarily deactivating Facebook and Instagram. To the company’s alarm, the results showed that people who stopped using Facebook for a week experienced lower levels of depression, anxiety, loneliness, and social comparison.
Instead of releasing the findings or deepening the research, Meta halted the project, internally claiming the results were influenced by the ‘existing media narrative.’ Privately, however, staff acknowledged to Nick Clegg, then the company’s head of global public policy, that the conclusions were valid.
One researcher reportedly wrote that the study demonstrated a ‘causal impact on social comparison,’ even adding a sad-face emoji. Another employee compared withholding the findings to the tobacco industry suppressing evidence of health dangers.

Despite its internal research showing a direct link between platform use and negative mental health outcomes, Meta told Congress it had no way to measure whether its products were harmful to teenage girls.
In response to the accusations, Meta spokesperson Andy Stone said that the study was discontinued due to methodological flaws and insisted the company has continually worked to improve safety, especially for teens. Stone stated, “The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens.”
The allegations come from a detailed court filing submitted by law firm Motley Rice on behalf of school districts suing Meta, Google, TikTok, and Snapchat. The plaintiffs argue that the platforms knowingly concealed internal warnings about the risks their products pose to children, parents, and educators.
Other claims include platforms subtly encouraging use by underage users, failing to address child sexual abuse material, and pushing to increase teen engagement, even during school hours. The filing also alleges that companies attempted to sway child-focused organizations by offering sponsorships. For example, TikTok allegedly sponsored the National PTA and later bragged internally about being able to influence its public messaging.

Although allegations against the other platforms are less extensive, the filing outlines several internal Meta documents suggesting:
- Meta intentionally created youth safety tools that were ineffective and rarely used.
- Meta tolerated up to 17 attempts at sex trafficking before removing a user, a threshold described internally as ‘very, very, very high.’
- Meta knowingly served teens more harmful content as part of engagement-boosting strategies.
- Efforts to block child predators were delayed for years due to concerns over user growth.
- Mark Zuckerberg said he would not call child safety his top concern “when I have a number of other areas I’m more focused on like building the metaverse.” Zuckerberg also shot down or ignored requests by Clegg to better fund child safety work.
Stone denied these claims, arguing they rely on selective quotes and misinterpretations, adding that Meta’s current policies remove accounts flagged for sex trafficking immediately.
The internal documents referenced in the case are not publicly available, and Meta has moved to strike them from the record. A hearing on the matter is scheduled for January 26 in the US District Court for the Northern District of California.

