Meta is facing serious allegations that it buried internal evidence of harm to its users. According to unredacted filings in a lawsuit brought by U.S. school districts against Meta and other social media platforms, the company shut down internal research into Facebook’s mental health effects after finding causal evidence that its products harmed users’ mental health.
In a 2020 research project code-named “Project Mercury,” Meta scientists worked with survey firm Nielsen to gauge the effect of “deactivating” Facebook, according to Meta documents obtained via discovery.
To the company’s disappointment, “people who stopped using Facebook for a week reported lower feelings of depression, anxiety, loneliness and social comparison,” internal documents said.
Meta disputes the allegations, saying Project Mercury was discontinued because of methodological flaws and that its results were inconclusive.
“The Nielsen study does show causal impact on social comparison, (unhappy face emoji),” an unnamed staff researcher allegedly wrote. Another staffer worried that keeping quiet about negative findings would be akin to the tobacco industry “doing research and knowing cigs were bad and then keeping that info to themselves.”
Despite Meta’s own work documenting a causal link between its products and negative mental health effects, the filing alleges, Meta told Congress that it had no ability to quantify whether its products were harmful to teenage girls.
In a statement Saturday, Meta spokesman Andy Stone said the study was stopped because its methodology was flawed, and added that the company has worked diligently to improve the safety of its products.
“The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens,” he said.
The allegations surrounding Project Mercury illustrate the tension social media companies face when internal research conflicts with business interests: studies suggesting that widely used products harm users’ mental health, particularly among teenagers, pit public welfare against commercial incentives.
The controversy also underscores the need for transparency, independent oversight, and rigorous, publicly accessible research into digital platforms’ psychological effects, even where companies dispute the findings or cite methodological flaws. Policymakers, regulators, and the public will have to weigh corporate disclosures, internal research, and independent investigations, bearing in mind that the allegations remain unverified and legally unproven. How these questions are resolved in the coming years may set precedents for the governance, ethical standards, and accountability of social media platforms worldwide.


