Nearly one in five users aged 13 to 15 told Meta that they saw “nudity or sexual images on Instagram” that they didn’t want to view, according to a court filing. The document was made public on Friday as part of a federal lawsuit in California and includes portions of a March 2025 deposition of Instagram head Adam Mosseri.
In another document made public as part of the lawsuit, a Meta researcher recommends the company focus on teen users because they are “catalysts” for their households and influence how their younger siblings and parents use the app. The document is dated January 20, 2021.
“If we’re looking to acquire (and retain) new users we need to recognize a teen’s influence within the household to help do so,” the researcher said in the memo.
Meta, which owns Facebook and Instagram, is facing allegations that its products harm young users. The company, along with others like Google and TikTok, is facing lawsuits that claim their products were made intentionally addictive, causing harm to minors.
The statistics on explicit images came from a 2021 survey of Instagram users about their experiences on the platform, said Meta spokesperson Andy Stone, and not a review of posts themselves.
Mosseri’s deposition also revealed that about 8% of users in the 13 to 15 age group said in the 2021 survey that they had “seen someone harm themselves or threaten to do so on Instagram.”

In late 2025, the company said it would remove images and videos “containing nudity or explicit sexual activity, including when generated by AI,” with exceptions considered for medical and educational content. “We’re proud of the progress we’ve made, and we’re always working to do better,” Stone said.
Most sexually explicit images were sent via private messages between users, Mosseri said in his deposition, and Meta must consider users’ privacy when reviewing them. “A lot of people don’t want us reading their messages,” he said.
This comes amid widespread concern over the impact of the internet and social media on mental health, especially among young people. Many critics have held major tech companies responsible for the problem.
It was reported in November 2025 that Meta suppressed evidence that Facebook harmed its users. According to unredacted filings in a lawsuit brought by U.S. school districts against Meta and other social media platforms, the company shut down internal research into Facebook’s mental health effects after finding causal evidence that its products harmed users’ mental health.
Character.AI and Google recently agreed to settle lawsuits accusing them of causing mental health crises and suicides among young people. That settlement came as a resolution to some of the first and most high-profile lawsuits related to the alleged harms to young people from AI chatbots.