Instagram head Adam Mosseri took the witness stand on Wednesday in a high-stakes trial in Los Angeles that could reshape how Silicon Valley treats its youngest users.
Testifying in the LA courtroom, Mosseri defended the social media platform against allegations that Instagram was intentionally designed to be addictive and hook young minds, contributing to a mental health crisis among adolescents.
The case was filed by a 20-year-old woman from California identified as Kayle, who alleges that the app’s “endless scroll” and quick dopamine-triggering features led to years of depression and body dysmorphia beginning at a very young age.
During his testimony, Mosseri rejected the term “addiction,” preferring to describe the behavior as “problematic use” that varies from person to person.
He also addressed internal emails from 2019 regarding face-altering “plastic surgery” filters.
While some teams warned these tools harmed teen girls’ self-esteem, Mosseri and Meta CEO Mark Zuckerberg initially weighed lifting a ban on them to maintain user growth. The company eventually kept the ban on filters overtly promoting cosmetic surgery.
“I was trying to balance all the different considerations,” Mosseri told the jury, according to court reports.
Several parents who lost children to the effects of social media sat at the front of the courtroom, bringing their own stories of grief to bear on the proceedings. Victoria Hinks, whose daughter died by suicide at age 16, said their children had been “collateral damage” of Silicon Valley’s “move fast and break things” culture.
“Our children were the first guinea pigs,” she told reporters outside the courthouse. Mosseri pushed back in his testimony, saying the “move fast and break things” motto coined by Zuckerberg in the company’s early days is no longer appropriate.
The plaintiff’s attorney, Mark Lanier, countered that the platform functions like a “slot machine in a child’s pocket,” designed to exploit developing brains for profit. Lanier argued that Meta knew of the psychological toll but prioritized engagement over well-being.
The trial is a critical “bellwether” for over 1,500 similar lawsuits nationwide.
It also tests the limits of Section 230, the federal law that typically shields platforms from liability for user-generated content. If the jury finds Meta negligent in its product design, it could open the floodgates for billions in damages and force radical changes to social media algorithms.
Meta maintains that it has implemented dozens of safety tools for teens, including parental controls and time limits. Zuckerberg is expected to testify later this month as the trial continues to scrutinize the intersection of tech profits and the fragility of the teenage mind.

