The Massachusetts Supreme Judicial Court ruled that Meta must face a lawsuit by Massachusetts’ attorney general alleging that the Facebook and Instagram parent deliberately designed features to addict young users.
The ruling marks the first time a state high court has considered whether the federal law that generally shields internet companies from lawsuits over content posted by their users also bars claims that companies like Meta knowingly addicted young users.
Meta has denied all allegations and said it takes extensive steps to keep young users safe on its platforms.
The ruling follows a landmark trial in which a jury in Los Angeles ordered Meta and Google to pay $3 million to a 20-year-old woman who accused the tech giants of making her addicted to Instagram and YouTube as a child. The twelve-member jury held the companies responsible for designing products with features that harmed the plaintiff’s mental health.
The jury apportioned 70% of the damages to Meta, with the remaining 30% to be paid by Google. A separate proceeding will determine the final amount of compensation.
A day earlier, a separate jury found that Meta owed $375 million in civil penalties in a lawsuit by New Mexico’s attorney general accusing the company of misleading users about the safety of Facebook and Instagram and of enabling child sexual exploitation on those platforms. Thirty-four other states are pursuing legal action against Meta in federal court.
The case by Massachusetts Attorney General Andrea Joy Campbell, a Democrat, is one of at least nine that state attorneys general have pursued in state court since 2023.
Campbell’s lawsuit alleges that features on Instagram such as push notifications, “likes” of user posts and a never-ending scroll were designed to profit off teens’ psychological vulnerabilities and their “fear of missing out.” According to the state, internal data showed the platform was addicting and harming children, yet top executives rejected changes its research showed would improve teens’ well-being.
Meta sought to have the case dismissed by invoking Section 230 of the Communications Decency Act of 1996, a federal law that broadly shields internet companies from lawsuits over content posted by users. The state, however, argued that Section 230 does not apply to the allegedly false statements Meta Platforms made about the safety of Instagram, its efforts to protect young users’ well-being, or its age-verification systems designed to keep children under 13 off the platform.
A trial court judge agreed and said the law also did not apply to allegations concerning the negative impacts of Instagram’s design features because the state was “principally seeking to hold Meta liable for its own business conduct,” not content posted by third parties.

