Internal documents unsealed in a New Mexico state court reveal that a top Meta executive once labeled the company’s plan to encrypt its messaging services “irresponsible,” warning that the shift would severely compromise the platform’s ability to protect children from predators.
The records surfaced this month as part of a high-stakes jury trial brought by New Mexico Attorney General Raúl Torrez. The proceedings, which began with opening statements on Feb. 9, highlight a sharp divide between the tech giant’s public commitment to privacy and its private fears over user safety.
In a 2019 chat exchange, Monika Bickert, Meta’s head of content policy, reportedly expressed grave concern as CEO Mark Zuckerberg prepared to announce the move toward default end-to-end encryption for Facebook Messenger and Instagram.
“We are about to do a bad thing as a company,” Bickert wrote in the internal message, according to the filing. “This is so irresponsible.”
The documents suggest that several high-ranking safety and policy officials within Meta were aware that the technical change would “blind” the company to child-exploitation material. With end-to-end encryption, only the sender and recipient can see a message’s content, meaning Meta itself cannot scan for illegal imagery or grooming behavior unless a user manually reports it.
Historically, Meta has been the primary source of referrals to the National Center for Missing & Exploited Children. Critics and law enforcement have long argued that default encryption allows abusers to operate in the shadows.
The lawsuit by New Mexico alleges that Meta’s leadership prioritized its public image and competitive standing over the safety of its youngest users, effectively creating a “predator’s playground.”
A Meta spokesperson defended the company’s actions, noting that the rollout of encryption, which was completed for Messenger in late 2023, followed years of development of “safety mitigations.” The company maintains that it can still use metadata and account signals to identify suspicious patterns without reading the actual text of messages.
The ongoing trial in Santa Fe is the first of its kind against Meta to reach a jury. It humanizes a long-standing technical debate, pitting the fundamental right to digital privacy against the urgent need for corporate accountability.
As testimony continues this month, these internal warnings serve as a stark reminder of the ethical trade-offs made in the pursuit of a more private internet.