Internal Meta documents reveal executives feared encryption would block child exploitation reporting, as the company faces a major lawsuit over child safety.

NEW YORK/SAN FRANCISCO: Meta executives proceeded with plans to encrypt its Facebook and Instagram messaging services despite internal warnings it would severely hinder the company’s ability to detect and report child exploitation.

Internal company documents, filed in a New Mexico state court case and made public on Friday, show senior safety officials expressed deep concern. “We are about to do a bad thing as a company. This is so irresponsible,” wrote Monika Bickert, Meta’s head of content policy, in a March 2019 chat.

The filing contains emails and messages obtained for a lawsuit brought by New Mexico Attorney General Raul Torrez. Torrez alleges Meta allowed predators unfettered access to underage users, often leading to real-world abuse.

READ MORE: Zuckerberg to testify in landmark social media addiction trial

A trial in the case began this month. It is the first lawsuit of its kind against Meta to reach a jury.

The documents show executives feared the consequences of moving Messenger and Instagram Direct to end-to-end encryption, technology that prevents anyone except the sender and recipient from reading messages.

Child safety advocates argue the technology poses a heightened risk on public social networks that easily connect children to strangers. Senior Meta safety executives internally expressed the same fear.

Bickert said the company was making “gross misstatements of our ability to conduct safety operations.” She added there would be “no way to find the terror attack planning or child exploitation” proactively.

A 2019 Meta briefing document estimated reporting of child sexual exploitation imagery would have fallen by 65% if Messenger had been encrypted the previous year. A later update said Meta would have been “unable to provide data proactively to law enforcement” in over 2,000 concerning cases.

Meta spokesperson Andy Stone said these concerns led the company to develop new safety features before launching default encryption in 2023. “The concerns raised in 2019 represent the very reason we developed a range of new safety features,” Stone said.

These features include special accounts for underage users that prevent unknown adults from initiating contact. Users can also still report objectionable messages for review.

In a 2019 email, Meta’s Global Head of Safety Antigone Davis highlighted the specific risk. “FB allows pedophiles to find each other and kids via social graph with easy transition to Messenger,” she wrote.

Davis contrasted this with WhatsApp, Meta’s existing encrypted service. She noted WhatsApp is not connected to a social media platform, making encrypted Messenger “far, far worse” for safety risks.

The information emerges as Meta faces a wave of global litigation and regulatory threats linked to young users’ welfare. A coalition of over 40 attorneys general is pursuing claims that Meta’s products broadly harm youth mental health.

The Sun Malaysia

About the Author

Danny H

Seasoned sales executive and real estate agent specializing in both condominiums and landed properties.