Meta was aware of a significant amount of child sexual harassment occurring on its platforms, according to an unredacted complaint.

  • According to a new legal filing, a 2021 internal estimate by Meta revealed that up to 100,000 children were subjected to sexual harassment on Facebook and Instagram daily.
  • The attorney general of New Mexico has filed a complaint against Meta, accusing the company of failing to adequately protect children online.
  • A Meta spokesperson stated that the complaint misrepresented their work by using selective quotes and cherry-picked documents.
A worker picks up trash in front of a new logo and the name ‘Meta’ on the sign in front of Facebook headquarters on October 28, 2021 in Menlo Park, California. (Justin Sullivan | Getty Images)

A legal filing about child exploitation on Facebook and Instagram alleges that an internal company estimate from 2021 found up to 100,000 children daily received sexual harassment, including images of adult genitalia, on the platforms.

The attorney general of New Mexico disclosed the new information in a lawsuit against Meta over the company's efforts to safeguard children online as the platforms' popularity among young people grew.

The complaint also includes a description of a 2020 Meta internal company chat, where an employee asked a colleague about the company's efforts to combat child grooming, which they had heard was a prevalent issue on TikTok.

The colleague replied bluntly that child safety was not a priority that half, ranking somewhere between zero and negligible.

That same year, an Apple executive complained to Meta that his 12-year-old child had been solicited on Facebook, according to the unredacted filing.

"This behavior infuriates Apple so much that they are considering removing us from the App Store," a Meta employee told colleagues. He also asked when the company would stop adults from messaging minors on Instagram Direct.

A Meta spokesperson said the company has resolved several of the issues raised in the complaint, noting that in a single month it disabled more than 500,000 accounts for violating child safety policies.

The spokesperson said the company aims to provide safe, age-appropriate online experiences for teens, offering more than 30 tools to support teens and their parents. The company has spent a decade working on these issues and has hired professionals dedicated to keeping young people safe and supported online, the spokesperson said, adding that the complaint misrepresents that work through selective quotes and cherry-picked documents.

The lawsuit claims that Facebook and Instagram did not adequately safeguard young users from online predators, and that Meta employees recommended safety measures that were not implemented by the company.

The suit, filed Dec. 5, claims the company rejected the recommended changes because it prioritized social media engagement and advertising growth over child safety. Mark Zuckerberg, Meta's founder and CEO, is named as a defendant.

Raul Torrez, the New Mexico attorney general, said Thursday that for years Meta employees had tried to raise alarms about how decisions made by Meta executives exposed children to solicitation and sexual exploitation.

Torrez said Meta executives, including Zuckerberg, consistently prioritized growth over children's safety, and that while the company has downplayed the harmful activity children are exposed to on its platforms, its own internal data and presentations show the problem is severe and widespread.

In 2021, whistleblower Frances Haugen revealed that Meta, the parent company of Facebook and Instagram, was aware of the harm caused to teenage girls by toxic content on its platforms but failed to take action to address the issue.

Testifying before a Senate panel, Haugen faced questioning from outraged lawmakers over the company's prioritization of profits above user safety.

by Eamon Javers
