A second whistleblower has testified that Meta, the parent company of Facebook, failed to take action to protect teenagers from harmful content on its platform.

  • On Tuesday, a second whistleblower, Arturo Bejar, appeared before a Senate subcommittee.
  • Bejar had tried to bring the potential harms the company's platforms pose to teenagers to the attention of top management.
  • Like Frances Haugen, a former Meta employee, he revealed internal documents and research to the media and the Senate.
Arturo Bejar, former Facebook employee and consultant for Instagram, testifies before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law during a hearing to examine social media and the teen mental health crisis, Tuesday, Nov. 7, 2023, on Capitol Hill in Washington. (Stephanie Scarbrough | AP)

On Tuesday, a second whistleblower testified before a Senate subcommittee, describing the futile attempts to alert Meta's top leadership about the detrimental effects of their platforms on teenagers.

Arturo Bejar, a Facebook engineering director from 2009 to 2015 and a consultant at Instagram from 2019 to 2021, testified before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law that top Meta officials did not take sufficient steps to protect the company's youngest users from harm on its platforms.

Lawmakers on both sides of the aisle blamed tech lobbying for Congress' failure to pass laws safeguarding children online. Although Senate committees have advanced bills aimed at protecting kids online, the measures have stalled, awaiting a vote on the Senate floor or action in the House.

Bejar's appearance underscored lawmakers' view that large tech companies wield largely unchecked power.

Bejar’s allegations

Bejar's accusations about safety failures at the company follow in the footsteps of Frances Haugen, another former Meta employee, who exposed internal documents and research to the media and the Senate.

On Tuesday, Bejar told lawmakers that despite being aware of the harms affecting its youngest users, Meta's leadership chose not to take sufficient action to address them.

Before the hearing, Bejar told Sen. Richard Blumenthal, D-Conn., the subcommittee chair, about a meeting with Chief Product Officer Chris Cox. In that meeting, Bejar presented research on the platforms' harms to teens, and he recalled Cox acknowledging that he was already aware of the statistics.

After meeting with Cox, Bejar said, he no longer believed, as he had when he returned to the company in 2019, that leadership simply didn't know.

What he found heartbreaking, Bejar said, was that they knew but didn't act.

Bejar believes that Meta's focus on tackling a limited definition of harm is part of the issue. He emphasizes the importance of examining the prevalence of different harms among different user demographics to accurately assess the extent of harm to specific groups.

On October 5, 2021, the day Haugen testified in the Senate, Bejar emailed top Meta executives, including CEO Mark Zuckerberg, then-Chief Operating Officer Sheryl Sandberg, and Instagram head Adam Mosseri.

Bejar addressed the email to Zuckerberg and shared it with the committee as part of a collection of documents. He said he had previously raised the concerns with Sandberg, Mosseri, and Cox.

In an email to Mosseri on Oct. 14, 2021, Bejar discussed a survey of 13- to 15-year-old Instagram users that he had prepared for a meeting scheduled for the next day.

According to the survey, within the previous seven days, 13% of Instagram respondents had experienced unwanted sexual advances, 26% had witnessed discrimination against people on the platform based on their identities, and 21% had felt worse about themselves because of others' posts on Instagram.

In the email, Bejar wrote that since his teenage daughter was 14, she had been receiving unsolicited pictures of male genitalia from male users, and that she would block the users who sent them.

But, she told him, if the only consequence of the behavior is getting blocked, the senders would see no harm in continuing to do it.

He proposed that resources be allocated and efforts be focused on determining the causes of negative user experiences, the percentage of content that violates policies, and the product modifications that could enhance the platform's user experience.

Zuckerberg and Sandberg never responded to or met with Bejar about the email.

Meta spokesperson Andy Stone stated that numerous individuals, both within and outside of Meta, are working daily to ensure the safety of young people online. He emphasized that surveys like the one mentioned are just one aspect of this effort, and have led to the creation of features such as anonymous notifications of potentially harmful content and comment warnings. Additionally, Meta has collaborated with parents and experts to develop over 30 tools to support teens and their families in having positive online experiences. All of this work continues.

Stone highlighted "Restrict," a tool created based on teen feedback: when one user restricts another, the restricted user's comments on the first user's posts are visible only to the restricted user. He also referenced Meta's 2021 content distribution guidelines, aimed at limiting the reach of borderline content that comes close to violating the company's policies.

Blaming tech money for lack of new laws

Subcommittee Chair Richard Blumenthal, D-Conn., and Sen. Marsha Blackburn, R-Tenn., presented their bill, the Kids Online Safety Act (KOSA), as a crucial answer to the harms Bejar described. KOSA aims to hold tech companies accountable for the safety of their products designed for children.

Before the hearing started, Blumenthal told reporters that it is time for Congress to give parents and kids the tools to disconnect from algorithms and black boxes that promote toxic content.

Some progressive groups have raised concerns that the bill could harm vulnerable children, including LGBTQ youth; Blumenthal responded that the bill had been modified to address those concerns.

Blumenthal said the measure is not about content or censorship, but about product designs that promote toxic content to children. The aim, he said, is to let children disconnect from algorithms that push content they do not want, without interfering with their access to content they do.

Blumenthal acknowledged concerns that passing narrow legislation could delay broader privacy protections in Congress, but said it is better to take action than to hold out for perfection. He supports a broader privacy bill, he said, but believes Congress should proceed one step at a time: the more bipartisan consensus there is on protecting children, the better positioned lawmakers will be to pass broader privacy legislation.

Sen. Josh Hawley, R-Mo., said the hearing reflected the body's failure to act, and that the reason is clear: Big Tech is the most influential lobby in Congress, and it has successfully blocked every significant piece of legislation.

The Judiciary Committee Chair, Dick Durbin, D-Ill., criticized the Senate for not acting on bills aimed at safeguarding children online, despite their approval at the committee level with broad support.

Sen. Lindsey Graham, R-S.C., argued that Section 230, which provides tech companies with legal immunity, is what their lobbying is meant to protect. "The other bills won't move until they believe they can be sued in court," he said.
