Former Meta employee tells Senate company failed to protect teens’ safety
NEW YORK: A former Meta employee testified before a US Senate subcommittee on Tuesday, alleging that the parent company of Facebook and Instagram knew about harassment and other harms faced by teenagers on its platforms, but failed to remedy this.
The employee, Arturo Bejar, worked on wellness for Instagram from 2019 to 2021 and previously served as director of engineering for Facebook’s Protect and Care team from 2009 to 2015, he said.
Bejar testifies before the Senate Judiciary Subcommittee on Privacy, Technology and the Law during a hearing on social media and its impact on adolescent mental health.
“It’s time that the public and parents understand the true level of harm posed by these ‘products’ and it’s time that young users have the tools to report and suppress online abuse,” he said in written remarks made available before the hearing.
Bejar’s testimony comes amid a bipartisan push in Congress to pass legislation that would require social media platforms to provide parents with tools to protect children online.
The goal of his work at Meta was to shape the design of Facebook and Instagram in ways that would nudge users toward more positive behaviors and give young people tools to manage unpleasant experiences, Bejar said during the hearing.
Meta said in a statement that it is committed to protecting young people online, emphasizing its support for the same user surveys cited by Bejar in his testimony and its creation of tools such as anonymous notifications of potentially hurtful content.
“Every day, countless people inside and outside of Meta are working on how to keep young people safe online,” Meta’s statement said. “All this work continues.”
Bejar told senators that he met regularly with the company’s top executives, including Chief Executive Mark Zuckerberg, and considered them supportive of the work at the time. However, he later concluded that the leadership had decided “time and time again not to tackle this issue,” he testified.
In a 2021 email, Bejar flagged to Zuckerberg and other top executives internal data revealing that 51 percent of Instagram users had reported having a bad or harmful experience on the platform within the previous seven days, and that 24.4 percent of children aged 13 to 15 had reported receiving unwanted sexual advances.
He also told them that his own 16-year-old daughter had received misogynistic comments and lewd photos, without adequate tools to report these experiences to the company. The existence of the email was first reported by the Wall Street Journal.
In his testimony, Bejar said that during a meeting, Chris Cox, Meta’s chief product officer, was able to cite specific statistics on harms to teenagers off the top of his head.
“I found it heartbreaking because it meant they knew and they weren’t acting on it,” Bejar said.