Washington: Meta CEO Mark Zuckerberg and the chief executives of TikTok, X, Discord and Snap faced a grilling by hostile US lawmakers on Wednesday over the dangers that children and teens face on social media platforms.
The tech chiefs were summoned before the US Senate Judiciary Committee, where they were taken to task over the effects of social media in a session titled “Big Tech and the Online Child Sexual Exploitation Crisis.”
The executives are confronting a torrent of political anger for not doing enough to shield children from online dangers, including sexual predators and content linked to teen suicide.
During one round of particularly heated questioning, Zuckerberg was made to stand up and apologize to the families of victims who had packed the committee room.
“Mister Zuckerberg, you and the companies before us, I know you don’t mean it to be so, but you have blood on your hands. You have a product that’s killing people,” Senator Lindsey Graham told the chief executives.
“Keeping young people safe online has been a challenge since the internet began, and as criminals evolve their tactics, we have to evolve our defenses too,” Zuckerberg said in his opening statement.
Zuckerberg also told the lawmakers that, according to research, social media was “on balance” not harmful to the mental health of young people.
TikTok chief executive Shou Zi Chew said: “As a father of three young children myself, I know that the issues we’re discussing today are horrific and the nightmare of every parent.”
“I intend to invest more than $2 billion in trust and safety this year alone. We have 40,000 safety professionals working on this topic,” Chew said.
Meta also said that 40,000 of its employees work on online safety and that it has invested $20 billion since 2016 to make its platforms safer.
Ahead of their testimony, Meta and X, formerly Twitter, announced new measures in anticipation of the heated session.
Meta, which owns the world’s leading social media platforms, Facebook and Instagram, said it would block direct messages sent to young teens by strangers.
By default, teens under age 16 can now only be messaged or added to group chats by people they already follow or are connected to.
Meta also tightened content restrictions for teens on Instagram and Facebook, making it harder for them to view posts that discuss suicide, self-harm or eating disorders.