Meta whistleblower: Regulators are our ‘last hope’ at fixing social media

LONDON — A former insider at tech giant Meta has said that social media companies are failing to keep kids safe online, and that government regulators must step in.

“Regulators are our last hope at peace. They really are our last hope,” Arturo Béjar told POLITICO at a cafe in central London on Tuesday, shortly before a meeting with the country’s media regulator Ofcom, which will be tasked with enforcing Britain’s sprawling new internet rulebook, the Online Safety Act.

Béjar, who worked on safety at Meta during two stints over the last decade, claims his efforts to flag problems to senior leadership at the company went unheeded. He decided to go public with his concerns last year, testifying before the United States Congress in November that children, including his own daughter and her friends, were receiving unwanted sexual advances on Instagram.

According to the whistleblower, many of the measures he had implemented during his first stint at Meta, such as tools designed to make it easier to report problems, had disappeared by the time he returned to the company as a consultant in 2019.

“When you hear from the inside from Arturo [Béjar] that it isn’t getting better, it’s getting worse, what other mechanism do we have to say to these companies that this is not good enough? The only mechanism I can see is regulation,” said Ian Russell of the Molly Rose Foundation, which is shepherding Béjar around meetings with U.K. policymakers, including the Science and Tech Secretary Michelle Donelan, Security Minister Tom Tugendhat and Labour’s Shadow Technology Minister Peter Kyle.

“It’s patently obvious that self-regulation has failed. Molly would still be alive today if the platforms had been better regulated,” said Russell, referring to his daughter Molly Russell, who took her own life after being exposed to a stream of self-harm content on platforms like Instagram and Pinterest.

Meta said it had introduced “more than 30 tools and resources to support teens and their families.”

Béjar, who lives in California, said he was in the U.K. because it is “the furthest ahead in the world when it comes to trying to put in a framework that protects kids.”

Britain’s “Children’s code,” a framework designed to protect kids online, is widely considered world leading and has inspired lawmakers as far away as California. Last year Britain followed jurisdictions including the EU by bringing in tough new online content rules, known as the Online Safety Act.

In a nod to Meta’s rollout of end-to-end encryption on Facebook and Instagram, Béjar said that anyone who thought the technology should be introduced to kids’ private messages should be ruled out of a job at Ofcom, which has launched a major recruitment drive ahead of the act coming into force.

“When a child disappears, that’s how you find them,” he said, referring to law enforcement’s ability to access private messages that aren’t end-to-end encrypted.

Encryption emerged as a key point of contention during the Online Safety Act’s journey into law, with some legal experts arguing it would invest Ofcom with powers to order companies to break encryption over child safety concerns.

A Meta spokesperson said: “Every day countless people inside and outside of Meta are working on how to help keep young people safe online. Working with parents and experts, we have introduced over 30 tools and resources to support teens and their families in having safe, positive experiences online. All of this work continues.”

Measures Meta has introduced to protect kids include automatically setting teens’ Instagram accounts to private and limiting the ability of adults to message children. The company also lets people report issues such as bullying, harassment, nudity and sexual activity on the platform.
