Court Says Section 230 Doesn’t Shield TikTok From ‘Blackout’ Challenge Lawsuit

In 2021, a 10-year-old girl named Nylah Anderson died after choking herself with a dog leash she had wrapped around her neck. Anderson had been prompted to attempt this dangerous behavior by the “blackout challenge,” a viral game circulating on TikTok at the time. In 2022, Bloomberg reported that the challenge, which encouraged children to choke themselves with household items and then film their own loss of consciousness (and, in most cases, subsequent revival), had been linked to as many as 20 deaths.

A court previously held that Anderson’s mother, Tawainna Anderson, couldn’t sue TikTok because of Section 230, the controversial internet law that affords platforms legal immunity for content posted by third parties on their sites. Now, a federal appeals court has overturned that ruling, holding that TikTok must defend itself against the lawsuit without invoking Section 230 as a shield.

In an opinion handed down by the U.S. Court of Appeals for the Third Circuit, a three-judge panel held that TikTok cannot hide behind the internet law to shield itself from the suit. Indeed, the opinion argues, Anderson’s daughter didn’t just happen to come across the “blackout challenge” while browsing TikTok. Instead, the platform’s algorithm served the “challenge” to her via her “For You Page,” which indicates that the site played an active role in distributing the material.

“TikTok’s algorithm is not based solely on a user’s online inputs,” the decision reads. “Rather, the algorithm curates and recommends a tailored compilation of videos for a user’s FYP based on a variety of factors, including the user’s age and other demographics, online interactions, and other metadata.” In other words, TikTok wasn’t just passively hosting the content but actively feeding it to the girl. “ICSs [interactive computer services] are immunized only if they are sued for someone else’s expressive activity or content (i.e., third-party speech), but they are not immunized if they are sued for their own expressive activity or content,” the decision continues.

Citing the Supreme Court’s recent decision in Moody v. NetChoice, the opinion notes that “a platform’s algorithm that reflects ‘editorial judgments’ about ‘compiling the third-party speech it wants in the way it wants’ is the platform’s own ‘expressive product’ and is therefore protected by the First Amendment.” The panel’s logic follows directly: if a platform’s algorithmic curation counts as its own “speech,” then it is first-party expression rather than passively hosted third-party content, and Section 230, which covers only the latter, does not protect it.

One judge, Judge Paul Matey, wrote in a partial concurrence that Section 230 jurisprudence had drifted from the law’s original intent, evolving into a statute that “immunizes platforms from the consequences of their own conduct and permits platforms to ignore the ordinary obligation that most businesses have to take reasonable steps to prevent their services from causing devastating harm.”

Gizmodo reached out to TikTok for comment.

The court’s decision raises big questions about the future of Section 230, as well as about the future of the social media industry. For years, social media platforms have operated largely as black boxes, using secret, closed-source algorithms to determine what content users see and interact with. That algorithmic curation has been blamed for political radicalization, mental health harms, and, in cases like this one, for encouraging children to engage in dangerous behavior. If companies’ algorithms become a source of litigation, it could drastically change the way they host content, which would, in turn, reshape the internet.
