States Sue TikTok For Harming Young People In Latest Challenge To Social Media Giant


A bipartisan group of 14 attorneys general filed a lawsuit against social media juggernaut TikTok, accusing it of misleading the public about the safety of its platform and harming young people’s mental health.

The lawsuits, filed individually by each member of the coalition, allege that the wildly popular video-sharing platform violated state laws by falsely claiming its platform is safe for young people. TikTok in a statement today called many of the claims “inaccurate and misleading.”

The action, co-led by New York Attorney General Letitia James and California Attorney General Rob Bonta, says many young users are struggling with poor mental health and body image issues due to the platform’s addictive features and are being injured, hospitalized, or dying because of TikTok challenges (viral videos that encourage users to perform certain activities) created and promoted on the platform. The attorneys general are seeking “civil penalties, punitive damages, and injunctive relief.”

The NY suit aims “to remedy past and ongoing fraudulent, deceptive, and unlawful practices by TikTok … and to hold TikTok accountable for the harms it has inflicted on the youngest New Yorkers by falsely marketing and promoting its addictive and otherwise harmful mobile social media app and website in this State.” It said depression, anxiety, eating disorders, and suicidal ideation “have all reached record levels among children in New York and elsewhere” and “a growing body of evidence isolates addictive social media as a key driver of the youth mental health crisis.”

“We strongly disagree with these claims, many of which we believe to be inaccurate and misleading,” TikTok said. “We’re proud of and remain deeply committed to the work we’ve done to protect teens and we will continue to update and improve our product. We provide robust safeguards, proactively remove suspected underage users, and have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16. We’ve endeavored to work with the Attorneys General for over two years, and it is incredibly disappointing they have taken this step rather than work with us on constructive solutions to industrywide challenges.”

Today’s suits echo various complaints against Instagram, Snap and other social media platforms. A difference, however, is that TikTok, a division of Chinese conglomerate ByteDance, risks being banned by the Biden administration over national security concerns. The President signed a national security bill in April requiring that ByteDance divest TikTok or face removal from U.S. app stores.

TikTok CEO Shou Chew said then that “the facts and the Constitution are on our side, and we expect to prevail again.” He was referring to a challenge by the Trump administration that bogged down in courts and faded away. The latest action, which may be more durable, gave ByteDance nine months to divest, although that could be extended by lawmakers.

TikTok sued the federal government in May, saying the law violated the First Amendment. A federal appeals court in Washington, D.C., heard arguments last month. It’s thought the case will wind up at the Supreme Court.

In another suit, TikTok was sued in August by the U.S. Department of Justice and the Federal Trade Commission for violating the Children’s Online Privacy Protection Act by knowingly collecting and retaining personal information from children under the age of 13.

According to the lawsuits filed today, TikTok’s underlying business model focuses on maximizing young users’ time on the platform so the company can boost revenue from selling targeted ads. TikTok uses an addictive content-recommendation system designed to keep minors on the platform as long as possible and as often as possible, despite the dangers of compulsive use.

“Young people are struggling with their mental health because of addictive social media platforms like TikTok,” said AG James in a statement. “TikTok claims that their platform is safe for young people, but that is far from true. In New York and across the country, young people have died or gotten injured doing dangerous TikTok challenges and many more are feeling more sad, anxious, and depressed because of TikTok’s addictive features. Today, we are suing TikTok to protect young people and help combat the nationwide youth mental health crisis. Kids and families across the country are desperate for help to address this crisis, and we are doing everything in our power to protect them.” 

“Our investigation has revealed that TikTok cultivates social media addiction to boost corporate profits. TikTok intentionally targets children because they know kids do not yet have the defenses or capacity to create healthy boundaries around addictive content,” said AG Bonta. “When we look at the youth mental health crisis and the revenue machine TikTok has created, fueled by the time and attention of our young people, it’s devastatingly obvious: our children and teens never stood a chance against these social media behemoths. TikTok must be held accountable for the harms it created in taking away the time — and childhoods — of American children.”

The U.S. Surgeon General, after issuing a series of advisories starting in 2021, in June called on Congress to require warning labels on social media platforms given the impact on the mental health of young people, the suit noted.

Addictive features cited in the suits include “around-the-clock notifications” that can lead to poor sleep patterns; autoplay of “an endless stream of videos that manipulates users into compulsively spending more time on the platform with no option to disable”; TikTok stories and live content “that is only available temporarily to entice users to tune in immediately or lose the opportunity to interact”; a highlighted likes and comments section that serves as a form of social validation, which can impact young users’ self-esteem; and beauty filters that alter one’s appearance and can lower young users’ self-esteem.
