TikTok Faces Lawsuit Over 10-Year-Old Girl's Death After Viral Challenge, U.S. Court Rules

A U.S. appeals court has allowed a lawsuit against TikTok to proceed after a 10-year-old girl's death from a viral challenge, calling the platform's algorithmic content recommendations into question.

Aug 28, 2024 - 09:22

A recent U.S. appeals court decision has ruled that TikTok must face a lawsuit from the mother of a 10-year-old girl who tragically died after attempting a dangerous "blackout challenge" that went viral on the platform. The challenge, which dares participants to choke themselves until they lose consciousness, was reportedly suggested to the girl by TikTok's algorithm.

Court Allows Lawsuit Against TikTok to Proceed

Although internet companies are usually protected by federal law from being sued over user-posted content, the 3rd U.S. Circuit Court of Appeals in Philadelphia has decided that this protection does not apply to TikTok in this case. The court ruled that TikTok could be held liable because its algorithm allegedly promoted the dangerous challenge to the young girl, Nylah Anderson, who later died attempting it.

Judge Patty Shwartz, writing for the court, explained that while Section 230 of the Communications Decency Act of 1996 protects platforms from liability for third-party content, it does not cover situations where the platform itself recommends content through its algorithms. According to the judge, TikTok’s algorithm actively selects and promotes specific content to users, making it part of the platform’s own speech rather than simply hosting user-generated content.


Supreme Court Ruling Influences the Decision

This ruling signals a shift from earlier interpretations of Section 230, which had generally shielded online platforms from accountability for user content. The change follows a recent U.S. Supreme Court decision that clarified how algorithmic content curation is treated under the law. The Supreme Court found that when a platform's algorithm promotes certain content, the platform is making an "editorial judgment," and the appeals court reasoned that this kind of first-party decision falls outside Section 230's protection.

Judge Shwartz noted that TikTok’s content promotion practices are considered the company’s own actions, which means they are not shielded by the law.

The Case Behind the Lawsuit

The lawsuit was filed by Tawainna Anderson, the mother of Nylah Anderson, who died in 2021 after attempting the "blackout challenge" with a purse strap at her home. Anderson argued that TikTok's algorithm recommended the dangerous challenge to her daughter, ultimately leading to her death.

This decision from the appeals court overturns a lower court ruling that had previously dismissed the case on the grounds of Section 230 immunity. Anderson's attorney, Jeffrey Goodman, praised the ruling, stating that it removes a significant legal defense often used by tech companies. “Big Tech just lost its ‘get-out-of-jail-free card,’” Goodman said.

Implications for TikTok and Other Social Media Platforms

Judge Paul Matey, concurring in part, criticized TikTok for allegedly prioritizing profit over user safety. He argued that TikTok cannot claim legal immunity when it actively promotes harmful content to children. The case could set a new precedent for how tech companies are held accountable for their algorithm-driven content recommendations.

The ruling may encourage further legal actions against social media platforms when harmful content reaches vulnerable users. As of now, TikTok has not responded to the court’s decision, but the company may need to reassess its content moderation and recommendation practices to avoid similar lawsuits in the future.

By narrowing the legal protections long assumed under Section 230, the decision opens the door to greater scrutiny of how platforms use algorithms to suggest content. Social media companies may need to rework their recommendation systems and policies to limit legal exposure, particularly around content that could harm young users.

Stay tuned to iShook Finance for more updates on this case and its implications for the tech industry.


iShook Opinion is curated and guided by Founder and CEO Beni E Rachmanov. Visit ishookfinance.com for expert articles and the latest finance news.