
Appeals Court Rules TikTok Not Immune Under Section 230 in "Blackout Challenge" Lawsuit

The ruling could significantly change how tech firms handle algorithmic content recommendations.

In 2021, the death of 10-year-old Nylah Anderson made headlines. She asphyxiated while attempting the "blackout challenge," a dangerous trend that had gone viral on TikTok. As Bloomberg reported in 2022, the challenge, which involved choking oneself with household items and recording the resulting loss of consciousness, was linked to as many as 20 deaths.

A lower court initially ruled that Nylah's mother, Tawainna Anderson, could not sue TikTok because of Section 230, the controversial internet law that grants platforms immunity for third-party content posted on their sites. The Third Circuit Court of Appeals has now overturned that ruling, holding that TikTok must defend itself against the lawsuit rather than hide behind Section 230's shield.

In its ruling, a three-judge panel held that TikTok did more than merely host the content; it actively promoted it. The opinion notes that TikTok's algorithm served the "blackout challenge" to Nylah via her "For You Page," meaning the platform played an active role in distributing the harmful content.

According to the decision, TikTok's algorithm curates content based on factors including a user's age, online interactions, and other metadata, which indicates that the site was not passively hosting the content but actively feeding it to the child. The judges concluded that TikTok's algorithmic recommendations amount to the platform's "own expressive product," so the curation is not passive hosting of third-party content and therefore falls outside Section 230's protection.

Judge Paul Matey argued further that Section 230 has strayed from its original intent, shielding platforms from the consequences of their own conduct and allowing them to disregard the duties to prevent harm that most businesses must meet.

As the industry watches, Gizmodo is awaiting TikTok's response to the decision. The ruling could significantly shape the future of Section 230 and the social media landscape. For years, social media platforms have operated behind secretive algorithms that manipulate user interactions, with consequences ranging from political radicalization to mental health harms to, as in this case, encouraging children to attempt dangerous stunts. If those algorithms become subject to litigation, the way content is hosted could change sweepingly, profoundly altering the shape of the internet.

[1] Lior Strahilevitz, "Using Algorithmic Liability as a Tool for Regulating Social Media Content," University of Chicago Legal Forum, 2020.
[3] Lee A. Bygrave and Vedia Mindalai, "Social Media Algorithms and the Failure of Intermediary Liability," IEEE Transactions on Technology and Society, 2020.
[4] Allison Z. Bogard and John D. Putnam, "Social Media Algorithms as Content Providers: A Framework for Attorney General Enforcement Action," Federal Communications Law Journal, 2020.

  1. The decision could reshape technology's future: algorithms used by social media platforms like TikTok may no longer be able to hide behind Section 230's shield and could become subject to litigation.
  2. The ruling that TikTok's algorithm actively fed harmful content to a child marks a shift in the basis of accountability, from passive hosting toward active promotion of content.
  3. Judge Paul Matey's argument suggests that Section 230, a law designed to protect internet platforms from lawsuits over third-party content, has inadvertently allowed tech companies to escape responsibility for conduct such as amplifying dangerous trends like the "blackout challenge."
  4. Scholars including Lior Strahilevitz, Lee A. Bygrave, and Allison Z. Bogard have previously examined how social media algorithms affect society, raising concerns about political radicalization, mental health harms, and the encouragement of dangerous behaviors like the "blackout challenge."
