A US appeals court has revived a devastating lawsuit filed by the mother of a 10-year-old girl who died after attempting a viral “blackout” challenge promoted on TikTok, and its ruling chips away at the protections provided by Section 230 of the Communications Decency Act. Those protections have essentially wiped away any liability social media companies had for content posted on their platforms, but that era of dodging accountability may have just ended.

The horrifying ‘challenge’ encouraged users of the social media platform to choke themselves until they lost consciousness. Nylah Anderson, the 10-year-old who lost her life to this horrific trend, accidentally hanged herself.

A total of 20 deaths have reportedly been caused by this challenge, according to research from Bloomberg News. Tawainna Anderson, Nylah’s mother, decided to take ByteDance – the parent company of TikTok – to court, blaming it for failing to protect her child from being exposed to this kind of content.

Mrs. Anderson claims that TikTok’s recommendation algorithm – the engine behind the “For You Page” (FYP) – should never have served this deadly challenge to her daughter. Although a lower court initially dismissed the case, citing the protections provided by Section 230, the Philadelphia-based 3rd US Circuit Court of Appeals gave the lawsuit another chance and opened the door to reassessing ByteDance’s liability.

Judge Says TikTok’s Algorithm Should Be Considered Editorial Content

At the heart of Judge Patty Shwartz’s decision to reinstate the case lies a distinction between passively hosting third-party content and actively promoting it through proprietary recommendation algorithms designed to keep users engaged. Dangerous trends and challenges have done staggering harm to TikTok’s massive user base, which is in large part made up of minors and young adults. Until now, Section 230 has protected TikTok and other social media apps (some of which appear even worse in this regard) from facing the sometimes deadly consequences of their recommendation algorithms.

[Chart: TikTok users by age, 2022]

In this regard, the panel leaned on the US Supreme Court’s recent ruling in Moody v. NetChoice that a platform’s algorithmic curation reflects its own “editorial judgments.” If those judgments amount to the platform’s first-party speech, the appeals court reasoned, then Section 230 – which covers only third-party content – cannot shield a platform when the content its system recommends harms users.


The court’s opinion states: “TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech.” This reasoning suggests that when platforms use algorithms to curate and recommend content, they are not merely passive hosts but active participants in the content distribution process.

Shwartz noted that if Nylah had searched for the blackout challenge on TikTok, “then TikTok may be viewed more like a repository of third-party content than an affirmative promoter of such content.”

Setting a Strong Precedent for Others to Pursue Similar Cases Against ‘Big Tech’

This ruling could have far-reaching consequences for social media platforms and their content recommendation systems. If upheld, it suggests that platforms can be held liable for content they algorithmically recommend to users, even when that content was originally created by third parties – a shift that would upend the social media industry almost overnight.


Jeffrey Goodman, the lawyer representing Tawainna Anderson, hailed the decision, stating that “Big Tech just lost its ‘get-out-of-jail-free card.’” These comments echo the growing frustration with the broad protections afforded to tech companies under Section 230 and the potential for this ruling to pave the way for increased accountability.

The decision also highlights the tension between platform autonomy and user safety, particularly when it comes to young users. Judge Paul Matey, in a partially concurring opinion, criticized TikTok’s “pursuit of profits above all other values,” suggesting that the platform has deliberately chosen to serve children content of the “lowest virtues.” A company that makes that choice, he argued, cannot claim a level of immunity that Congress never intended to provide.

Section 230: A Brief Overview

To understand the significance of this ruling, it helps to review the role Section 230 has played in shaping the internet as we know it. Enacted as part of the Communications Decency Act in 1996, Section 230 gives online platforms broad immunity from legal liability for content posted by their users.

The law states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

This protection has been instrumental in allowing social media platforms, forums, and other interactive online services to flourish without the fear of being held responsible for every piece of content shared on their platforms.

However, critics argue that Section 230 has been interpreted too broadly, shielding platforms from any kind of responsibility even in cases where they play an active role in promoting or amplifying harmful content.

Anderson’s case against TikTok represents a potential shift in how courts may interpret the law in the context of algorithmic content recommendations.

TikTok Declines to Comment While an Expert Argues the Precedent May Be Limited

As of the court’s ruling, TikTok had not responded to requests for comment on the decision. However, the company has previously stated that it is committed to providing a safe environment for users and to removing content related to dangerous challenges like the “blackout” challenge.


Legal experts are divided on the potential implications of this ruling. Some view it as a necessary recalibration of Section 230 protections in light of the increasingly sophisticated algorithms used by platforms like TikTok. Others worry that it could open the floodgates to a wave of lawsuits that could stifle innovation and free speech online.

Eric Goldman, a professor at Santa Clara University School of Law and an expert on Section 230, cautioned that, although the ruling is significant, its impact may be limited. “This decision is specific to the facts of this case and the Third Circuit’s interpretation,” Goldman noted. “It remains to be seen how other courts will view similar cases and whether this reasoning will be adopted more broadly.”

Lower Court Will Take a Second Look at the Case Without Section 230 Protections

The case has now been remanded to the lower court, where TikTok will have to defend itself against Anderson’s claims without the shield of Section 230. The process could reveal more details about the inner workings of the platform’s recommendation algorithm and how it decides which content to promote to users, particularly minors.

If the court ultimately upholds the idea that platforms can be held liable for the content their algorithms promote, platforms may need to adopt drastically more stringent content moderation practices or reconsider how they curate content for different age groups.

Moreover, this ruling could spark renewed debate in Congress about reforming Section 230. Critics have long argued that courts interpret the law too broadly in favor of tech companies, while supporters maintain that it is essential for preserving free speech online.

The case of Nylah Anderson highlights the complex challenges that social media platforms face these days as they attempt to balance user engagement, innovation, and safety.

While algorithmic recommendations have become a cornerstone of the social media experience, they also raise questions about a platform’s responsibility not to promote potentially harmful content through these sophisticated systems, especially to children.

As the case proceeds, it will be closely followed by legal experts, tech companies, and policymakers alike. The outcome could shape the future of content moderation and algorithmic recommendations.

For Tawainna Anderson and other families who have lost loved ones to dangerous online challenges, this ruling represents a step towards higher levels of accountability.