Section 230 Did Not Bar Claims Against TikTok & Meta Arising From “Subway Surfing” Death

In Nazario v. ByteDance, Inc., TikTok, Inc., Meta Platforms, Inc. et al., 2025 WL 1780792 (Sup. Ct. N.Y. Cty. June 27, 2025) – a case arising from the “subway surfing” death of fifteen-year-old Zackery Nazario – the court, inter alia, denied the social media defendants’ motion to dismiss plaintiff’s complaint, rejecting their argument that plaintiff’s claims were barred by Section 230 of the Communications Decency Act (CDA).

The court distinguished this case from those (cited by defendants) in which the algorithms used to recommend content were considered non-actionable because they were “content neutral” or based on user inputs:

In contrast, here, plaintiff alleges that “Zackery Nazario did not open or use TikTok or Instagram to search for dangerous challenges,” yet the social media defendants “continued to flood Zackery — a minor — with material he was not interested in and did not want to see” (NYSCEF Doc No 3).

Plaintiff’s claims, therefore, are not based on the social media defendants’ mere display of popular or user-solicited third-party content, but on their alleged active choice to inundate Zackery with content he did not seek involving dangerous “challenges.” Plaintiff alleges that this content was purposefully fed to Zackery because of his age, as such content is popular with younger audiences and keeps them on the social media defendants’ applications for longer, and not because of any user inputs that indicated he was interested in seeing such content. Thus, based on the allegations in the complaint, which must be accepted as true on a motion to dismiss, it is plausible that the social media defendants’ role exceeded that of neutral assistance in promoting content, and constituted active identification of users who would be most impacted by the content (Nat’l Coalition on Black Civic Participation v Wohl, 2021 U.S. Dist. LEXIS 177589, *8, 26-27 [SDNY 2021] [communications company “identifi[ed] predominantly Black zip codes to amplify the intimidating nature of the robocall message and thereby achieve the goal of voter suppression”]; Patterson v Meta Platforms, Inc., 2024 N.Y. Misc. LEXIS 2312, *4-5 [SC Erie Co 2024] [denying defendant’s motion to dismiss where plaintiffs alleged that Meta hosts “sophisticated products designed to be addictive to young users and they specifically directed Gendron to [] postings that indoctrinated him with ‘white replacement theory’ ”]).

Notably, plaintiff does not demand that the social media defendants take on “a publisher’s traditional editorial functions–such as deciding whether to publish, withdraw, postpone or alter content” (Shiamili, 17 NY3d at 288; Nasca v Bytedance Ltd., 2025 NY Misc LEXIS 2255 [SC Suffolk Co 2025] [plaintiffs “admit[tedly allege] that Chase died because TikTok failed to adequately monitor and remove suicidal content. Accordingly, plaintiffs seek to hold defendants liable for its exercise of a publisher’s traditional editorial functions”]). Rather, plaintiff asserts that the social media defendants should not be permitted to actively target young users of its applications with dangerous “challenges” before the user gives any indication that they are specifically interested in such content and without warning. If the social media defendants are in fact targeting children as alleged in the complaint, they “could [] satisf[y] [their] alleged obligation–to take reasonable measures to design a product more useful than it was foreseeably dangerous–without altering the content that [their] users generate” (Lemmon, 995 F3d at 1092 [internal quotation marks omitted]).

[Citations omitted.]

The court was careful to note, however, that while the social media defendants might later be able to prove that their platforms are not products and that they are entitled to the CDA’s protections, the court was required to rule on the basis of the complaint’s allegations, and not “facts” asserted by the defendants.
