In Anderson v. TikTok, Inc., 2024 WL 3948248 (3d Cir. August 27, 2024), the court reversed and vacated a lower court’s decision dismissing plaintiff’s complaint against TikTok and ByteDance, Inc. on the ground that the defendants were immunized by the Communications Decency Act (“CDA”), 47 U.S.C. § 230.
The court summarized the facts as follows:
TikTok, Inc., via its algorithm, recommended and promoted videos posted by third parties to ten-year-old Nylah Anderson on her uniquely curated “For You Page.” One video depicted the “Blackout Challenge,” which encourages viewers to record themselves engaging in acts of self-asphyxiation. After watching the video, Nylah attempted the conduct depicted in the challenge and unintentionally hanged herself.
Here is the court’s analysis:
Congress enacted § 230 of the CDA to immunize interactive computer services (“ICSs”) from liability based on content posted by third parties in certain circumstances. See F.T.C. v. Accusearch Inc., 570 F.3d 1187, 1195 (10th Cir. 2009). Section 230 immunizes ICSs only to the extent that they are sued for “information provided by another information content provider.” 47 U.S.C. § 230(c)(1). In other words, ICSs are immunized only if they are sued for someone else’s expressive activity or content (i.e., third-party speech), but they are not immunized if they are sued for their own expressive activity or content (i.e., first-party speech).
Anderson asserts that TikTok’s algorithm “amalgamat[es] [ ] third-party videos,” which results in “an expressive product” that “communicates to users … that the curated stream of videos will be interesting to them[.]” ECF No. 50 at 5. The Supreme Court’s recent discussion about algorithms, albeit in the First Amendment context, supports this view. In Moody v. NetChoice, LLC, the Court considered whether state laws that “restrict the ability of social-media platforms to control whether and how third-party posts are presented to other users” run afoul of the First Amendment. ––– U.S. ––––, 144 S. Ct. 2383, 2393, ––– L.Ed.2d –––– (2024). The Court held that a platform’s algorithm that reflects “editorial judgments” about “compiling the third-party speech it wants in the way it wants” is the platform’s own “expressive product” and is therefore protected by the First Amendment. Id. at 2394.
Given the Supreme Court’s observations that platforms engage in protected first-party speech under the First Amendment when they curate compilations of others’ content via their expressive algorithms, id. at 2409, it follows that doing so amounts to first-party speech under § 230, too. See Doe ex rel. Roe v. Snap, Inc., ––– U.S. ––––, 144 S. Ct. 2493, 2494, ––– L.Ed.2d –––– (2024) (Thomas, J., dissenting from denial of certiorari) (observing that “[i]n the platforms’ world, they are fully responsible for their websites when it results in constitutional protections, but the moment that responsibility could lead to liability, they can disclaim any obligations and enjoy greater protections from suit than nearly any other industry.”).
Here, as alleged, TikTok’s FYP algorithm “[d]ecid[es] on the third-party speech that will be included in or excluded from a compilation—and then organiz[es] and present[s] the included items” on users’ FYPs. NetChoice, 144 S. Ct. at 2402. Accordingly, TikTok’s algorithm, which recommended the Blackout Challenge to Nylah on her FYP, was TikTok’s own “expressive activity,” id., and thus its first-party speech. Such first-party speech is the basis for Anderson’s claims. See App. 39 (Compl. ¶ 107(k), (o)) (alleging, among other things, that TikTok’s FYP algorithm was defectively designed because it “recommended” and “promoted” the Blackout Challenge).
The court concluded that “Section 230 immunizes only information ‘provided by another’” and that “here, because the information that forms the basis of Anderson’s lawsuit—i.e., TikTok’s recommendations via its FYP algorithm—is TikTok’s own expressive activity, § 230 does not bar Anderson’s claims.”