Section 230 protects Snapchat against lawsuit brought by assault victim


A young girl named C.O. suffered serious harm using Snapchat. Her parents (the plaintiffs in this lawsuit) alleged that the app’s features caused her to become addicted, exposed her to sexual content, and eventually led to her being victimized on two occasions, including once by a registered sex offender.

Suing Snapchat

Plaintiffs sued Snap and related entities, asserting claims including strict product liability, negligence, and invasion of privacy, emphasizing the platform’s failure to protect minors and address reported abuses. Defendants moved to strike the complaint.

The court granted the motion to strike. It held that the allegations of the complaint fell squarely within the ambit of immunity afforded under Section 230 to “an interactive computer service” that acts as a “publisher or speaker” of information provided by another “information content provider.” Plaintiffs “clearly allege[d] that the defendants failed to regulate content provided by third parties” when such third parties used Snapchat to harm plaintiff.

Publisher or speaker? How about those algorithms!

Plaintiffs had argued that their claims did not seek to treat defendants as publishers or speakers, and therefore Section 230 immunity did not apply. Instead, plaintiffs argued, they were asserting claims that defendants breached their duty as manufacturers to design a reasonably safe product.

Of particular interest was the plaintiffs’ claim concerning Snapchat’s algorithms, which recommended connections and which allegedly caused children to become addicted. But in line with Force v. Facebook, Inc., 934 F.3d 53 (2d Cir. 2019), the court refused to find that use of algorithms in this way fell outside the traditional role of a publisher. It was careful to distinguish the case from Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021), in which that court held Section 230 did not immunize Snapchat from products liability claims. In Lemmon, the harm to the plaintiffs did not result from third party content but rather from the design of the platform itself, which tempted users to drive fast. Here, by contrast, the harm to plaintiffs resulted from the particular actions of third parties who transmitted content using Snapchat to lure C.O.

Sad facts, sad result

The court seemed to express some trepidation about its result, using the same language the First Circuit Court of Appeals used in Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 15 (1st Cir. 2016): “This is a hard case – hard not in the sense that the legal issues defy resolution, but hard in the sense that the law requires that [the court] … deny relief to plaintiffs whose circumstances evoke outrage.” And quoting Vazquez v. Buhl, 90 A.3d 331 (Conn. App. Ct. 2014), the court observed that “[w]ithout further legislative action, however, there is little [this court can] do in [its] limited role but join with other courts and commentators in expressing [its] concern with the statute’s broad scope.”

V.V. v. Meta Platforms, Inc. et al., 2024 WL 678248 (Conn. Super. Ct., February 16, 2024)
