Section 230 protects Snapchat against lawsuit brought by assault victim


A young girl named C.O. found much misfortune using Snapchat. Her parents (the plaintiffs in this lawsuit) alleged that the app’s features caused her to become addicted to the app, to be exposed to sexual content, and to eventually be victimized on two occasions, including once by a registered sex offender.

Suing Snapchat

Plaintiffs sued Snap and related entities, asserting claims including strict product liability, negligence, and invasion of privacy, emphasizing the platform’s failure to protect minors and address reported abuses. Defendants moved to strike the complaint.

The court granted the motion to strike. It held that the allegations of the complaint fell squarely within the ambit of immunity afforded under Section 230 to “an interactive computer service” that acts as a “publisher or speaker” of information provided by another “information content provider.” Plaintiffs “clearly allege[d] that the defendants failed to regulate content provided by third parties” when such third parties used Snapchat to harm plaintiff.

Publisher or speaker? How about those algorithms!

Plaintiffs had argued that their claims did not seek to treat defendants as publishers or speakers, and therefore Section 230 immunity did not apply. Instead, plaintiffs argued, they were asserting claims that defendants breached their duty as manufacturers to design a reasonably safe product.

Of particular interest was the plaintiffs’ claim concerning Snapchat’s algorithms, which recommended connections and which allegedly caused children to become addicted. But in line with Force v. Facebook, Inc., 934 F.3d 53 (2d Cir. 2019), the court refused to find that the use of algorithms in this way fell outside the traditional role of a publisher. It was careful to distinguish the case from Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021), in which that court held Section 230 did not immunize Snapchat from products liability claims. In Lemmon, the harm to the plaintiffs did not result from third party content but rather from the design of the platform, which tempted users to drive fast. Here, by contrast, the harm was the result of particular actions of third parties who transmitted content using Snapchat to lure C.O.

Sad facts, sad result

The court seemed to express some trepidation about its result, using the same language the First Circuit Court of Appeals used in Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 15 (1st Cir. 2016): “This is a hard case – hard not in the sense that the legal issues defy resolution, but hard in the sense that the law requires that [the court] … deny relief to plaintiffs whose circumstances evoke outrage.” And quoting Vazquez v. Buhl, 90 A.3d 331 (Conn. App. Ct. 2014), the court observed that “[w]ithout further legislative action, however, there is little [this court can] do in [its] limited role but join with other courts and commentators in expressing [its] concern with the statute’s broad scope.”

V.V. v. Meta Platforms, Inc. et al., 2024 WL 678248 (Conn. Super. Ct., February 16, 2024)

See also:

Communications Decency Act immunizes hosting provider from defamation liability

Johnson v. Arden, — F.3d —, 2010 WL 3023660 (8th Cir. August 4, 2010)

The Johnsons sell exotic cats. They filed a defamation lawsuit after discovering that some other cat-fanciers said mean things about them on Complaintsboard.com. Among the defendants was the company that hosted Complaintsboard.com – InMotion Hosting.

Sassy is my parents' cat. She hisses whenever I'm around, though they say she's a nice cat otherwise.

The district court dismissed the case against the hosting company, finding that the Communications Decency Act at 47 U.S.C. §230 (“Section 230”) immunized the hosting provider from liability. The Johnsons sought review with the Eighth Circuit Court of Appeals. On appeal, the court affirmed the dismissal.

Though Section 230 immunity has been around since 1996, this was the first time the Eighth Circuit had been presented with the question.

Section 230 provides, in relevant part, that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” It also says that “[n]o cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.”

The Johnsons argued that Section 230 did not immunize the hosting company. Instead, they argued, it did just what it says – provides that a party in the position of the hosting company should not be treated as a publisher or speaker of information provided by third parties. The Johnsons argued that the host should be liable in this case regardless of Section 230, because under Missouri law, defendants can be jointly liable when they commit a wrong by concert of action and with common intent and purpose.

The court rejected the Johnsons’ argument, holding that Section 230 bars plaintiffs from making providers legally responsible for information that third parties created and developed. Adopting the Fourth Circuit’s holding in Nemet Chevrolet v. Consumeraffairs.com, the court held that “Congress thus established a general rule that providers of interactive computer services are liable only for speech that is properly attributable to them.”

No evidence in the record showed how the offending posts could be attributed to the hosting provider. It was undisputed that the host did not originate the material that the Johnsons deemed damaging.

Given this failure to show the content originated with the provider, the court found in favor of robust immunity, joining with the majority of other federal circuits that have addressed intermediary liability in the context of Section 230.

How Section 230 is like arson laws when it comes to enjoining website operators

The case of Blockowicz v. Williams, — F.Supp.2d —, 2009 WL 4929111 (N.D. Ill. December 21, 2009), which I posted on last week, is worthy of discussion in that it raises the question of whether website operators like Ripoff Report could get off too easily when they knowingly host harmful third party content. Immunity under 47 U.S.C. 230 is often criticized for going too far in shielding operators. Under Section 230, sites cannot be treated as the publisher or speaker of information provided by third party information content providers. This means that even when the site operator is put on notice of the content, it cannot face, for example, defamation liability for the continued availability of that content.

Don’t get me wrong — the Blockowicz case had nothing to do with Section 230. Although Ben Sheffner is routinely sharp in his legal analysis, I disagree with his assessment that Section 230 was the reason for the court’s decision. In the comments to Ben’s post that I just linked to, Ben gets into conversation with Ripoff Report’s general counsel, who I believe correctly notes that the decision was not based on Section 230. Ben argues that had Section 230 not provided immunity, the plaintiffs would have been able to go after Ripoff Report directly, and therefore Section 230 is to blame. That’s kind of like saying if arson were legal, plaintiffs could just go burn down Ripoff Report’s datacenter. But you don’t hear anyone blaming arson laws for this decision.

Even though Section 230 didn’t form the basis of the court’s decision in favor of Ripoff Report, the notion of a website operator “acting in concert” with its users is intriguing. Clearly the policy of Section 230 is to place some distance, legally speaking, between site operator and producer of user-generated content. And the whole idea behind the requirement in copyright law that infringement must arise from a volitional act and not an automatic action of the system is a first cousin to this issue. See, e.g., Religious Tech. Center v. Netcom, 907 F.Supp. 1361, 1370 (N.D. Cal. 1995) (“[T]here should still be some element of volition or causation which is lacking where a defendant’s system is merely used to create a copy by a third party”).

For the web to continue to develop, we are going to need this continued protection of the intermediary. We’re going to see functions of the semantic web appear with more frequency in our everyday online lives. From a practical perspective, there will be even more distance — a continuing divergence between a provider’s will and the nature of the content. So as we get into the technologies that will make the web smarter, and our experience of it more robust and helpful, we’ll need notions of intermediary immunity more and not less.

That notion of an increasing need for intermediary immunity underscores how important it is that intermediaries act responsibly. No doubt people misunderstand the holdings of cases like this one. By refusing to voluntarily take down obviously defamatory material, and challenging a court order to do so, Ripoff Report puts a bad taste in everyone’s mouth. Sure there’s the First Amendment and all that, but where’s a sense of reasonable decency? Sure there’s the idea that free flowing information supports democracy and all that, but has anyone stopped to think what could happen when the politicians get involved again?

Do not taunt Happy Fun Ball

We are fortunate that Congress was as equanimous and future-minded as it was in 1996 when it enacted the immunity provisions of Section 230. But results like the one in the Blockowicz case are going to be misunderstood. There’s a hue and cry already about this decision, in that it appears to leave no recourse. Section 230 wasn’t involved, but it still got the blame. Even the judge was “sympathetic to the [plaintiffs’] plight.”

So maybe we need, real quickly, another decision like the Roommates.com case, that reminds us that website operators don’t always get a free ride.

Injunction against defamatory content could not reach website owner

Blockowicz v. Williams, — F.Supp.2d —, 2009 WL 4929111 (N.D. Ill. December 21, 2009)

(This is a case from last month that has already gotten some attention in the legal blogosphere, and is worth reporting on here in spite of the already-existing commentary.)

Plaintiffs sued two individual defendants for defamation over content those defendants posted online. The court entered an order of default after the defendants didn’t answer the complaint. The court also issued an injunction against the defendants, requiring them to take down the defamatory material.


When plaintiffs were unable to reach the defendants directly, they asked the websites on which the content was posted — MySpace, Facebook, Complaints Board and Ripoff Report — to remove the material.

All of the sites except Ripoff Report took down the defamatory content. Plaintiffs filed a motion with the court to get Ripoff Report to remove the material. Ripoff Report opposed the motion, arguing that Rule 65 (the federal rule pertaining to injunctions) did not give the court authority to bind Ripoff Report as a non-party. The court sided with Ripoff Report and denied the motion.

Federal Rule of Civil Procedure 65 states that injunctions bind the parties against whom they are issued as well as “other persons who are in active concert or participation with” those parties. In this case, the court looked to the Seventh Circuit opinion in S.E.C. v. Homa, 514 F.3d 674 (7th Cir. 2008) for guidance on the contours of Rule 65’s scope. Under Homa, a non-party can be bound by an injunction if it is “acting in concert” with the enjoined party or is “legally identified” with that party (as an agent or employee, for example).

Plaintiffs argued that Ripoff Report was acting in concert with the defamers. Plaintiffs looked to Ripoff Report’s terms of service, by which posters to the site give an exclusive copyright license to and agree to indemnify Ripoff Report. Those terms also state that Ripoff Report will not remove any content for any reason. Plaintiffs read this combination of terms to stand for some sort of arrangement whereby Ripoff Report agreed to be a safe haven for defamatory material.

The court rejected this argument, finding there was no evidence in the record that Ripoff Report intended to protect defamers. Moreover, there was no evidence that Ripoff Report had communicated with the defendants in any way since the entry of a permanent injunction, or otherwise worked to violate the earlier court order requiring defendants to remove the materials.


Grasping photo courtesy Flickr user Filmnut under a Creative Commons license.

Website drives off with Section 230 win over Chevy dealer

Nemet Chevrolet sued the website Consumeraffairs.com over some posts on that website which Nemet thought were defamatory and interfered with Nemet’s business expectancy. The website moved to dismiss the lawsuit, claiming that the Communications Decency Act at 47 U.S.C. 230 immunized the website from the lawsuit.

“But when we’re driving in my Malibu, it’s easy to get right next to you . . . .”

The court dismissed the action on Section 230 grounds and Nemet sought review of the dismissal with the Fourth Circuit Court of Appeals. The appellate court affirmed the dismissal.

Section 230 precludes tort plaintiffs from holding interactive computer services (like website operators) liable for the publication of information created and developed by others. Most courts (like the Fourth Circuit) consider Section 230’s protection to be a form of immunity for website operators from lawsuits arising over third party content.

But that immunity disappears if the content giving rise to the dispute was actually created or developed by the operator and not by a third party. In those circumstances the operator also becomes an information content provider. And there is no Section 230 immunity for information content providers.

That’s where Nemet steered its argument. It alleged that the website was a non-immune information content provider that created and developed the offending content.

Nemet raised two general points in its argument. It claimed that the website’s structure and design elicited unlawful content, and that the site operator contacted individual posters to assist in revisions to the content. It also claimed that the site operator simply fabricated a number of the offending posts.

Applying the pleading standards on which the Supreme Court recently elaborated in Ashcroft v. Iqbal, the court found Nemet’s claims that the site operator was actually an information content provider to be implausible.

As for the structure and design argument, the court differentiated the present facts from the situation in Fair Housing Council of San Fernando Valley v. Roommates.com. In Roommates.com, the court found that the website was designed to elicit information that would violate the Fair Housing Act. In this case, however, there was nothing unlawful in inviting commentary on goods or services, even if it was for the purposes of drumming up business for plaintiffs’ class action lawyers.

As for the other arguments, the court simply found that the allegations did not nudge the claims “across the line from conceivable to plausible.” The court found the argument that the website fabricated the posts particularly lacking in credibility, in that Nemet’s allegations relied mainly on an absence of information in its own records that would connect the posts to actual customers.

On balance, this decision from the Fourth Circuit shows that Section 230 immunity is as alive and well at the end of the “oughts” as it was a dozen years before when the Fourth Circuit became the first federal appellate court to consider the scope of the section’s immunity. That 1997 decision in the case of Zeran v. AOL remains a watershed pronouncement of Section 230’s immunity.

Congratulations to my friend and fellow blogger Jonathan Frieden on his impressive win in this case.

And Happy New Year to all the readers of Internet Cases. Thanks for your continued loyal support.

Chevy Malibu photo courtesy Flickr user bea-t under a Creative Commons license.
