Snapchat not liable for enabling teacher to groom minor student

A high school science teacher used Snapchat to send sexually explicit content to one of her students, whom she eventually assaulted. Authorities uncovered this abuse after the student overdosed on drugs. The student (as John Doe) sued the teacher, the school district and Snapchat. The lower court threw out the case against Snapchat on the basis of the federal Communications Decency Act at 47 USC § 230. The student sought review with the United States Court of Appeals for the Fifth Circuit. On appeal, the court affirmed.

Relying on Doe v. MySpace, Inc., 528 F.3d 413 (5th Cir. 2008), the court affirmed the lower court’s finding that the student’s claims against Snapchat were based on the teacher’s messages. Accordingly, Snapchat was immune from liability because this provision of federal law – under the doctrine of the MySpace case – provides “immunity … to Web-based service providers for all claims stemming from their publication of information created by third parties.”

Doe v. Snap, Inc., 2023 WL 4174061 (5th Cir. June 26, 2023)

Amazon gets Section 230 win over alleged defamatory product review

A customer ordered a scarf from plaintiffs’ Amazon store and left a review claiming the scarf was not a real Burberry. When neither the customer nor Amazon would take down the review, plaintiffs (the Amazon store owners) sued Amazon for defamation. The lower court dismissed on Section 230 grounds. Plaintiffs sought review with the Eleventh Circuit, which affirmed the dismissal in an unpublished opinion.

Section 230 (a provision in federal law found at 47 U.S.C. 230 which gives legal immunity to many online services) provides that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Because the lawsuit sought to treat Amazon (a provider of an interactive computer service) as the publisher of information (the product review) provided by another information content provider (customer), this immunity applied to protect Amazon from liability.

Specifically, the court held:

  • Amazon is an interactive computer service provider. Amazon’s website allows customers to view, purchase, and post reviews online, and therefore provides computer access by multiple users similar to an online message board or a website exchange system.
  • Amazon was not responsible for the development of the offending content. According to the complaint, the customer wrote the allegedly defamatory review, and therefore she functioned as the information content provider.
  • Roommates.com is not applicable, as the complaint here alleges that defendant wrote the review in its entirety.
  • Plaintiffs sought to hold Amazon liable for failing to take down the customer’s review, which is exactly the kind of claim that is immunized by Section 230: one that treats Amazon as the publisher of that information.

McCall v. Amazon, No. 22-11725 (11th Cir., June 12, 2023)

Section 230 immunity did not protect Omegle in product liability lawsuit

When plaintiff was 11 years old, she was connected to a man in his late thirties using Omegle (a “free online chat room that randomly pairs strangers from around the world for one-on-one chats”). Before the man was arrested some three years later, he forced plaintiff to send him pornographic videos of herself, made threats against her, and engaged in other inappropriate and unlawful conduct with plaintiff.

Plaintiff sued Omegle, alleging product liability and negligence relating to how Omegle was designed, and for failure to warn users of the site’s dangers. Omegle moved to dismiss these claims, claiming that it could not be liable because it was protected by 47 U.S.C. §230.

The court found that Section 230 did not apply because plaintiff’s claims did not seek to treat Omegle as the publisher or speaker of content. The court observed that to meet the obligation plaintiff sought to impose, Omegle would not have had to alter the content posted by its users. It would only have had to change its design and warnings.

And the court found that plaintiff’s claims did not rest on Omegle’s publication of third party content. In the same way that Snapchat did not avoid liability on the basis of Section 230 in Lemmon v. Snap, Inc., 995 F.3d 1085 (9th Cir. 2021), Omegle’s alleged liability was based on its “own acts,” namely, designing and operating the service in a way that connected sex offenders with minors, and failing to warn of such dangers.

A.M. v. Omegle.com, LLC, 2022 WL 2713721 (D. Oregon, July 13, 2022)

Can a person be liable for retweeting a defamatory tweet?

Under traditional principles of defamation law, one can be liable for repeating a defamatory statement to others. Does the same principle apply, however, on social media such as Twitter, where one can easily repeat the words of others via a retweet?

Hacking, tweet, retweet, lawsuit

A high school student hacked the server hosting the local middle school’s website and modified the web page of plaintiff, a teacher, to make it appear she was seeking inappropriate relationships. Another student tweeted a picture of the modified web page, and several people retweeted that picture.

The teacher sued the retweeters for defamation and reckless infliction of emotional distress. The court dismissed the case, holding that 47 USC §230 immunized defendants from liability as “users” of an interactive computer service. Plaintiff sought review with the New Hampshire Supreme Court. On appeal, the court affirmed the dismissal.

Who is a “user” under Section 230?

Section 230 provides, in relevant part, that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Importantly, the statute does not define the word “user.” The lower court held that defendant retweeters fit into the category of “user” under the statute and therefore could not be liable for their retweeting, because to impose such liability would require treating them as the publisher or speaker of information provided by another.

Looking primarily at the plain language of the statute, and guided by the 2006 California case of Barrett v. Rosenthal, the state supreme court found no basis in plaintiff’s arguments that defendants were not “users” under the statute. Plaintiff had argued that “user” should be interpreted to mean libraries, colleges, computer coffee shops and others who, “at the beginning of the internet” were primary access points for people. And she also argued that because Section 230 changed common law defamation, the statute must speak directly to immunizing individual users.

The court held that it was “evident” that Section 230 abrogated the common law of defamation as applied to individual users. “That individual users are immunized from claims of defamation for retweeting content they did not create is evident from the statutory language.”

Banaian v. Bascom, — A.3d —, 2022 WL 1482521 (N.H. May 11, 2022)

Omegle protected by Section 230 against claims for child pornography, sex trafficking and related claims

Omegle is a notorious website where you can be randomly placed in a chat room (using video, audio and text) with strangers on the internet. Back in March 2020, 11-year-old C.H. was using Omegle and got paired with a pedophile who intimidated her into disrobing on camera while he captured video. When C.H.’s parents found out, they sued Omegle alleging a number of theories:

  • possession of child pornography in violation of 18 U.S.C. § 2252A;
  • violation of the Federal Trafficking Victims Protection Act, 18 U.S.C. §§ 1591 and 1595;
  • violation of the Video Privacy Protection Act, 18 U.S.C. § 2710;
  • intrusion upon seclusion;
  • negligence;
  • intentional infliction of emotional distress;
  • ratification/vicarious liability; and
  • public nuisance.

The court granted Omegle’s motion to dismiss all eight claims, holding that each of the claims was barred by the immunity provided under 47 U.S.C. § 230. Citing to Doe v. Reddit, Inc., 2021 WL 5860904 (C.D. Cal. Oct. 7, 2021) and Roca Labs, Inc. v. Consumer Op. Corp., 140 F. Supp. 3d 1311 (M.D. Fla. 2015), the court observed that a defendant seeking to enjoy the immunity provided by Section 230 must establish that: (1) defendant is a service provider or user of an interactive computer service; (2) the causes of action treat defendant as a publisher or speaker of information; and (3) a different information content provider provided the information.

Omegle met Section 230’s definition of “interactive computer service”

The court found Omegle to be an interactive computer service provider. There were no factual allegations suggesting that Omegle authored, published or generated its own information that would warrant classifying it as an information content provider. Nor were there any factual allegations that Omegle materially contributed to the unlawfulness of the content at issue by developing or augmenting it. Omegle users were not required to provide or verify user information before being placed in a chatroom with another user. And some users, such as hackers and “cappers,” could circumvent other users’ anonymity using data they themselves collected from those other users.

Plaintiffs’ claims sought to treat Omegle as a publisher or speaker of information

The court found that each of the claims for possession of child pornography, sex trafficking, violation of the Video Privacy Protection Act, intrusion upon seclusion and intentional infliction of emotional distress sought redress for damages caused by the unknown pedophile’s conduct. Specifically, in the court’s view, no well-pleaded facts suggested that Omegle had actual knowledge of the sex trafficking venture involving C.H. or that Omegle actively participated in the venture.

As for the claims of intentional infliction of emotional distress, ratification/vicarious liability and public nuisance, the court similarly concluded that plaintiffs’ theories of liability were rooted in Omegle’s creation and maintenance of the site. The court observed that plaintiffs’ claims recognized the distinction between Omegle as an interactive computer service provider and its users, but nonetheless treated Omegle as the publisher responsible for the conduct at issue.

The court found this was corroborated by the “ratification/vicarious liability” claim, in which plaintiffs maintained that child sex trafficking was so pervasive on and known to Omegle that Omegle should be vicariously liable for the damages caused by such criminal activity. And, in the court’s view, through the negligence and public nuisance claims, plaintiffs alleged that Omegle knew or should have known about the dangers the platform posed to minor children, and that Omegle failed to ensure that minor children did not fall prey to child predators who may use the website.

The information at issue was provided by a third party

On this third element, the court found that Omegle merely provided the forum where the harmful conduct took place. The conduct giving rise to the harm – the video and the intimidation – was undertaken by the unknown pedophile, not Omegle.

Special note: Section 230 and the sex trafficking claim

Section 230(e)(5) limits an interactive computer service provider’s immunity in certain circumstances involving claims of sex trafficking. In this case, however, as in Doe v. Kik Interactive, Inc., 482 F. Supp. 3d 1242 (S.D. Fla. 2020), the court held that Omegle’s Section 230 immunity remained intact, because plaintiffs’ allegations were premised upon general, constructive knowledge of past sex trafficking incidents. The complaint failed to sufficiently allege Omegle’s actual knowledge of, or overt participation in, the underlying incidents between C.H. and the unknown pedophile.

M.H. and J.H. v. Omegle.com, LLC, 2022 WL 93575 (M.D. Fla. January 10, 2022)

Section 230 did not protect Snapchat from negligence liability in car crash lawsuit over Speed Filter

The tragic facts

Landen Brown was using Snapchat’s Speed Filter in 2017 when the car in which he was riding with two other young men crashed after reaching speeds above 120 miles per hour. The Speed Filter documented how fast the car was traveling. The crash killed Landen and the two other occupants.

The parents of two of the passengers sued Snap, Inc. (the purveyor of Snapchat), claiming that the app was negligently designed. The parents alleged, among other things, that Snap should have known that users believed they would be rewarded within the app for using the filter to record a speed above 100 miles per hour. The negligence claim was based in part on the notion that Snap did not remove or restrict access to Snapchat while traveling at dangerous speeds.

Immune, then not

The lower court dismissed the case, holding that 47 U.S.C. 230 protected Snapchat from liability. Plaintiffs sought review with the Ninth Circuit. On appeal, the court reversed, finding that Section 230 immunity did not apply to the negligent design claim.

Section 230’s role in the case

Section 230 provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In this case, the court held that the parents’ complaint did not seek to hold Snap liable for its conduct as a publisher or speaker. The negligent design lawsuit treated Snap as a products manufacturer, accusing it of negligently designing a product (Snapchat) with a defect (the interplay between Snapchat’s reward system and the Speed Filter). Thus, the duty that Snap allegedly violated sprang from its distinct capacity as a product designer. Simply stated, in the court’s view, Snap’s alleged duty in this case had nothing to do with its editing, monitoring, or removing of the content that its users generate through Snapchat.

Lemmon v. Snap, Inc. — F.3d —, 2021 WL 1743576 (9th Cir. May 4, 2021)

HuffPost protected by Section 230 in Carter Page defamation suit

Carter Page sued the publisher of the HuffPost over some 2016 articles about the Russia collusion matter that Page claimed were defamatory. These articles were written by “contributors” to the HuffPost, who, according to the court, “control their own work and post freely to the site”.

The court threw out the defamation claims concerning these articles, in part because it found that HuffPost was immune from suit thanks to Section 230. The court determined that HuffPost was not the “information content provider” since the content was written by these so-called contributors.

Page v. Oath Inc., 2021 WL 528472 (Superior Ct. Del., February 11, 2021)

Evan Brown is a technology and intellectual property attorney.

Section 230 protected Google in illegal gambling lawsuit over loot boxes

Plaintiffs sued Google claiming that loot boxes in games available through the Google Play store were illegal “slot machines or devices”. (Players who buy loot boxes get a randomized chance at receiving an item designed to enhance game play, such as a better weapon, a faster car, or a skin.) Plaintiffs characterized these loot boxes as a “gamble” because the player does not know what the loot box actually contains until it is opened. Defendant Google moved to dismiss the lawsuit on Section 230 grounds. The court granted the motion.

As relevant here, Section 230(c)(1) provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1). “No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.” 47 U.S.C. § 230(e)(3).

The court held that Google was immune under Section 230 because (a) it is an interactive computer service provider, (b) plaintiffs’ claims over the loot boxes sought to treat Google as the “publisher or speaker” of the games containing the allegedly illegal loot boxes, and (c) the games constituted information provided by third parties.

Of particular interest was the court’s treatment of plaintiffs’ argument that Section 230 only relates to “speech” and that Google’s provision of software did not fit into that category. Rejecting this argument, the court cited the case of Evans v. Hewlett-Packard Co., 2013 WL 4426359 (N.D. Cal. Aug. 15, 2013), in which the court used Section 230 to knock out Chubby Checker’s trademark and unfair competition claims against HP over a game HP made available.

Coffee v. Google, LLC, 2021 WL 493387 (N.D. Cal., February 10, 2021)

Section 230 immunity protected Twitter from claims it aided and abetted defamation

Twitter enjoyed Section 230 immunity for aiding and abetting defamation because plaintiffs’ claims on that point did not transform Twitter into a party that created or developed content.

An anonymous Twitter user posted some tweets that plaintiffs thought were defamatory. So plaintiffs sued Twitter for defamation after Twitter refused to take the tweets down. Twitter moved to dismiss the lawsuit. It argued that the Communications Decency Act (CDA) at 47 U.S.C. §230 barred the claim. The court agreed that Section 230 provided immunity to Twitter, and granted the motion to dismiss.

The court applied the Second Circuit’s test for Section 230 immunity as set out in La Liberte v. Reid, 966 F.3d 79 (2d Cir. 2020). Under this test, which parses Section 230’s language, plaintiffs’ claims failed because:

  • (1) Twitter was a provider of an interactive computer service,
  • (2) the claims were based on information provided by another information content provider, and
  • (3) the claims treated Twitter as the publisher or speaker of that information.

Twitter is a provider of an interactive computer service

The CDA defines an “interactive computer service” as “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server.” 47 U.S.C. § 230(f)(2). The court found that Twitter is an online platform that allows multiple users to access and share the content hosted on its servers. As such, it is an interactive computer service for purposes of the CDA.

Plaintiffs’ claims were based on information provided by another information content provider

The court also found that the claims against Twitter were based on information provided by another information content provider. The CDA defines an “information content provider” as “any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.” 47 U.S.C. § 230(f)(3). In this case, the court found that plaintiffs’ claims were based on information created or developed by another information content provider – the unknown Twitter user who posted the alleged defamatory content. Plaintiffs did not allege that Twitter played any role in the “creation or development” of the challenged tweets.

The claim treated Twitter as the publisher or speaker of the alleged defamatory information

The court gave careful analysis to this third prong of the test. Plaintiffs alleged that Twitter had “allowed and helped” the unknown Twitter user to defame plaintiffs by hosting the user’s tweets on its platform, or by refusing to remove those tweets when plaintiffs reported them. The court found that either theory would amount to holding Twitter liable as the “publisher or speaker” of “information provided by another information content provider.” The court observed that making information public and distributing it to interested parties are quintessential acts of publishing. Plaintiffs’ theory of liability would “eviscerate” Section 230 protection because it would hold Twitter liable simply for organizing and displaying content exclusively provided by third parties.

Similarly, the court concluded that holding Twitter liable for failing to remove the tweets plaintiffs found objectionable would also hold Twitter liable based on its role as a publisher of those tweets, because deciding whether or not to remove content falls squarely within the exercise of a publisher’s traditional role and is therefore subject to the CDA’s broad immunity.

The court found that plaintiffs’ suggestion that Twitter aided and abetted defamation by arranging and displaying others’ content on its platform failed to overcome Twitter’s immunity under the CDA. In the court’s view, such activity would be tantamount to holding Twitter responsible as the “developer” or “creator” of that content. But to hold Twitter liable as a developer or creator of third-party content – rather than as a publisher of it – plaintiffs would have to show that Twitter directly and materially contributed to what made the content itself unlawful.

Plaintiffs in this case did not allege that Twitter contributed to the defamatory content of the tweets at issue, and thus pled no basis upon which Twitter could be held liable as the creator or developer of those tweets. Accordingly, plaintiffs’ defamation claims against Twitter also satisfied the final requirement for CDA immunity: the claims sought to hold Twitter, an interactive computer service, liable as the publisher of information provided by another information content provider. Ultimately, Twitter had Section 230 immunity for aiding and abetting defamation.

Brikman v. Twitter, Inc., 2020 WL 5594637 (E.D.N.Y., September 17, 2020)


Section 230 did not protect online car sharing platform

Plaintiff Turo operates an online and mobile peer-to-peer car sharing marketplace. It allows car owners to rent their cars to other Turo users. It filed a declaratory judgment action against the City of Los Angeles, asking the court to determine the service was not being run in violation of applicable law.

The city filed counterclaims against Turo alleging (1) violation of local airport commerce regulations; (2) trespass; (3) aiding and abetting trespass; (4) unjust enrichment; and (5) unlawful and unfair business practices under California statute.

Should Section 230 apply?

Turo moved to dismiss. It argued that the City’s counterclaims sought to hold Turo liable for content published by users on Turo’s platform. In Turo’s mind, that should make it immune under Section 230. 

Section 230(c) provides, in relevant part, that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Turo argued that the city’s claims were barred by Section 230 because they sought to hold Turo liable for its users’ actions. Those users published rental listings and selected LAX as the designated pickup point for car rentals. According to Turo, because the content of the rental listings was provided by third-party users, and because the city’s claims sought to hold Turo liable as an interactive computer service responsible for that content, Section 230 should apply.

No immunity, based on what the platform did

The court rejected Turo’s arguments that Section 230 immunized Turo from liability arising from the city’s counterclaims.

It held that Section 230 did not provide immunity because the city sought to hold Turo liable for its role in facilitating online rental car transactions, not as the publisher or speaker of its users’ listings.

Citing to Force v. Facebook, Inc., 934 F.3d 53 (2d Cir. 2019), the court observed that “Section 230(c)(1) limits liability based on the function the defendant performs, not its identity.”

And the court compared the situation to the one in HomeAway.com, Inc. v. City of Santa Monica, 918 F.3d 676 (9th Cir. 2019). In that case, Section 230 did not immunize companies providing peer-to-peer home rental platform services from a government ordinance that required homeowners to register their properties with the city before listing them on a home sharing platform.

The court explained that Section 230 immunity did not apply because the government plaintiff did not seek to hold the platform companies liable for the content of the bookings posted by their users, but only for their actions of processing transactions for unregistered properties.

Turo v. City of Los Angeles, 2020 WL 3422262 (C.D. Cal., June 19, 2020)
