
DMCA anticircumvention case over copied YouTube videos moves forward

Defendant fired plaintiff over two videos plaintiff had posted on YouTube advocating for COVID-19 workplace protections. Around the time of the termination, the employer allegedly used a smartphone to record the videos in question while they were being played on a computer screen. Defendant then allegedly further copied, distributed and performed these videos in connection with legal proceedings involving plaintiff, without plaintiff’s consent.

DMCA anticircumvention

Plaintiff sued his former employer for copyright infringement. And because YouTube technology provides technological protection measures to prevent unauthorized copying of videos, plaintiff sued under Section 1201 of the Copyright Act – one of the anticircumvention provisions of the Digital Millennium Copyright Act (“DMCA”).

No fair use (yet)

Defendant argued its conduct was fair use of the videos. It asserted it submitted the videos in response to plaintiff’s OSHA complaint and in support of a no-trespass order. But the court refused to make a fair use determination at the motion to dismiss stage, since no facts supporting fair use could be found in the complaint.

DMCA circumvention

The court also allowed plaintiff’s DMCA circumvention claim to move forward.

Section 1201(a)(1)(A) of the DMCA states “[n]o person shall circumvent a technological measure that effectively controls access to a work protected under [the Copyright Act].” “Circumvent,” as used in §1201, “means to descramble a scrambled work, to decrypt an encrypted work, or otherwise to avoid, bypass, remove, deactivate, or impair a technological measure, without the authority of the copyright owner[.]” §1201(a)(3)(A). “[A] technological measure ‘effectively controls access to a work’ if the measure, in the ordinary course of its operation, requires the application of information, or a process or a treatment, with the authority of the copyright owner, to gain access to the work.” §1201(a)(3)(B).

Defendant argued the court should throw out the DMCA circumvention claim because plaintiff did not identify a specific technological measure that defendant allegedly circumvented. The court rejected that argument, however, saying that that much specificity was not required to survive a motion to dismiss.

Defendant also argued the DMCA claim should fail because the statute prohibits “circumvention,” which is different from copying, and the complained-of conduct was simply copying. The court viewed the law differently, however, citing Chamberlain Group, Inc. v. Skylink Technologies, Inc., 381 F.3d 1178 (Fed. Cir. 2004), where the court held that while infringement and circumvention are distinct, an act of infringement can also involve an act of unauthorized circumvention.

The facts of this case may cause one to consider cases such as R. Christopher Goodwin & Assoc., Inc. v. Search, Inc., 2019 WL 5576834 (E.D. La. October 29, 2019) and wonder whether circumvention has really occurred. It does not appear defendant in this case did anything to disable measures that would have prevented it from viewing the videos. Presumably the streamed videos were available to anyone who could visit YouTube. And the act of creating the copies did not even touch on any of the protection measures YouTube put in place. There was no cracking or descrambling – just the capturing of video as it passed through the air (at that moment being analog) from the computer screen to the camera of the smartphone. Perhaps there is an analog hole defense here?

Edland v. Basin Electric Power Cooperative, 2021 WL 3080225 (D.S.D. July 21, 2021)

No contract formed via URL to terms and conditions in hard copy advertisement

Online terms of service found at a URL printed in a hard-copy advertisement were not enforceable.


Plaintiff visited a Subway restaurant. One of the Subway employees referred plaintiff to an in-store, hard-copy advertisement. On the advertisement, Subway offered to send special offers to plaintiff if she texted a keyword to a short code. Plaintiff sent the text message to Subway, and Subway began responding, including by sending her, via text message, a hyperlink to an electronic coupon.

Later, plaintiff wanted to stop receiving the messages, so she requested that the messages cease. But they kept arriving. Plaintiff then sued under the Telephone Consumer Protection Act (“TCPA”). Subway moved to compel arbitration, arguing that a contract was formed because the printed in-store advertisement that contained the keyword and short code to text included a reference to and URL for “terms and conditions”. Those terms and conditions required plaintiff to settle the dispute by arbitration.

The lower court denied the motion to compel arbitration. Subway sought review with the Second Circuit Court of Appeals. On appeal, the court affirmed the denial of the motion to compel arbitration, finding that plaintiff was not bound by the terms and conditions.

The appellate court held that plaintiff was not on notice of the terms and conditions, which contained the arbitration clause, because Subway failed to demonstrate that such terms and conditions would be clear and conspicuous to a reasonable person in plaintiff’s position. More specifically, the court held that the following facts showed plaintiff was not on reasonable notice of the terms:

  • Subway failed to provide evidence regarding the size of the advertisement at issue, or the print size contained within that advertisement;
  • the reference to “terms and conditions” was buried on the advertisement in a paragraph that was printed in significantly smaller font relative to the other text on the advertisement, and the reference itself was surrounded by a substantial amount of unrelated information;
  • the advertisement only vaguely referenced “terms and conditions,” and did not state that a consumer would be agreeing to those terms if she sent a text message to Subway’s short code, nor did it otherwise direct the consumer to such terms;
  • access to the terms and conditions on the Subway website required plaintiff to type the URL provided on the hard-copy print advertisement into an internet browser on her cell phone or some other device with internet browsing capabilities; and
  • once a user arrived at the Subway website, the heading stated that the page contained “terms of use for this website,” thus potentially suggesting to a reasonable person (searching for conditions of the promotional offer) that the website did not contain any terms or conditions beyond those relevant to the use of the website.

This combination of barriers led the court to conclude that the terms and conditions were not reasonably conspicuous under the totality of the circumstances and, thus, a reasonable person would not realize she was being bound to such terms and conditions by texting Subway in order to begin receiving promotional offers.

Soliman v. Subway Franchisee Advertising Fund Trust, Ltd., — F.3d —, 2021 WL 2324549 (2nd Cir. June 8, 2021)


What must social media platforms do to comply with Florida’s Senate Bill 7072?

The media has been covering Florida’s new law (Senate Bill 7072) targeting social media platforms, which Governor DeSantis signed today. The law is relatively complex and imposes a number of new obligations on social media platforms. The law is likely to face First Amendment challenges. And then there’s the Section 230 problem. In any event, through all the political noise surrounding the law’s passage, it is worth taking a careful look at what the statute actually says.

Findings

The bill starts with some findings of the legislature that give context to what the law is about. There are some interesting findings that reflect an evolved view of the internet and social media as being critical spaces for information exchange, in the nature of public utilities and common carriers:

  • Social media platforms have transformed into the new public town square.
  • Social media platforms have become as important for conveying public opinion as public utilities are for supporting modern society.
  • Social media platforms hold a unique place in preserving first amendment protections for all Floridians and should be treated similarly to common carriers.
  • The state has a substantial interest in protecting its residents from inconsistent and unfair actions by social media platforms.

Important definitions

The statute gives some precise and interesting definitions to important terms:

A “social media platform” is any information service, system, Internet search engine, or access software provider that (1) provides or enables computer access by multiple users to a computer server, including an Internet platform or a social media site; (2) operates as a sole proprietorship, partnership, limited liability company, corporation, association, or other legal entity; (3) does business in Florida; and (4) has either annual gross revenues in excess of $100 million or at least 100 million monthly individual platform participants globally.

Interestingly, the statute appears to clarify that Disney World and other major players do not count as “social media platforms”:

The term does not include any information service, system, Internet search engine, or access software provider operated by a company that owns and operates a theme park or entertainment complex as defined elsewhere in Florida law.

Some other definitions:

To “censor” is for a social media platform to delete, regulate, restrict, edit, alter, inhibit the publication or republication of, suspend a right to post, remove, or post an addendum to any content or material posted by a user. The term also includes actions to inhibit the ability of a user to be viewable by or to interact with another user of the social media platform.

“Deplatforming” means the action or practice by a social media platform to permanently delete or ban a user or to temporarily delete or ban a user from the social media platform for more than 14 days.

A “shadow ban” is action by a social media platform, through any means, whether the action is determined by a natural person or an algorithm, to limit or eliminate the exposure of a user or content or material posted by a user to other users of the social media platform. This term includes acts of shadow banning by a social media platform which are not readily apparent to a user.

“Post-prioritization” means action by a social media platform to place, feature, or prioritize certain content or material ahead of, below, or in a more or less prominent position than others in a newsfeed, a feed, a view, or in search results. The term does not include post-prioritization of content and material of a third party, including other users, based on payments by that third party to the social media platform.

Protections for political candidates

The first substantive part of the statute seeks to protect political candidates from being taken offline:

A social media platform may not willfully deplatform a candidate for office who is known by the social media platform to be a candidate, beginning on the date of qualification and ending on the date of the election or the date the candidate ceases to be a candidate. A social media platform must provide each user a method by which the user may be identified as a qualified candidate and which provides sufficient information to allow the social media platform to confirm the user’s qualification by reviewing the website of the Division of Elections or the website of the local supervisor of elections.

If the Florida Elections Commission finds that a social media platform violates the above provision, it can fine the platform $250,000 per day for a candidate for statewide office, and $25,000 per day for a candidate for other offices.

Social media platforms’ required activity

The statute spells out certain things that social media platforms must and must not do. For example, a social media platform:

  • Must publish the standards, including detailed definitions, it uses or has used for determining how to censor, deplatform, and shadow ban.
  • Must apply censorship, deplatforming, and shadow banning standards in a consistent manner among its users on the platform.
  • May not censor or shadow ban a user’s content or material or deplatform a user from the social media platform without notifying the user who posted or attempted to post the content or material (unless the content is obscene). (This notice must be in writing and must be delivered via electronic mail or direct electronic notification to the user within 7 days after the censoring action. It must include a thorough rationale explaining the reason that the social media platform censored the user. It must also include a precise and thorough explanation of how the social media platform became aware of the censored content or material, including a thorough explanation of the algorithms used, if any, to identify or flag the user’s content or material as objectionable.)
  • Must, if a user is deplatformed, allow that user to access or retrieve all of the user’s information, content, material, and data for at least 60 days after the user receives the required notice.
  • Must provide a mechanism that allows a user to request, and must provide upon request, the number of other individual platform participants who were provided or shown the user’s content or posts.
  • Must categorize algorithms used for post-prioritization and shadow banning, and must allow a user to opt out of post-prioritization and shadow banning algorithm categories to allow sequential or chronological posts and content.
  • Must provide users with an annual notice on the use of algorithms for post-prioritization and shadow banning and reoffer annually the opt-out opportunity provided in the statute. 
  • May not apply or use post-prioritization or shadow banning algorithms for content and material posted by or about a user who is known by the social media platform to be a political candidate as defined under the law, beginning on the date of qualification and ending on the date of the election or the date the candidate ceases to be a candidate.
  • Must provide each user a method by which the user may be identified as a qualified candidate and which provides sufficient information to allow the social media platform to confirm the user’s qualification by reviewing the website of Florida’s Division of Elections or the website of the local supervisor of elections.
  • May not take any action to censor, deplatform, or shadow ban a journalistic enterprise based on the content of its publication or broadcast (unless the content is obscene as defined under Florida law). 

What happens if there is a violation?

A social media platform that violates the statute could face legal action from the government, or from private citizens who sue under the statute. 

Government action:

If the Florida Department of Legal Affairs, by its own inquiry or as a result of a complaint, suspects that a social media platform’s violation of the statute is imminent, occurring, or has occurred, it may investigate the suspected violation. Based on its investigation, the department may bring a civil or administrative action. The department can send subpoenas to learn about the algorithms related to any alleged violation.

The ability for a private individual to bring an action under the statute is not as broad as the government’s ability to enforce the law. A private individual can only sue if:

  • the social media platform fails to apply censorship, deplatforming, and shadow banning standards in a consistent manner among its users on the platform, or
  • the social media platform censors or shadow bans a user’s content or material or deplatforms the user from the social media platform without the required notice.

Remedies

The court may award the following remedies to the user who proves a violation of the statute:

  • Up to $100,000 in statutory damages per proven claim.
  • Actual damages.
  • If aggravating factors are present, punitive damages.
  • Other forms of equitable relief, including injunctive relief.
  • If the user was deplatformed in circumstances where the social media platform failed to apply censorship, deplatforming, and shadow banning standards in a consistent manner among its users on the platform, the user can recover its costs and reasonable attorney fees.

When does the law take effect?

July 1, 2021. 

 

Does Section 230 apply to claims under the Fair Credit Reporting Act?

Plaintiffs sued defendants, claiming that defendants violated the Fair Credit Reporting Act (FCRA) by including inaccurate criminal information on background check reports defendants produced and sold. Defendants moved for a judgment on the pleadings (a form of a motion to dismiss), arguing that 47 U.S.C. §230 provided immunity to defendants. Specifically, defendants argued that they were an interactive computer service, and that plaintiffs’ claims treated defendants as the publisher of third party content. The court agreed with defendants and granted defendants’ motion.


Defendants’ website

Defendants operate the website found at publicdata.com. The website allows customers to search through various databases available via the site. Defendants can pull this information into a report. Plaintiffs asserted that defendants purchase, collect, and assemble public record information into reports, which employers then buy from defendants via the website.

The FCRA claims

The FCRA places a number of requirements on “consumer reporting agencies,” and plaintiffs asserted that defendants did not meet these requirements. Each of the three plaintiffs – who wish to represent an entire class of plaintiffs – claims that reports obtained by prospective employers contained inaccurate information about criminal charges against plaintiffs, and resulted in plaintiffs not getting jobs they sought.

Section 230’s exceptions did not apply

The court began by noting that none of Section 230’s exceptions (i.e., situations where immunity does not apply) precluded immunity from applying to an FCRA claim. Congress enumerated five exceptions to immunity, expressly stating that Section 230 cannot have any effect on any “[f]ederal criminal statute,” “intellectual property law,” “[s]tate law that is consistent with this section,” “the Electronic Communications Privacy Act,” or “sex trafficking law.” Applying the canon of statutory construction of expressio unius est exclusio alterius, the court determined that where Congress explicitly enumerates certain exceptions to a general prohibition, additional exceptions are not to be implied, in the absence of evidence of a contrary legislative intent. The court held that since Congress plainly chose five exceptions to Section 230 immunity, and did not include the FCRA among them, by its plain language, Section 230 can apply to FCRA claims.

Immunity applied

Citing to the well-known Fourth Circuit case of Nemet Chevrolet, Ltd. v. Consumeraffairs.com, Inc., 591 F.3d 250 (4th Cir. 2009), the court looked to the three requirements to successfully assert § 230 immunity: (1) the defendant is an interactive computer service; (2) the content at issue is created by another information content provider; and (3) the defendant is not alleged to be the creator of the content. In this case, all three elements were met.

Finding that the defendants’ website was an interactive computer service, the court observed that Section 230 immunity covers information that the defendant does not create as an information content provider, and that such immunity is not lost when the interactive service provider pays a third party for the content at issue, and essentially becomes a “distributor” of the content.

On the second element, the court found that plaintiffs clearly stated that defendants did not create the content, but that they obtained it “from vendors, state agencies, and courthouses.” It was those entities that created the records defendants uploaded to their website and collected into reports.

And in the court’s mind there was no doubt that defendants did not create the content. It found that plaintiffs admitted in their complaint that the convictions and other information included on defendants’ reports were derived from other information content providers such as courts and other repositories of this information. Although plaintiffs alleged that defendants manipulated and sorted the content in a background check report, there was no explicit allegation that defendants materially contributed to or created the content themselves.

Henderson v. The Source For Public Data, 2021 WL 2003550 (E.D. Va. May 19, 2021)

Murdered Uber passenger’s mom can keep her case in court and out of arbitration

An Uber driver murdered plaintiff’s son. So plaintiff – the Uber user’s mom – sued Uber for wrongful death. The lower court threw out the case, saying that the Uber terms and conditions required the matter to go to arbitration. Plaintiff sought review with the Georgia Court of Appeals. On review, the court reversed and sent the case back to the lower court.

The appellate court found that it was improper to dismiss the case because it was not clear that plaintiff’s son – the one killed by the Uber driver – actually agreed to the Uber terms and conditions that contained the provision requiring arbitration.

First, there was a dispute as to whether he even saw the link to the terms and conditions when he signed up for Uber in 2016. That’s because he was using an Android phone, and plaintiff alleged the on-screen keyboard within the app may have covered up the link to the terms and conditions.

Second, the court noted that even though Uber submitted evidence it emailed updated terms and conditions to plaintiff’s son, and that he continued using Uber thereafter (thereby binding him to the terms), it was unclear whether the email was ever sent to plaintiff’s son. If the customer never saw those terms, they would not apply, and therefore arbitration would not be proper.

Thornton v. Uber Technologies, Inc., 2021 WL 1960199 (Ct. App. Ga. May 17, 2021)

Section 230 did not protect Snapchat from negligence liability in car crash lawsuit over Speed Filter

The tragic facts

Landen Brown was using Snapchat’s Speed Filter in 2017 when the car in which he was riding with two other young men crashed after reaching speeds above 120 miles per hour. The Speed Filter documented how fast the car was traveling. The crash killed Landen and the other two occupants.

Section 230

The parents of two of the passengers sued Snap, Inc. (the purveyor of Snapchat), claiming that the app was negligently designed. The parents alleged, among other things, that Snap should have known that users believed they would be rewarded within the app for using the filter to record a speed above 100 miles per hour. The negligence claim was based in part on the notion that Snap did not remove or restrict access to Snapchat while users were traveling at dangerous speeds.

Immune, then not

The lower court dismissed the case, holding that 47 U.S.C. 230 protected Snapchat from liability. Plaintiffs sought review with the Ninth Circuit. On appeal, the court reversed, finding that Section 230 immunity did not apply to the negligent design claim.

Section 230’s role in the case

Section 230 provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In this case, the court held that the parents’ complaint did not seek to hold Snap liable for its conduct as a publisher or speaker. The negligent design lawsuit treated Snap as a products manufacturer, accusing it of negligently designing a product (Snapchat) with a defect (the interplay between Snapchat’s reward system and the Speed Filter). Thus, the duty that Snap allegedly violated sprang from its distinct capacity as a product designer. Simply stated, in the court’s view, Snap’s alleged duty in this case had nothing to do with its editing, monitoring, or removing of the content that its users generate through Snapchat.

Lemmon v. Snap, Inc. — F.3d —, 2021 WL 1743576 (9th Cir. May 4, 2021)

Money for pain and suffering because your email was hacked?


Plaintiff and defendant worked together doing real estate appraisals. Defendant accessed plaintiff’s email account without authorization and was later found liable for violating the federal Stored Communications Act. When it came time to assess damages, plaintiff asked for $150,000 for the pain and suffering he endured because of the email access. He alleged that he suffered mental decline, began drinking a lot and had troubles with his marriage.

The court was sympathetic to plaintiff’s “very real difficulties” but found that the amount he was seeking bore “an outsized relationship to the actual offense.” From the court’s opinion:

[Defendant], on one occasion, committed a targeted SCA offense. [Defendant] searched solely for emails related to [plaintiff’s] disparagement of [defendant] and printed four of them. Immediately, [plaintiff] learned of the breach and quickly put security measures in place to prevent further unauthorized access. Because the offense was objectively narrow in scope, the Court credits that [plaintiff] suffered a brief period of emotional harm related to the offense. The original intrusion was startling and, no doubt, produced some anxiety during the time it took [plaintiff] to protect his privacy by installing computer security software, changing passwords, and contacting his internet service provider who assured him they had taken “care of everything.”

The court ended up awarding plaintiff $1,000 for his pain and suffering tied to the breach.

Skapinetz v. CoesterVMS, Inc., 2021 WL 1634712 (D.Md. April 27, 2021)

HuffPost protected by Section 230 in Carter Page defamation suit


Carter Page sued the publisher of the HuffPost over some 2016 articles about the Russia collusion matter that Page claimed were defamatory. These articles were written by “contributors” to the HuffPost, who, according to the court, “control their own work and post freely to the site”.

The court threw out the defamation claims concerning these articles, in part because it found that HuffPost was immune from suit thanks to Section 230. The court determined that HuffPost was not the “information content provider” since the content was written by these so-called contributors.

Page v. Oath Inc., 2021 WL 528472 (Superior Ct. Del., February 11, 2021)

Evan Brown is a technology and intellectual property attorney.

What’s going on legally with Jeep pulling the Bruce Springsteen ad?

Morals clauses in talent agreements can fuel cancel culture.

Jeep featured Bruce Springsteen in an ad that aired during Sunday’s Super Bowl. Since then, news broke that Springsteen had been arrested almost three months prior for drunk driving. So Jeep pulled further use of the ad.

This scenario shines a light on a key provision in the contracts that celebrities and brands typically sign. An agreement of this sort will contain a “morals clause”. Here is the language of a typical clause (this is just an example of such a clause – not the one in the Jeep/Springsteen agreement):

Company will have the right to terminate this Agreement for cause, which includes, without limitation, . . . commission of any act (in the past or present) which degrades Talent, Company or the Products or brings Talent, or Company or the Products into public disrepute, contempt, scandal or ridicule. Upon termination for cause, Company shall have no further obligation to Talent (including, but not limited to, any payment obligations).

Companies want these provisions for obvious reasons – if the face of the company comes under public scrutiny for any bad reason, the company needs a method to part ways. Talent with more negotiating power may be able to narrow the scope of the circumstances in which the company can terminate the agreement. For example, talent could insist that the clause be triggered only by an actual conviction of a serious crime.

One problem, however, particularly for talent, is how broadly morals clauses can be written. The example clause above is broad and vague. And note how the language in this example pulls in past conduct as well (old tweets, anyone?). Given the polarized character of modern public discourse, just about everything done in public is subject to contempt, scandal or ridicule by at least someone. These clauses provide the means for the commercial side of cancel culture to flourish.

Evan Brown is an intellectual property and technology attorney in Chicago.

Section 230 protected Google in illegal gambling lawsuit over loot boxes


Plaintiffs sued Google claiming that loot boxes in games available through the Google Play store were illegal “slot machines or devices”. (Players who buy loot boxes get a randomized chance at receiving an item designed to enhance game play, such as a better weapon, faster car, or skin.) Plaintiffs characterized these loot boxes as a “gamble” because the player does not know what the loot box actually contains until it is opened. Defendant Google moved to dismiss the lawsuit on Section 230 grounds. The court granted the motion.

As relevant here, Section 230(c)(1) provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” 47 U.S.C. § 230(c)(1). “No cause of action may be brought and no liability may be imposed under any State or local law that is inconsistent with this section.” 47 U.S.C. § 230(e)(3).

The court held that Google was immune under Section 230 because (a) it is an interactive computer service provider, (b) plaintiffs’ claims over the loot boxes sought to treat Google as the “publisher or speaker” of the games containing the allegedly illegal loot boxes, and (c) the games constituted information provided by third parties.

Of particular interest was the court’s treatment of plaintiffs’ argument that Section 230 only relates to “speech” and that Google’s provision of software did not fit into that category. Rejecting this argument, the court cited the case of Evans v. Hewlett-Packard Co., 2013 WL 4426359 (N.D. Cal. Aug. 15, 2013), in which the court used Section 230 to knock out Chubby Checker’s trademark and unfair competition claims against HP over a game HP made available.

Coffee v. Google, LLC, 2021 WL 493387 (N.D. Cal., February 10, 2021)

Evan Brown is a technology and intellectual property attorney in Chicago.
