Suit under DMCA for concealing copyright management information failed because plaintiff did not properly allege defendants’ intent

Plaintiff sued defendants under the provision of the Digital Millennium Copyright Act (DMCA) (17 U.S.C. § 1202(a)) that, among other things, prohibits a person, acting knowingly and with the intent to induce, enable, facilitate, or conceal infringement, from providing copyright management information that is false.

Defendants moved to dismiss for failure to state a claim. The district court granted the motion, and plaintiff sought review with the Second Circuit. On appeal, the court affirmed the dismissal.

It noted that in order to plead a violation of Section 1202(a), a plaintiff must plausibly allege that a defendant knowingly provided false copyright management information and that the defendant did so with the intent to induce, enable, facilitate, or conceal an infringement. This is a double scienter requirement.

In this case, the court found that plaintiff’s DMCA claim merely alleged that one of the defendants was identified on a disputed work (a book) as its author, and that she was listed in the notice of copyright as its owner, which plaintiff alleged was false.

The court held that these facts did not amount to a plausible allegation that defendants knew such copyright information was false, or even that it was false. Moreover, plaintiff had failed to adequately plead that defendants intended to conceal valid copyright management information.

Krechmer v. Tantaros, 2018 WL 4044048 (2d Cir. August 24, 2018)

Is an online marketplace liable for injuries caused by defective products?

The recent case of Eberhart v. Amazon.com, Inc. addressed whether a man could recover from Amazon for severe injuries to his thumb, suffered when the glass of a coffeemaker he purchased on Amazon shattered. The court held that Amazon was not liable.

No strict liability

In most states, when a product injures someone, the injured party can seek to hold anyone within the product’s distribution chain “strictly liable” for the injuries. That means a party in the chain is potentially liable regardless of whether it sold the product directly to the consumer, and regardless of whether the injury was foreseeable or resulted from a lack of due care.

The court concluded that Amazon was not within the coffeemaker’s chain of distribution such that Amazon could be considered a “distributor” subject to strict liability. Amazon never took title to the coffeemaker. Moreover, Amazon was better characterized as a provider of services. And finally, many other courts that had considered the question had concluded that Amazon was not strictly liable for defective products sold on its marketplace.

No negligence, breach of warranty or misrepresentation

Plaintiff’s other legal theories against Amazon failed as well. As for his negligence claims, the court held Amazon owed no duty to plaintiff because Amazon did not manufacture, sell, or otherwise distribute the allegedly defective coffeemaker to him. And as for claims sounding in breach of express warranty and misrepresentation, the court held that because Amazon did not make any statement about the coffeemaker (the seller generated that content), it could not be held liable.

Eberhart v. Amazon.com, Inc., 2018 WL 4080348 (S.D.N.Y., August 27, 2018)

About the Author: Evan Brown is a Chicago technology and intellectual property attorney. Call Evan at (630) 362-7237, send email to ebrown [at] internetcases.com, or follow him on Twitter @internetcases. Read Evan’s other blog, UDRP Tracker, for information about domain name disputes.

Police not required to publicly disclose how they monitor social media accounts in investigations

In the same week that news broke about how Amazon is assisting police departments with facial recognition technology, here is a decision from a Pennsylvania court holding that police do not have to turn over details to the public about how they monitor social media accounts in investigations.

The ACLU sought, under Pennsylvania’s Right-to-Know Law, a copy of the Pennsylvania State Police’s (PSP) policies and procedures for personnel when using social media monitoring software. The PSP produced a redacted copy, and after the ACLU challenged the redaction, the state’s Office of Open Records ordered that the full document be provided. The PSP sought review in state court, and that court reversed the Office of Open Records’ order. The court found that disclosure of the record would be reasonably likely to threaten public safety or a public protection activity.

The court found in particular that disclosure would: (i) allow individuals to know when the PSP can monitor their activities using “open sources” and allow them to conceal their activities; (ii) expose the specific investigative method used; (iii) provide criminals with tactics the PSP uses when conducting undercover investigations; (iv) reveal how the PSP conducts its investigations; and (v) provide insight into how the PSP conducts an investigation and what sources and methods it would use. Additionally, the court credited the PSP’s affidavit, which explained that disclosure would jeopardize the PSP’s ability to hire suitable candidates – troopers in particular – because it would reveal the specific information that may be reviewed as part of a background check to determine whether candidates are suitable for employment.

Pennsylvania State Police v. American Civil Liberties Union of Pennsylvania, 2018 WL 2272597 (Commonwealth Court of Pennsylvania, May 18, 2018)

Court takes into consideration defendant’s privacy in BitTorrent copyright infringement case

Frequent copyright plaintiff Strike 3 Holdings filed a motion with the U.S. District Court for the District of Minnesota seeking an order allowing Strike 3 to send a subpoena to Comcast to learn who owns the account allegedly used to infringe copyright. The Federal Rules of Civil Procedure created a bootstrapping problem for Strike 3 – or, as the court called it, a Catch-22: it was not able to confer with the unknown Doe defendant as required by Rule 26(f) because it could not identify the defendant, but it could not identify the defendant without discovery from Comcast.

The court granted Strike 3’s motion for leave to take early discovery, finding that good cause existed for granting the request, and noting:

  • Strike 3 had stated an actionable claim for copyright infringement,
  • The discovery request was specific,
  • There were no alternative means to ascertain defendant’s name and address,
  • Strike 3 had to know defendant’s name and address in order to serve the summons and complaint, and
  • Defendant’s expectation of privacy in his or her name and address was outweighed by Strike 3’s right to use the judicial process to pursue a plausible claim of copyright infringement.

On the last point, the court observed that the privacy interest was outweighed especially given that the court could craft a limited protective order under Federal Rule of Civil Procedure 26(c) to protect an innocent ISP subscriber and to account for the sensitive and personal nature of the subject matter of the lawsuit.

Strike 3 Holdings, LLC v. Doe, 2018 WL 2278110 (D. Minn. May 18, 2018)

Online marketer’s misappropriation claims against publicly traded former suitor move forward

Plaintiff – a small online marketing company – sued a large, publicly traded competitor for copyright infringement, misappropriation of trade secrets, deceptive and unfair practices, and breach of contract. The parties had previously signed a nondisclosure agreement and an agreement under which plaintiff would provide defendant with access to plaintiff’s technology used to monitor the scope of companies’ online presence and the accuracy of information appearing in search engines. The parties had also engaged in discussions about defendant acquiring plaintiff. But after the negotiations broke off, plaintiff discovered that defendant appeared to have appropriated plaintiff’s technology (including copyright-protected materials) and incorporated it into its own product offerings.

The lower court entered a preliminary injunction against defendant, barring it from offering the allegedly infringing and misappropriating technology. Defendant sought review of the entry of the preliminary injunction with the U.S. Court of Appeals for the Eleventh Circuit. The appellate court affirmed the order.

The appellate court rejected defendant’s argument that the lower court had not described specifically enough those trade secrets of plaintiff that defendant had allegedly misappropriated. It also rejected defendant’s arguments that plaintiff’s delay in bringing suit undermined its argument of irreparable harm, that plaintiff failed to show that it was likely to succeed on the merits of its underlying claims, and that the district court erred in weighing the balance of harm and in considering the impact on the public interest.

Advice Interactive Group, LLC v. Web.com Group, Inc., 2018 WL 2246603 (11th Cir., May 16, 2018)

No fraud claim against VRBO over bogus listing because website terms did not guarantee prescreening

Plaintiff sued the website VRBO for fraud after he used the website to find a purported vacation rental property that he paid for and later learned was nonexistent. He specifically claimed that the website’s “Basic Rental Guarantee” misled him into believing that VRBO pre-screened the listings that third parties post to the site. The lower court granted VRBO’s summary judgment motion. Plaintiff sought review with the First Circuit Court of Appeals. On appeal, the court affirmed summary judgment, finding the guarantee was not fraudulent.

The court found the Basic Rental Guarantee was not fraudulent for a number of reasons. The document simply established a process for obtaining a refund (of up to $1,000) that involved satisfying certain conditions (e.g., having paid using a certain method, being denied a refund by the property owner, and making a claim to VRBO within a certain time). The document gave no indication that VRBO conducted any pre-screening of listed properties; instead, it mentioned an investigation that would be conducted only in the event a claim of “Internet Fraud” (as VRBO defined it) was made. And VRBO’s terms and conditions expressly stated that VRBO had no duty to pre-screen content on the website, and also disclaimed liability arising from any inaccurate listings.

Finally, the court found that the guarantee did not, under a Massachusetts statute, constitute a representation or warranty about the accuracy of the listings. Among other things, the document clearly and conspicuously disclosed the nature and extent of the guarantee, its duration, and what the guarantor undertook to do.

Hiam v. Homeaway.com, 887 F.3d 542 (1st Cir., April 12, 2018)

No privacy violation for disclosing information otherwise available on member-only website

Plaintiff sued several defendants in connection with her past work as a government employee. She sought to amend her pleadings to add claims for violation of the Fourth Amendment and the federal Stored Communications Act. She claimed that defendants wrongfully disclosed private medical information about her. The court denied her motion to amend the pleadings to add the Fourth Amendment and Stored Communications Act claims because such amendments would have been futile.

Specifically, the court found there to be no violation because she had no reasonable expectation of privacy in the information allegedly disclosed. She had made that information available on a website. Though viewing the information required signing up for an account, plaintiff had not set up the website to make the information available only to those she invited to view it. The court relied on several cases from earlier in the decade that addressed the privacy of social media content, among them Rosario v. Clark Cty. Sch. Dist., 2013 WL 3679375 (D. Nev. July 3, 2013), which held that one has no reasonable expectation of privacy in his or her tweets, even on a private account, because such tweets still amount to the dissemination of information to the public.

Burke v. New Mexico, 2018 WL 2134030 (D.N.M. May 9, 2018)

Can YouTube be sued for censorship? A court weighs in.

Prager University sued Google LLC and YouTube, LLC, alleging that defendants discriminated against plaintiff’s conservative political viewpoints by restricting its videos on YouTube. Plaintiff asked the court to issue a preliminary injunction to prevent defendants from continuing these practices and to restore unrestricted access to plaintiff’s videos on the platform. Plaintiff also sought damages for alleged violations of free speech rights and for other claims.

The court decided in favor of defendants. It dismissed plaintiff’s federal claims under the First Amendment and the Lanham Act and declined to exercise jurisdiction over the state law claims. Additionally, the court denied plaintiff’s motion for a preliminary injunction.

The court ruled that defendants, as private entities, were not state actors and therefore not bound by the First Amendment. It found that YouTube’s platform, even if widely used for public discourse, does not transform it into a public forum subject to constitutional free speech protections. Regarding the Lanham Act, the court concluded that statements about YouTube being a platform for free expression were non-actionable “puffery” and not specific enough to be considered false advertising.

In dismissing plaintiff’s state law claims, the court noted that they raised complex issues of California law better suited for state courts. This decision left open the possibility for plaintiff to amend its complaint or pursue claims in state court.

Three reasons why this case matters:

  • Clarification of First Amendment Limits: The ruling reinforces that constitutional free speech protections apply only to government actors, not private companies.
  • Role of Platforms in Content Moderation: The case highlights ongoing debates about the responsibilities of tech companies in regulating content and their impact on public discourse.
  • Defining Puffery vs. Advertising: The court’s finding that statements about neutrality were mere puffery provides insight into how courts assess claims of false advertising.

Prager University v. Google LLC, 2018 WL 1471939 (N.D. Cal. March 26, 2018)

Section 230 protected Google in lawsuit over blog post

Defendant used Google’s Blogger service to write a post – about plaintiffs’ business practices – that plaintiffs found objectionable. So plaintiffs sued Google in federal court for defamation, tortious interference with a business relationship, and intentional infliction of emotional distress. The lower court dismissed the case on grounds that the Communications Decency Act (at 47 U.S.C. §230) immunized Google from liability for the publication of third party content.

Plaintiffs sought review with the U.S. Court of Appeals for the District of Columbia Circuit. On appeal, the court affirmed the dismissal. Applying a three-part test the court developed in Klayman v. Zuckerberg, 753 F.3d 1354 (D.C. Cir. 2014) (which in turn applied analysis from the leading case of Zeran v. America Online, Inc., 129 F.3d 327 (4th Cir. 1997)), the court held that Section 230 entitled Google to immunity because: (1) Google was a “provider or user of an interactive computer service,” (2) the relevant blog post contained “information provided by another information content provider,” and (3) the complaint sought to hold Google liable as “the publisher or speaker” of the blog post.

The court rejected plaintiffs’ argument that in establishing and enforcing its Blogger Content Policy, Google influenced and thereby created the content it published. It held that Google’s role was strictly one of “output control” – because Google’s choice was limited to a “yes” or a “no” decision whether to remove the post, its action constituted “the very essence of publishing.” Because Section 230 immunizes online defendants against complaints seeking to hold them liable as the publisher of content, the lower court properly dismissed the action.

Bennett v. Google, LLC, 882 F.3d 1163 (D.C. Cir., February 23, 2018)

Ninth Circuit upholds decision in favor of Twitter in terrorism case

Tamara Fields and Heather Creach, representing the estates of their late husbands and joined by Creach’s two minor children, sued Twitter, Inc. Plaintiffs alleged that the platform knowingly provided material support to ISIS, enabling the terrorist organization to carry out the 2015 attack in Jordan that killed their loved ones. The lawsuit sought damages under the Anti-Terrorism Act (ATA), which allows U.S. nationals injured by terrorism to seek compensation.

Plaintiffs alleged that defendant knowingly and recklessly provided ISIS with access to its platform, including tools such as direct messaging. Plaintiffs argued that these services allowed ISIS to spread propaganda, recruit followers, raise funds, and coordinate operations, ultimately contributing to the attack. Defendant moved to dismiss the case, arguing that plaintiffs failed to show a direct connection between its actions and the attack. Defendant also invoked Section 230 of the Communications Decency Act, which shields platforms from liability for content created by users.

The district court agreed with defendant and dismissed the case, finding that plaintiffs had not established proximate causation under the ATA. Plaintiffs appealed, but the Ninth Circuit upheld the dismissal. The appellate court ruled that plaintiffs failed to demonstrate a direct link between defendant’s alleged support and the attack. While plaintiffs showed that ISIS used defendant’s platform for various purposes, the court found no evidence connecting those activities to the specific attack in Jordan. The court emphasized that the ATA requires a clear, direct relationship between defendant’s conduct and the harm suffered.

The court did not address defendant’s arguments under Section 230, as the lack of proximate causation was sufficient to resolve the case. Accordingly, this decision helped clarify the legal limits of liability for platforms under the ATA and highlighted the challenges of holding technology companies accountable for how their services are used by third parties.

Three reasons why this case matters:

  • Sets the Bar for Proximate Cause: The ruling established that a direct causal link is essential for liability under the Anti-Terrorism Act.
  • Limits Platform Liability: The decision underscores the difficulty of holding online platforms accountable for misuse of their services by bad actors.
  • Reinforces Section 230’s Role: Although not directly addressed, the case highlights the protections Section 230 offers to tech companies.

Fields v. Twitter, Inc., 881 F.3d 739 (9th Cir. 2018)
