Online platforms will have to answer for sales of alleged counterfeit products

A federal court in New York has denied the motion to dismiss filed by Chinese online platforms Alibaba and AliExpress in a lawsuit brought by a toymaker alleging that the platforms were contributorily liable for trademark and copyright infringement committed by their merchant customers through the online sale of counterfeit products.

Background of the Case

Plaintiff toymaker accused the Alibaba defendants of facilitating the sale of counterfeit products on their platforms. The lawsuit stemmed from the activities of around 90 e-commerce merchants who were reportedly using the platforms to sell fake goods.

The Court’s Rationale

The court’s decision to deny the motion to dismiss turned on several allegations that suggest the Alibaba defendants played a more complicit role than that of a passive service provider. These allegations included:

  1. Specific Awareness of Infringement: The Alibaba defendants were allegedly well-informed about the infringing activities of several merchants, including some named in the lawsuit. The Alibaba defendants should have known of these instances from orders in six separate lawsuits against sellers on the platforms.
  2. Continued Proliferation of Infringing Listings: Despite this awareness, the platforms reportedly allowed the continued presence and proliferation of infringing listings. This included listings from merchants already flagged under Alibaba’s “three-strike policy” for repeat offenders.
  3. Promotion of Infringing Listings: Plaintiff alleged the Alibaba defendants actively promoted infringing listings. The Alibaba defendants reportedly granted “Gold Supplier” and “Verified” statuses to infringing merchants, sold related keywords, and even promoted these listings through Google and promotional emails.
  4. Financial Gains from Infringements: Crucially, plaintiff argued that the Alibaba defendants financially benefited from these activities by attracting more customers, encouraging merchants to purchase additional services, and earning commissions on transactions involving counterfeit goods.

DMCA Safe Harbor Provisions Not Applicable

The court rejected the Alibaba defendants’ argument that safe harbor provisions under the Digital Millennium Copyright Act (DMCA) applied at this stage of the litigation. The DMCA safe harbor is typically an affirmative defense to liability, and for it to apply at the motion to dismiss stage, such defense must be evident on the face of the complaint. The court found that in this case, it was not.

Implications of the Ruling

This decision is relevant to purveyors of online products who face the persistent challenges of online enforcement of intellectual property rights. Remedies against overseas companies in situations such as this are often elusive. The case provides a roadmap of sorts concerning the types of facts that must be asserted to support a claim against an online provider in the position of the Alibaba defendants.

Kelly Toys Holdings, LLC v. 19885566 Store et al., 2023 WL 8936307 (S.D.N.Y. December 27, 2023)

Fifth Circuit dissent issues scathing rebuke of broad Section 230 immunity

Dissenting in the court’s refusal to rehear an appeal en banc, Judge Elrod of the Fifth Circuit Court of Appeals – joined by six of her colleagues – penned an opinion that sharply criticized the broad immunity granted to social media companies under Section 230 of the Communications Decency Act. The dissent emerged in a case involving John Doe, a minor who was sexually abused by his high school teacher, a crime in which the messaging app Snapchat played a pivotal role.

The Core of the Controversy

Section 230 (47 U.S.C. § 230) is a provision that courts have long held to shield internet companies from liability for content posted by their users. The dissenting opinion, however, argues that this immunity has been stretched far beyond its intended scope, potentially enabling platforms to evade responsibility even when their own design and operations contribute to illegal activities.

Snapchat’s Role in the Abuse Case

Snapchat, owned by Snap, Inc., was used by the teacher to send sexually explicit material to Doe. Doe sought to hold Snap accountable, alleging that Snapchat’s design defects, such as inadequate age-verification mechanisms, indirectly facilitated the abuse. But the lower court, applying previous cases interpreting Section 230, dismissed these claims at the initial stage.

A Critical Examination of Section 230

The dissent criticized the court’s interpretation of Section 230, arguing that it has been applied too broadly to protect social media companies from various forms of liability, including design defects and distributor responsibilities. It highlighted the statute’s original text, which was meant to protect platforms from being deemed publishers or speakers of third-party content, not to shield them from liability for their own conduct.

Varied Interpretations Across Courts

Notably, the dissent pointed out the inconsistency in judicial interpretations of Section 230. While some courts, like the Ninth Circuit, have allowed claims related to design defects to proceed, others have extended sweeping protections to platforms, significantly limiting the scope for holding them accountable.

The Implications for Internet Liability

This case and the resulting dissent underscore a significant legal issue in the digital age: how to balance the need to protect online platforms from excessive liability with ensuring they do not become facilitators of illegal or harmful activities. The dissent suggested that the current interpretation of Section 230 has tipped this balance too far in favor of the platforms, leaving victims like Doe without recourse.

Looking Ahead: The Need for Reevaluation

The dissenting opinion called for a reevaluation of Section 230, urging a return to the statute’s original text and intent. This reexamination – in the court’s view – would be crucial in the face of evolving internet technologies and the increasing role of social media platforms in everyday life. The dissent warned of the dangers of a legal framework that overly shields these powerful platforms while leaving individuals exposed to the risks associated with their operations.

Conclusion

The court’s dissent in this case is a clarion call for a critical reassessment of legal protections afforded to social media platforms. As the internet continues to evolve, the legal system must adapt to ensure that the balance between immunity and accountability is appropriately maintained, safeguarding individuals’ rights without stifling technological innovation and freedom of expression online.

Doe through Roe v. Snap, Incorporated, --- F.4th ---, 2023 WL 8705665 (5th Cir. December 18, 2023)

See also: Snapchat not liable for enabling teacher to groom minor student

California court decision strengthens Facebook’s ability to deplatform its users

Plaintiff used Facebook to advertise his business. Facebook kicked him off and would not let him advertise, based on alleged violations of Facebook’s Terms of Service. Plaintiff sued for breach of contract. The lower court dismissed the case so plaintiff sought review with the California appellate court. That court affirmed the dismissal.

The Terms of Service authorized the company to unilaterally “suspend or permanently disable access” to a user’s account if the company determined the user “clearly, seriously, or repeatedly breached” the company’s terms, policies, or community standards.

An ordinary reading of such a provision would lead one to think that Facebook would not be able to terminate an account unless certain conditions were met, namely, that there had been a clear, serious or repeated breach by the user. In other words, Facebook would be required to make such a finding before terminating the account.

But the court applied the provision much more broadly. So broadly, in fact, that one could say the notion of clear, serious, or repeated breach was irrelevant, superfluous language in the terms.

The court said: “Courts have held these terms impose no ‘affirmative obligations’ on the company.” Discussing a similar case involving Twitter’s terms of service, the court observed that platform was authorized to suspend or terminate accounts “for any or no reason.” Then the court noted that “[t]he same is true here.”

So the court arrived at the conclusion that despite Facebook’s own terms – which would lead users to think they would not be suspended absent a clear, serious, or repeated breach – one can get deplatformed for any reason or no reason at all. The decision effectively grants Facebook unfettered power to police speech on its platform.

Strachan v. Facebook, Inc., 2023 WL 8589937 (Cal. App. December 12, 2023)
