Required content moderation reporting does not violate X’s First Amendment rights


A federal court in California has upheld the constitutionality of the state’s Assembly Bill 587 (AB 587), which mandates that social media companies submit semi-annual reports to the state attorney general detailing their content moderation practices. The decision comes after X filed a lawsuit claiming the law violated the company’s First Amendment rights.

The underlying law

AB 587 requires social media companies to provide detailed accounts of their content moderation policies, particularly addressing issues like hate speech, extremism, disinformation, harassment, and foreign political interference. These “terms of service reports” are to be submitted to the state’s attorney general, aiming to increase transparency in how these platforms manage user content.

X’s challenge

X challenged this law, seeking to prevent its enforcement on the grounds that it was unconstitutional. The court, however, denied X’s motion for injunctive relief, finding that X had failed to demonstrate a likelihood of success on the merits of its constitutional claims.

The court’s decision relied heavily on SCOTUS’s opinion in Zauderer v. Office of Disciplinary Counsel of Supreme Court of Ohio, 471 U.S. 626 (1985). Under Zauderer, for a governmentally compelled commercial disclosure to be constitutionally permissible, the information must be purely factual and uncontroversial, not unduly burdensome, and reasonably related to a substantial government interest.

The court’s constitutional analysis

In applying these criteria, the court found that AB 587’s requirements fit within these constitutional boundaries. The reports, while compulsory, do not constitute commercial speech in the traditional sense, as they are not advertisements and carry no direct economic benefit for the social media companies. Despite this, the court followed the rationale of other circuits that have assessed similar requirements for social media disclosures.

The court determined that the content of the reports mandated by AB 587 is purely factual, requiring companies to outline their existing content moderation policies related to specified areas. The statistical data, if provided, represents objective information about the company’s actions. The court also found that the disclosures are uncontroversial, noting that the mere association with contentious topics does not render the reports themselves controversial. We know how controversial and political the regulation of “disinformation” can be.

Addressing the burden of these requirements, the court recognized that while the reporting may be demanding, it is not unjustifiably so under First Amendment considerations. X argued that the law would necessitate significant resources to monitor and report the required metrics. But the court noted that AB 587 does not obligate companies to adopt any specific content categories, nor does it impose burdens on speech itself, a crucial aspect under Zauderer’s analysis.

What this means

The court confirmed that AB 587’s reporting requirements are reasonably related to a substantial government interest. This interest lies in ensuring transparency in social media content moderation practices, enabling consumers to make informed choices about their engagement with news and information on these platforms.

The court’s decision is a significant step in addressing the complexities of regulating social media platforms, balancing the need for transparency with the constitutional rights of these digital entities. As the landscape of digital communication continues to evolve, this case may be a marker for how governments might approach the regulation of social media companies, particularly in the realm of content moderation.

X Corp. v. Bonta, 2023 WL 8948286 (E.D. Cal., December 28, 2023)

See also: Maryland Court of Appeals addresses important question of internet anonymity

California court decision strengthens Facebook’s ability to deplatform its users


Plaintiff used Facebook to advertise his business. Facebook kicked him off and would not let him advertise, based on alleged violations of Facebook’s Terms of Service. Plaintiff sued for breach of contract. The lower court dismissed the case so plaintiff sought review with the California appellate court. That court affirmed the dismissal.

The Terms of Service authorized the company to unilaterally “suspend or permanently disable access” to a user’s account if the company determined the user “clearly, seriously, or repeatedly breached” the company’s terms, policies, or community standards.

An ordinary reading of such a provision would lead one to think that Facebook would not be able to terminate an account unless certain conditions were met, namely, that there had been a clear, serious or repeated breach by the user. In other words, Facebook would be required to make such a finding before terminating the account.

But the court applied the provision much more broadly. So broadly, in fact, that one could say the notion of clear, serious, or repeated breach was irrelevant, superfluous language in the terms.

The court said: “Courts have held these terms impose no ‘affirmative obligations’ on the company.” Discussing a similar case involving Twitter’s terms of service, the court observed that platform was authorized to suspend or terminate accounts “for any or no reason.” Then the court noted that “[t]he same is true here.”

So, the court arrived at the conclusion that despite Facebook’s own terms – which would lead users to think that they wouldn’t be suspended unless there was a clear, serious or repeated breach – one can get deplatformed for any reason or no reason. The decision pretty much gives Facebook unmitigated free speech police powers.

Strachan v. Facebook, Inc., 2023 WL 8589937 (Cal. App. December 12, 2023)

Vaccine information censorship: Is Congressman Adam Schiff liable for the deplatforming of a medical organization?


One could characterize the recent case of Association of American Physicians & Surgeons, Inc. v. Schiff as addressing the issue of vaccine information censorship. The court considered whether letters written by Congressman Adam Schiff to Big Tech platforms, and statements he made in a Congressional hearing, caused the companies to deplatform a medical trade association and otherwise disfavor its content in search results.

The Association of American Physicians and Surgeons (AAPS) publishes online content that it characterizes not as “anti-vaccine,” but rather in favor of “informed consent based on disclosure of all relevant legal, medical, and economic information.” In 2019, California Representative Adam Schiff wrote letters to Google, Facebook, Amazon and Twitter, complaining about what he characterized as inaccurate information on vaccines, and requested answers to questions about what these platforms were doing to combat vaccine misinformation. In a later congressional hearing, he questioned whether Section 230 immunity for these sorts of technology platforms should be changed (a statement that AAPS characterized as a threat to these Big Tech platforms).

Thereafter, Amazon kicked AAPS out of its associates program, and AAPS’s web traffic to its vaccine information pages dropped (which it blamed on Google and Facebook disfavoring the content). AAPS sued Schiff, seeking damages and claiming that his statements and actions caused these platforms to treat it disfavorably. The trial court granted a motion to dismiss, finding that AAPS lacked standing. AAPS sought review with the United States Court of Appeals for the D.C. Circuit.

On appeal, the court affirmed the lower court’s dismissal for lack of standing, primarily for two reasons. First, it found that AAPS had not sufficiently alleged that it suffered any injury in the form of an impairment of its ability to negotiate with Amazon. Second, the court found that any injury AAPS suffered from being deplatformed and having its content disfavored was not sufficiently traceable to Schiff’s conduct.

Schiff had also argued that he could not be sued (i.e., that the court lacked subject matter jurisdiction) because his actions giving rise to the lawsuit were legislative acts and therefore protected by the Speech or Debate Clause of the Constitution. Because AAPS had not established standing, the court did not need to reach the separate jurisdictional issue of immunity under this constitutional clause.

Association of American Physicians & Surgeons, Inc. v. Schiff, — F.4th —, 2022 WL 211219 (D.C. Cir. January 25, 2022)

Restraining order entered against website that encouraged contacting children of plaintiff’s employees

Plaintiff sued defendant (who was an unhappy customer of plaintiff) under the Lanham Act (for trademark infringement) and for defamation. Defendant had registered a domain name using plaintiff’s company name and had set up a website that, among other things, he used to impersonate plaintiff’s employees and provide information about employees’ family members, some of whom were minors.

Plaintiff moved for a temporary restraining order and the court granted the motion.

The Website

The website was structured and designed in a way that made it appear as though it was affiliated with plaintiff. For example, it included a copyright notice identifying plaintiff as the owner. It also included allegedly false statements about plaintiff. For example, it included the following quotation, which was attributed to plaintiff’s CEO:

Well of course we engage in bad faith tactics like delaying and denying our policy holders [sic] valid claims. How do you think me [sic], my key executive officers, and my board members stay so damn rich. [sic]

The court found that plaintiff had shown a likelihood of success on the merits of its claims.

Lanham Act Claim

It found that defendant used plaintiff’s marks for the purpose of confusing the public by creating a website that looked as though it was a part of plaintiff’s business operations. This was evidenced by, for example, the inclusion of a copyright notice on the website.

Defamation

On the defamation claim, the court found that the nature of the statements about plaintiff, plaintiff’s assertion that they were false, and the allegation that the statements were posted on the internet sufficed to satisfy the first two elements of a defamation claim, namely, that they were false and defamatory statements pertaining to the plaintiff and were unprivileged publications to a third party. The allegations in the complaint were also sufficient to indicate that defendant “negligently disregarded the falsity of the statements.”

Furthermore, the statements on the website concerned the way that plaintiff processed its insurance claims, which related to the business of the company and the profession of plaintiff’s employees who handled the processing of claims. Therefore, the final element was also satisfied.

First Amendment Limitations

The limitations the court placed on the TRO are interesting to note. To the extent that plaintiff sought injunctive relief directed at defendant’s speech encouraging others to contact the company and its employees with complaints about the business, whether at the workplace or at home, or directed at public “ad hominem” comments, the court would not grant the emergency relief that was sought.

The court also would not prohibit defendant from publishing allegations that plaintiff had engaged in fraudulent or improper business practices, or from publishing the personally identifying information of plaintiff’s employees, officers, agents, and directors. Plaintiff’s submission failed to demonstrate to the court’s satisfaction how such injunctive relief would not unlawfully impair defendant’s First Amendment rights.

The court did, however, enjoin defendant from encouraging others to contact the children and other family members of employees about plaintiff’s business practices, because contact of that nature had the potential to cause irreparable emotional harm to those family members, who have no employment or professional relationship with defendant.

Symetra Life Ins. Co. v. Emerson, 2018 WL 6338723 (D. Maine, Dec. 4, 2018)

Can YouTube be sued for censorship? A court weighs in.

Prager University sued Google LLC and YouTube, LLC, alleging that defendants discriminated against plaintiff’s conservative political viewpoints by restricting its videos on YouTube. Plaintiff asked the court to issue a preliminary injunction to prevent defendants from continuing these practices and to allow plaintiff’s videos unrestricted access on the platform. Plaintiff also sought damages for alleged violations of free speech rights and other claims.

The court decided in favor of defendants. It dismissed plaintiff’s federal claims under the First Amendment and the Lanham Act and declined to exercise jurisdiction over the state law claims. Additionally, the court denied plaintiff’s motion for a preliminary injunction.

The court ruled that defendants, as private entities, were not state actors and therefore not bound by the First Amendment. It found that YouTube’s platform, even if widely used for public discourse, does not transform it into a public forum subject to constitutional free speech protections. Regarding the Lanham Act, the court concluded that statements about YouTube being a platform for free expression were non-actionable “puffery” and not specific enough to be considered false advertising.

In dismissing plaintiff’s state law claims, the court noted that they raised complex issues of California law better suited for state courts. This decision left open the possibility for plaintiff to amend its complaint or pursue claims in state court.

Three reasons why this case matters:

  • Clarification of First Amendment Limits: The ruling reinforces that constitutional free speech protections apply only to government actors, not private companies.
  • Role of Platforms in Content Moderation: The case highlights ongoing debates about the responsibilities of tech companies in regulating content and their impact on public discourse.
  • Defining Puffery vs. Advertising: The court’s finding that statements about neutrality were mere puffery provides insight into how courts assess claims of false advertising.

Prager University v. Google LLC, 2018 WL 1471939 (N.D. Cal. March 26, 2018)

Parole conditions barring social media use violated pastor’s First Amendment rights

Plaintiff – a Baptist minister on parole in California – sued several parole officials, arguing that conditions placed on his parole violated his First Amendment rights. Among the contested restrictions was a prohibition on plaintiff accessing social media. Plaintiff claimed this restriction infringed on both his right to free speech and his right to freely exercise his religion. Plaintiff asked the court for a preliminary injunction to stop the enforcement of this condition. The court ultimately sided with plaintiff, ruling that the social media ban was unconstitutional.

The Free Speech challenge

Plaintiff argued that the parole condition prevented him from sharing his religious message online. As a preacher, he relied on platforms such as Facebook and Twitter to post sermons, connect with congregants who could not attend services, and expand his ministry by engaging with other pastors. The social media ban, plaintiff claimed, silenced him in a space essential for modern communication.

The court agreed, citing the U.S. Supreme Court’s ruling in Packingham v. North Carolina, which struck down a law barring registered sex offenders from using social media. In Packingham, the Court emphasized that social media platforms are akin to a modern public square and are vital for exercising free speech rights. Similarly, the court in this case found that the blanket prohibition on social media access imposed by the parole conditions was overly broad and not narrowly tailored to address specific risks or concerns.

The court noted that plaintiff’s past offenses, which occurred decades earlier, did not involve social media or the internet, undermining the justification for such a sweeping restriction. While public safety was a legitimate concern, the court emphasized that parole conditions must be carefully tailored to avoid unnecessary burdens on constitutional rights.

The Free Exercise challenge

Plaintiff also argued that the social media ban interfered with his ability to practice his religion. He asserted that posting sermons online and engaging with his congregation through social media were integral parts of his ministry. By prohibiting social media use, the parole condition restricted his ability to preach and share his faith beyond the physical boundaries of his church.

The court found this argument compelling. Religious practice is not confined to in-person settings, and plaintiff demonstrated that social media was a vital tool for his ministry. The court noted that barring a preacher from using a key means of sharing religious teachings imposed a unique burden on religious activity. Drawing on principles from prior Free Exercise Clause cases, the court held that the parole condition was not narrowly tailored to serve a compelling government interest, as it broadly prohibited access to all social media regardless of its religious purpose.

The court’s decision

The court granted plaintiff’s request for a preliminary injunction, concluding that he was likely to succeed on his claims under both the Free Speech Clause and the Free Exercise Clause of the First Amendment. The ruling allowed plaintiff to use social media during the litigation, while acknowledging the government’s legitimate interest in monitoring parolees. The court encouraged less restrictive alternatives, such as targeted supervision or limiting access to specific sites that posed risks, rather than a blanket ban.

Three reasons why this case matters:

  • Intersection of Speech and Religion: The case highlights how digital tools are essential for both free speech and the practice of religion, especially for individuals sharing messages with broader communities.
  • Limits on Blanket Restrictions: The ruling reaffirms that government-imposed conditions, such as parole rules, must be narrowly tailored to avoid infringing constitutional rights.
  • Modern Application of First Amendment Rights: By referencing Packingham, the court acknowledged the evolving role of social media as a platform for public discourse and religious expression.

Manning v. Powers, 281 F. Supp. 3d 953 (C.D. Cal. Dec. 13, 2017)

Facebook did not violate user’s constitutional rights by suspending account for alleged spam


Plaintiff sued Facebook and several media companies (including CNN, PBS and NPR) after Facebook suspended his account for alleged spamming. Plaintiff had posted articles and comments in an effort to “set the record straight” regarding Kellyanne Conway’s comments on the “Bowling Green Massacre”. Plaintiff claimed, among other things, that Facebook and the other defendants violated the First, Fourth, Fifth, and Fourteenth Amendments.

The court granted defendants’ motion to dismiss for failure to state a claim. It observed the well-established principle that these provisions of the constitution only apply to governmental actors – and do not apply to private parties. Facebook and the other media defendants could not plausibly be considered governmental actors.

It also noted that efforts to apply the First Amendment to Facebook have consistently failed. See, for example, Forbes v. Facebook, Inc., 2016 WL 676396, at *2 (E.D.N.Y. Feb. 18, 2016) (finding that Facebook is not a state actor for Section 1983 First Amendment claim); and Young v. Facebook, Inc., 2010 WL 4269304, at *3 (N.D. Cal. Oct. 25, 2010) (holding that Facebook is not a state actor).

Shulman v. Facebook et al., 2017 WL 5129885 (D.N.J., November 6, 2017)

About the Author: Evan Brown is a Chicago technology and intellectual property attorney. Call Evan at (630) 362-7237, send email to ebrown [at] internetcases.com, or follow him on Twitter @internetcases. Read Evan’s other blog, UDRP Tracker, for information about domain name disputes.

Reports to advertisers about website content were protected speech

Plaintiff sued defendant in California state court for trade libel and other business torts over confidential reports that defendant provided to its customers (who advertised on plaintiff’s website) characterizing plaintiff’s websites as associated with copyright infringement and adult content.

Defendant moved to dismiss under California’s anti-SLAPP statute which, among other things, protects speech that is a matter of public concern. The trial court granted the anti-SLAPP motion. Plaintiff sought review. On appeal, the court affirmed the anti-SLAPP dismissal.

The court held that the communications concerning plaintiff’s websites (as being associated with intellectual property infringement or adult content) were matters of public concern, even though the communications were not public.

FilmOn.com v. DoubleVerify, Inc., 2017 WL 2807911 (Cal. Ct. App., June 29, 2017)


Seventh Circuit sides with Backpage in free speech suit against sheriff


Backpage is an infamous classified ads website that provides an online forum for users to post ads relating to adult services. The sheriff of Cook County, Illinois (the county that includes Chicago) sent letters to the major credit card companies urging them to prohibit users from using the companies’ services to purchase Backpage ads (whether those ads were legal or not). Backpage sued the sheriff, arguing that the communications with the credit card companies violated its free speech rights.

The lower court denied Backpage’s motion for preliminary injunction. Backpage sought review with the Seventh Circuit. On appeal, the court reversed and remanded.

The appellate court held that while the sheriff has a First Amendment right to express his views about Backpage, a public official who tries to shut down an avenue of expression of ideas and opinions through “actual or threatened imposition of government power or sanction” is violating the First Amendment.

Judge Posner, writing for the court, mentioned the sheriff’s past failure to shut down Craigslist’s adult section through litigation (see Dart v. Craigslist, Inc., 665 F. Supp. 2d 961 (N.D. Ill. 2009)):

The suit against Craigslist having failed, the sheriff decided to proceed against Backpage not by litigation but instead by suffocation, depriving the company of ad revenues by scaring off its payments-service providers. The analogy is to killing a person by cutting off his oxygen supply rather than by shooting him. Still, if all the sheriff were doing to crush Backpage was done in his capacity as a private citizen rather than as a government official (and a powerful government official at that), he would be within his rights. But he is using the power of his office to threaten legal sanctions against the credit-card companies for facilitating future speech, and by doing so he is violating the First Amendment unless there is no constitutionally protected speech in the ads on Backpage’s website—and no one is claiming that.

The court went on to find that the sheriff’s communications made the credit card companies “victims of government coercion,” in that the letters threatened Backpage with criminal culpability when, à la Dart v. Craigslist and 47 U.S.C. 230, it was unclear whether Backpage was in violation of the law for providing the forum for the ads.

Backpage.com, LLC v. Dart, — F.3d —, 2015 WL 7717221 (7th Cir. Nov. 30, 2015)

Evan Brown is a Chicago attorney advising enterprises on important aspects of technology law, including software development, technology and content licensing, and general privacy issues.

California court okays lawsuit against mugshot posting website

The Court of Appeal of California has held that defendant website operator – who posted arrestees’ mugshots and names, and generated revenue from advertisements using arrestees’ names and by accepting money to take the photos down – was not entitled to have the lawsuit against it dismissed. Defendant’s profiting from the photos and their takedown was not in connection with an issue of public interest, and therefore did not entitle defendant to the relief afforded by an anti-SLAPP motion.

Plaintiff filed a class action lawsuit against defendant website operator, arguing that the website’s practice of accepting money to take down mugshots it posted violated California laws against misappropriation of likeness, and constituted unfair and unlawful business practices.

Defendant moved to dismiss, arguing plaintiff’s claims comprised a “strategic lawsuit against public participation” (or “SLAPP”). California has an anti-SLAPP statute that allows defendants to move to strike any cause of action “arising from any act of that person in furtherance of the person’s right of petition or free speech under the United States Constitution or the California Constitution in connection with a public issue …, unless the court determines that the plaintiff has established that there is a probability that the plaintiff will prevail on the claim.”

The court held that the posting of mugshots was in furtherance of defendant’s free speech rights and was in connection with a public issue. But the actual complained-of conduct – the generating of revenue through advertisements, and from fees generated for taking the photos down – was not protected activity under the anti-SLAPP statute.

Because the claims did not arise from the part of defendant’s conduct that would be considered “protected activity” under the anti-SLAPP statute, but instead arose from other, non-protected activity (making money off of people’s names and photos), the anti-SLAPP statute did not protect defendant. Unless the parties settle, the case will proceed.

Rogers v. Justmugshots.Com, Corp., 2015 WL 5838403 (Cal. Ct. App. October 7, 2015)

Evan Brown is an attorney in Chicago helping clients manage issues involving technology and new media.
