Nick Catone Facebook lawsuit – more questions than answers

This article about the Nick Catone Facebook lawsuit was written by Dorit Rubinstein Reiss, Professor of Law at the University of California Hastings College of the Law (San Francisco, CA), who is a frequent contributor to this and many other blogs, providing in-depth and intellectually stimulating articles about vaccines, medical issues, social policy, and the law.

Professor Reiss writes extensively in law journals about the social and legal policies of vaccination. Reiss is also a member of the Parent Advisory Board of Voices for Vaccines, a parent-led organization that supports and advocates for on-time vaccination and the reduction of vaccine-preventable disease.

Recently, Nick Catone – who tragically lost his son in 2017 and blamed vaccines for it, with no good supporting evidence – sued, or tried to sue, Facebook in federal court for allegedly removing his account.

The Nick Catone Facebook lawsuit is problematic, and the story, in its entirety, seems strange.

What happened?

On February 24, 2020, Mr. Catone posted that his Facebook account was disabled. When he wrote to Facebook to challenge the decision, he was apparently told that Facebook does not “allow credible threats to harm others, support for violent organizations or exceedingly graphic content on Facebook.”

As a result, Facebook “determined that [Mr. Catone’s account] hasn’t followed the Facebook Terms. This has resulted in the permanent loss of [Mr. Catone’s] account.”

The Catones asked for help on Facebook, and also started a petition to restore Mr. Catone’s account.

On March 13, 2020, Mr. Catone posted on Facebook claiming: 

So we filed a lawsuit this week and an emergency motion to get me back on Facebook asap and I’m back on for now until we see how this all plays out.

The problem is that at that point, no lawsuit had been filed. According to PACER, a case was opened on March 18, five days later. The case consists of three attempts to file a complaint (the first step in a lawsuit), and three responses from the court telling the attorney that the complaint was filed incorrectly.

In other words, as of Friday, March 20, 2020, no complaint had been accepted by the court or was available on PACER, and certainly no emergency order – or any other order – had been issued in the case.

It’s unclear exactly what happened there. What is clear is that Mr. Catone – supported by attorney Jim Mermigis, who has been litigating multiple cases, so far without success, in an attempt to overturn New York’s law removing the religious exemption to school vaccine requirements – is attempting to sue Facebook over an alleged banning.

Unknown facts include why Facebook decided to ban Mr. Catone and how Mr. Catone’s account was reinstated.

The Catones no doubt suffered extensively from their loss. One use Mr. Catone made of his account was to post memories – and it is obvious and natural that he feels the loss of the pictures and videos he posted.

But this lawsuit seems ill-founded. Although the complaint is not available through PACER, Mr. Mermigis posted screenshots of it (though not as a single PDF) on his firm’s Facebook page, and I downloaded them (and, for ease of use, saved them as one file).

The lawsuit suffers from several problems. The most serious is that it misstates the law and ignores aspects of the law that are real barriers to the suit. The lawsuit also includes many conspiracy theories and factual problems, but at this stage the first question will be whether it states a legal cause of action – and here, the legal problems will be at the forefront.

The Nick Catone Facebook lawsuit – misstating the law

The lawsuit has several serious errors of law.

After introducing the parties, the lawsuit starts by quoting the United States Supreme Court’s decision in Packingham v. North Carolina – a case that found that North Carolina’s statute prohibiting convicted sex offenders from accessing social media violated the First Amendment – as saying that Facebook is part of the “vast democratic forum of the internet.”

That’s not a misrepresentation – the paragraph in question described cyberspace generally, and social media in particular, as the most important places in which people exchange views nowadays (see p. 5) – but the complaint uses the case, and the quote, out of context and for a purpose they do not serve.

Packingham addressed whether the state can prohibit access to a forum used for public discussion, and concluded it cannot. It neither said nor implied that a private company can be required to provide a forum. If you want a comparison, think about a state law saying that people convicted of drunk driving cannot write letters to the editor of the local newspaper.

A court would likely strike such a law down. But that does not impose on the local newspaper a duty to accept any letter to the editor it doesn’t want to publish; newspapers pick and choose which letters they run.

A limit on the government’s ability to forbid access to a forum is not the same as requiring a private actor to give access. The First Amendment limits actions by government, not private corporations (in fact, it protects corporations, maybe too much), and this case does not change that.

But the most serious problem in the lawsuit is that it attempts to sue Facebook for damages for limiting Mr. Catone’s access, and such liability is barred by s. 230 of the Communications Decency Act. It says:

(2) Civil liability No provider or user of an interactive computer service shall be held liable on account of—

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; …

In other words, you cannot sue a provider for damages for limiting the availability of material the provider thinks is objectionable. The complaint tries to get around it in two ways:

  • First, it tries to say that because part of the goal of the civil liability protection is a “means of ensuring free and robust speech on the internet”, Facebook’s action in removing Mr. Catone’s account is not protected – Facebook cannot hide from liability, according to the complaint, when it acts to violate free speech.
  • The other claim, as far as I understand it (the writing could be clearer), is that Facebook has been given immunity, and that the immunity transforms it into a public forum or public trust, with certain duties.

Neither claim is convincing. First, s.230 itself sets several broad policy goals that go beyond free speech:

(b) Policy It is the policy of the United States—

(1) to promote the continued development of the Internet and other interactive computer services and other interactive media;
(2) to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services, unfettered by Federal or State regulation;
(3) to encourage the development of technologies which maximize user control over what information is received by individuals, families, and schools who use the Internet and other interactive computer services;
(4) to remove disincentives for the development and utilization of blocking and filtering technologies that empower parents to restrict their children’s access to objectionable or inappropriate online material; and
(5) to ensure vigorous enforcement of Federal criminal laws to deter and punish trafficking in obscenity, stalking, and harassment by means of computer.

While it is not unreasonable to read a concern for freedom of expression into these goals, they also include protecting providers from lawsuits to allow strong competition and robust development, and they include provisions that support allowing providers to limit content – like protecting children and deterring harassment.

More importantly, the language of s. 230 simply does not fit what Mr. Mermigis, speaking for Mr. Catone, wants. It clearly and explicitly allows platforms to restrict content and gives them broad discretion to do so; the point of the section is to protect them from liability for exercising that discretion.

The point of the section was very much not to turn providers into targets for lawsuits, but to protect them from such lawsuits.

The complaint addresses recent controversies about Facebook’s powers – controversies that came up in the context of the 2016 elections – and notes that several members of Congress have called for changing s. 230 to allow liability against Facebook over what they see as biased enforcement of policies against conservative voices.

However, those are calls to change the existing law, not disagreements that, in its current form, it allows Facebook to remove content and protects it from liability for doing so.

The complaint’s claim of fraud is even stranger. It suggests that Facebook (and the other defendant, Mr. Mark Zuckerberg) “hold themselves out to the world as fostering a means by which the people of the world can communicate with each other,” and that they commit fraud by “harvesting data” and relying on community standards to censor unpopular speech.

But Facebook openly states that there are community standards, and users are aware – well aware – that content can be removed and users banned. It’s unclear what the misrepresentation or fraud is in this well-known reality.

Nor are the community standards new; they long predate the 2016 election and have always existed.

The last two claims are a contract claim and a tort claim. The contract claim might have some merit if it were framed around the ads Mr. Catone bought from Facebook – arguing that he formed a contract by paying for ads, and that contract was breached.

But there’s nothing of that in the complaint. It’s not clear what the contract in question is. Is it the free user agreement Mr. Catone accepted on opening his Facebook account? That agreement allows Facebook to ban him, and it may raise questions about whether it is a contract at all.

The last claim is that by removing Mr. Catone’s Facebook account, Facebook interfered with his relationship with his customers. That’s a strange claim. First, Mr. Catone’s gym’s Facebook page is still on Facebook, and as he pointed out, his wife and workers can use it to interact with clients. Second, the gym’s website offers other ways to interact with clients besides Facebook.

Finally, the Facebook user agreement allows termination – and treating termination of that contract on its own terms as the relatively narrow tort of interference with contractual relations is a stretch. I will add that the tort allows compensation for improper interference – if Facebook is allowed to end Mr. Catone’s account at its discretion, that is not improper. And again, s. 230 prevents civil liability.

Update 13 April 2020

On March 25, the complaint was finally filed. The initial pretrial hearing is scheduled for August.

Note, again, that the complaint was only successfully filed 12 days after Mr. Catone got back on Facebook. Crediting the lawsuit for that is… strange.


Summary

There is a lot that is unclear in this case. It’s unclear why Facebook removed Mr. Catone’s account, if that’s what happened, and why it was restored – before the Nick Catone Facebook lawsuit was filed.

But the lawsuit faces the high barrier of s. 230’s clear protection against civil liability, and it has not made a convincing argument for getting around that barrier.

