Justice Thomas Steps Into the Social Media Immunity Thicket

Section 230 of the Communications Decency Act, 47 U.S.C. §230(c)(1), is generally understood to provide that entities that merely act as conduits for communication (like social media companies) are not considered “publishers” of third-party content and therefore have no direct liability for what other people say on their sites. That section states, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

The statutory provision arose out of one of the “Wolf of Wall Street” cases. Prodigy, an online interactive service, operated bulletin boards where people could post about various topics, including financial topics on its Money Talk board.

The Wolf of Wall Street Effect

Investment firm Stratton Oakmont (the “Wolf of Wall Street” company) was upset with something someone posted on Prodigy and, rather than suing the poster, sued Prodigy as the publisher of the content, much as The New York Times is the publisher of the letters to the editor it prints. An earlier case, Cubby v. CompuServe, had held that an online provider that exercised no editorial control over postings was a mere distributor, not a publisher, of the content. But in Stratton Oakmont, because Prodigy did moderate its boards, the New York state court found that, for defamation purposes, Prodigy could be considered the publisher of what the poster to the Money Talk board posted.

In response to this holding, Congress passed Section 230 of the Communications Decency Act. As a practical matter, Section 230 has been held to provide virtually unlimited immunity to social media sites, online portals and other providers for some pretty nasty stuff, including postings that are threatening, harassing or otherwise unlawful (revenge porn, doxxing and other attacks). While responsible social media sites frequently take down such objectionable material, the act of taking things down (also protected by Section 230) inevitably raises questions of motive and alleged censorship by big tech. Congress has repeatedly considered how to reform Section 230 to promote ‘responsible’ moderation. But because the postings are often expressive First Amendment content, that raises hard questions: Is it responsible to take down COVID-19 misinformation or Ukraine misinformation, or is that the equivalent of those sites, portals and providers putting their thumbs on the scale? Pretty sticky stuff.

Social Media Immunity

On March 7, 2022, in a case involving horrible conduct by a Facebook user, the U.S. Supreme Court was once again asked to address the immunity of social media companies, and once again declined to do so. In Jane Doe v. Facebook, Dkt. No. 21-459, the court declined to hear an appeal of a case in which a male sexual predator used Facebook to lure a 15-year-old girl to a meeting where she was raped, beaten and trafficked for sex. The girl and her family sued Facebook for violating Texas’ sex trafficking law, alleging that the social media company was liable for the acts of its user.

Supreme Court Justice Thomas Weighs In

While the Supreme Court declined to hear the case (effectively leaving Facebook’s immunity intact), Justice Clarence Thomas wrote a statement respecting the denial, essentially arguing that the statute does not mean what the lower courts have said it means. Thomas lamented the fact that the Texas Supreme Court afforded Facebook immunity even though Facebook allegedly “knows its system facilitates human traffickers in identifying and cultivating victims,” but has nonetheless “failed to take any reasonable steps to mitigate the use of Facebook by human traffickers” because doing so would cost the company users, and the advertising revenue those users generate. At least, that was what was alleged.

To Thomas, this is not what Section 230 of the CDA means. Thomas noted that “arguments in favor of broad immunity under §230 rest largely on ‘policy and purpose,’ not on the statute’s plain text” and that the Supreme Court should step in and determine whether Section 230 immunity should be broadly or narrowly construed. As he put it, “[a]ssuming Congress does not step in to clarify §230’s scope, we should do so in an appropriate case.” A broad interpretation of Section 230, under which an interactive service provider is not treated as the publisher of third-party content in any case or for any purpose, is the interpretation almost universally adopted by the federal courts. However, it is possible that a court could adopt a narrower interpretation: that Section 230 immunity applies only where the potential liability arises from the act of publication itself, as in, for example, a defamation case. Under that reading, acts by online social media companies that do not amount to publication would not be immunized.

So, for example, if a plaintiff alleges liability for “failure to screen,” “failure to remove,” improper editing or takedown, breach of contract (terms of service) or other arguably non-publishing conduct, a court could rule that Section 230 affords no immunity. Even in cases involving publication, a court could conceivably narrow the immunity to the bare act of publishing third-party content. The acts of editing, filtering, commenting on or removing content (classic editorial functions) could then give rise to liability, as could allegations that the social media entity is working in conjunction with (and as an agent of) the actual publisher. To date, the majority of courts have not taken this approach; they have almost universally shielded internet companies from liability for content posted by third parties. It is clear that Justice Thomas wants to reverse that.

In a previous case in which the Supreme Court also denied a petition for certiorari (Malwarebytes, Inc. v. Enigma Software Group), Thomas argued at length for restoring liability to social media companies. If companies like Facebook, LinkedIn, TikTok, Twitter and others were civilly (and possibly criminally) liable for what people posted, they would behave more “responsibly.” On the other hand, they would also have to answer to those who think they are repressing free speech, and would face pressure to keep controversial (and inaccurate) postings up, turning them into some kind of truth police. Sauce for the goose.

These Section 230 cases will continue to trickle through the court system and, ultimately, either Congress or the Supreme Court will have to determine, once and for all, how much immunity is too much and to what extent social media sites will have to self-police to avoid liability. We know which way Justice Thomas will vote. The question is whether he can find four like-minded compatriots.

Mark Rasch

Mark Rasch is a lawyer and computer security and privacy expert in Bethesda, Maryland, where he helps develop strategy and messaging for the Information Security team. Rasch’s career spans more than 35 years of corporate and government cybersecurity, computer privacy, regulatory compliance, computer forensics and incident response. He is trained as a lawyer and was the Chief Security Evangelist for Verizon Enterprise Solutions (VES). He is a recognized author of numerous security- and privacy-related articles. Prior to joining Verizon, he taught courses in cybersecurity, law, policy and technology at various colleges and universities, including the University of Maryland, George Mason University, Georgetown University and the American University School of Law, and was active with the American Bar Association’s Privacy and Cybersecurity Committees and the Computers, Freedom and Privacy Conference. Rasch has worked as cyberlaw editor for SecurityCurrent.com, as Chief Privacy Officer for SAIC and as Director or Managing Director at various information security consulting companies, including CSC, FTI Consulting, Solutionary, Predictive Systems and Global Integrity Corp. Earlier in his career, Rasch was with the U.S. Department of Justice, where he led the department’s efforts to investigate and prosecute cyber and high-technology crime, starting the computer crime unit within the Criminal Division’s Fraud Section, efforts which eventually led to the creation of the Computer Crime and Intellectual Property Section of the Criminal Division. He was responsible for various high-profile computer crime prosecutions, including those of Kevin Mitnick, Kevin Poulsen and Robert Tappan Morris. He has been a frequent commentator in the media on issues related to information security, appearing on or in the BBC, CBC, Fox News, CNN, NBC News, ABC News, The New York Times, The Wall Street Journal and many other outlets.