14 Sep Free Speech and Social Media
“Do difficult cases make bad law or do difficult laws make for bad cases?” – The Lonely Realist
What’s your definition of free speech?
Do you subscribe to John Milton’s 1644 version, which advocated no limits? His only caveat was that both the author and publisher be named to safeguard against mischief, libel or blasphemy. Milton’s rationale was that everything should be published because good information drives out the bad. [ED NOTE: History shows Milton to have been wrong.]
Or do you believe in the European Convention on Human Rights, which provides that freedom of expression “carries with it duties and responsibilities [and therefore] may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law … in the interests of national security, territorial integrity or public safety…”? The European Union accordingly has enacted the Digital Services Act, which requires social media sites to combat disinformation, online extremism and scams.
Then there’s the United Nations Universal Declaration of Human Rights, which states that “everyone has the right to freedom of opinion and expression [including the right] to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.” Yet a majority of countries treat freedom of speech as a privilege rather than a right, one that cannot be permitted to subvert state power. Outside the U.S., there accordingly is no necessary correlation between legal and constitutional guarantees of free speech and actual practice.
Truly iconic free speech is embodied in America’s First Amendment, which guarantees Americans the freedom to express opinions without Federal, State or local government interference while placing no restrictions on non-governmental individuals or businesses. In a successful effort to promote America’s internet industry by supporting virtually limitless internet free speech, Congress in 1996 enacted Section 230 of the Communications Act of 1934, a media capstone that absolves internet providers of legal liability for information provided by third parties. The effect has been to make internet providers free-speech pipelines rather than purveyors of factual news and information. The result is that they can be held liable only for information they themselves have developed (for example, editorial content), while being permitted to moderate content they deem objectionable, whether or not such material is constitutionally protected. Section 230 accordingly shields social media companies (like TwitterX, FacebookMeta, Google and TikTok) from lawsuits based on their decisions to either allow or block user-generated content.
The legislatures and governors of Texas and Florida haven’t been fond of Section 230. In 2021, they enacted laws designed to forbid social media platforms from altering user-submitted content based on “viewpoint,” specifically to prevent social media companies from discriminating against the political views of the two States’ governments. In the Texas litigation challenging the law, the Fifth Circuit Court of Appeals denied an injunction that would have prevented enforcement of the law, while in the Florida litigation the Eleventh Circuit Court of Appeals granted such an injunction. On appeal, the Supreme Court in Moody v. NetChoice LLC held that a social media company’s use of algorithms to automatically prioritize or otherwise curate content posted by third parties reflects that platform’s editorial judgment and therefore constitutes “expressive activity” protected by the First Amendment. That is, the curation is the company’s own speech even though the content was originally created and uploaded by third parties, and social media companies accordingly have a First Amendment right to use algorithms to boost or downplay posts based on whatever political, social or economic standards they wish. The unanimous decision left in place the injunctions against enforcement of both the Florida and Texas laws.
What followed was a Third Circuit Court of Appeals decision (Tawainna Anderson v. TikTok) holding that, because Moody determined that self-selected algorithmic filters are first-party speech, Section 230 does not shield social media platforms from legal liability for making recommendations. The lawsuit arose after Tawainna Anderson’s daughter, Nylah, age 10, died attempting the “Blackout Challenge,” which encouraged users to choke themselves with household items until they blacked out and which TikTok’s algorithm had recommended to her. The District Court for the Eastern District of Pennsylvania had dismissed the complaint, ruling that TikTok was immune under Section 230. The Third Circuit reversed based on Moody’s holding that algorithms can constitute first-party speech. The Anderson decision has been derided by one commentator as “bonkers” because it “implies that any effort to curate third-party content automatically converts the third-party content into first-party content so that it no longer qualifies for Section 230 immunity” (warning that Anderson will be used as an excuse to “blow up Section 230 … and the internet as we know it”). Not so. Although Moody and Anderson will require courts to determine, on a case-by-case basis, what is “third-party speech” and what may convert it into “first-party speech,” that’s precisely what Section 230 should require. While the Third Circuit faced an emotionally charged fact pattern in characterizing TikTok’s algorithmic recommendation of the “Blackout Challenge” as first-party speech, allowing Mrs. Anderson’s lawsuit to continue does not infringe on TikTok’s constitutionally protected speech, restrict TikTok’s editorial discretion, or create a content- or viewpoint-based distinction. Will there be a proliferation of litigation? Perhaps, as already witnessed by the actions of the Florida and Texas legislatures and by Mrs. Anderson. Will Congress use these cases as an excuse to propose gutting Section 230 and/or eviscerating social media platforms? Perhaps. Politicians, after all, are politicians. Nevertheless, Moody and Anderson are difficult cases. One may well ask whether they will end up making bad law, whether Section 230 made for bad cases, and whether the nature of 21st Century technology makes for both difficult laws and difficult cases.
Prior TLR commentaries can be found here.