WASHINGTON, January 30, 2023 — As the Supreme Court prepares to hear a pair of cases about online platform liability, it is also considering a separate pair of social media lawsuits that aim to push content moderation practices in the opposite direction, raising additional questions about the First Amendment and common carrier status in an already complicated issue.
The “must-carry” laws in Texas and Florida, both aimed at limiting online content moderation, met with mixed decisions in appeals courts after being challenged by tech industry groups NetChoice and the Computer & Communications Industry Association. The outcomes will likely end up “affecting millions of Americans and their ability to express themselves online,” said Chris Marchese, counsel at NetChoice, at a Broadband Breakfast Live Online event on Wednesday.
In September, a federal appeals court in the Fifth Circuit upheld the Texas law, ruling that social media platforms can be regulated as “common carriers” and required to carry certain programming, much as cable television operators were under the Turner Broadcasting System v. FCC decisions of the 1990s.
Dueling appeals court interpretations
By contrast, the 11th Circuit judges who blocked the Florida law held that social media platforms are not common carriers. Even if they were, the judges wrote, “neither law nor logic recognizes government authority to strip an entity of its First Amendment rights merely by labeling it a common carrier.”
Whether social media platforms should be treated like common carriers is “a fair question to ask,” said Marshall Van Alstyne, Questrom chair professor at Boston University. It would be difficult to reach a broad audience online without utilizing one of the major platforms, he claimed.
However, Marchese argued that in the Texas ruling, the Fifth Circuit “to put it politely, ignored decades of binding precedent.” First Amendment protections have previously been extended to “what we today might think of as common carriers,” he said.
“I think we can safely say that Texas and Florida do not have the ability to force our private businesses to carry political speech or any type of speech that they don’t see fit,” Marchese said.
Ari Cohn, free speech counsel at TechFreedom, disagreed with the common carrier classification altogether, referencing an amicus brief arguing that “social media and common carriage are irreconcilable concepts,” filed by TechFreedom in the Texas case.
Similar ‘must-carry’ laws are gaining traction in other states
While the two state laws have the same general purpose of limiting moderation, their specific restrictions differ. The Texas law would ban large platforms from any content moderation based on “viewpoint.” Critics have argued that the term is so vague that it could prevent moderation entirely.
“In other words, if a social media service allows coverage of Russia’s invasion of Ukraine, it would also be forced to disseminate Russian propaganda about the war,” Marchese said. “So if you allow conversation on a topic, then you must allow all viewpoints on that topic, no matter how horrendous those viewpoints are.”
The Florida law “would require covered entities — including ones that you wouldn’t necessarily think of, like Etsy — to host all or nearly all content from so-called ‘journalistic enterprises,’ which is basically defined as anybody who has a small following on the internet,” Marchese explained. The law also prohibits taking down any speech from political candidates.
The impact of the two cases will likely be felt far beyond those two states, as dozens of similar content moderation bills have already been proposed in states across the country, according to Ali Sternburg, vice president of information policy for the CCIA.
But for now, both laws are blocked while the Supreme Court decides whether to hear the cases. On Jan. 23, the court asked for the U.S. solicitor general’s input on the decision.
“I think this was their chance to buy time because in effect, so many of these cases are actually asking the court to do opposite things,” Van Alstyne said.
Separate set of cases calls for more, not less, moderation
In February, the Supreme Court will hear two cases that effectively argue the reverse of the Texas and Florida laws by alleging that social media platforms are not doing enough to remove harmful content.
The cases were brought against Twitter and Google by family members of terror attack victims, who argue that the platforms knowingly allowed terrorist groups to spread harmful content and coordinate attacks. One case specifically looks at YouTube’s recommendation algorithms, asking whether Google can be held liable for not only hosting but promoting terrorist content.
Algorithms have become “the new boogeyman” in ongoing technology debates, but they essentially act like mirrors, determining content recommendations based on what users have searched for, engaged with and said about themselves, Cohn explained.
Photo: Reese Schonfeld, president of Cable News Network, and Reynelda Nuse, weekend anchorwoman for CNN, stand at one of the many sets at the broadcast center in Atlanta on May 31, 1980. The network, owned by Ted Turner, began its 24-hour-a-day news broadcasts on Sunday afternoon. (AP Photo/Joe Holloway, used with permission.)
“This has been litigated in a number of different contexts, and in pretty much all of them, the courts have said we can’t impose liability for the communication of bad ideas,” Cohn said. “You hold the person who commits the wrongful act responsible, and that’s it. There’s no such thing as negligently pointing to someone to bad information.”
A better alternative to reforming Section 230 would be implementing “more disclosures and transparency specifically around how algorithms are developed and data about enforcement,” said Jessica Dheere, director of Ranking Digital Rights.
Social media platforms have a business incentive to take down terrorist content, and Section 230 is what allows them to do so without over-moderating, Sternburg said. “No one wants to see this horrible extremist content on digital platforms, especially the services themselves.”
Holding platforms liable for all speech that they carry could have a chilling effect on speech by motivating platforms to err on the side of removing content, Van Alstyne said.
Our Broadband Breakfast Live Online events take place on Wednesday at 12 Noon ET. Watch the event on Broadband Breakfast, or REGISTER HERE to join the conversation.
Wednesday, January 25, 2023, 12 Noon ET – Section 230, Google, Twitter and the Supreme Court
The Supreme Court will soon hear two blockbuster cases involving Section 230 of the Communications Decency Act: Gonzalez v. Google on February 21, and Twitter v. Taamneh on February 22. Both of these cases ask if tech companies can be held liable for terrorist content on their platforms. Also in play: Laws in Florida and in Texas (both on hold during the course of litigation) that would limit online platforms’ ability to moderate content. In a recent brief, Google argued that denying Section 230 protections for platforms “could have devastating spillover effects.” In advance of Broadband Breakfast’s Big Tech & Speech Summit on March 9, this Broadband Breakfast Live Online event will consider Section 230 and the Supreme Court.
Panelists:
- Chris Marchese, Counsel, NetChoice
- Ari Cohn, Free Speech Counsel, TechFreedom
- Jessica Dheere, Director, Ranking Digital Rights
- Ali Sternburg, Vice President of Information Policy, Computer & Communications Industry Association
- Marshall Van Alstyne, Questrom Chair Professor, Boston University
- Drew Clark (moderator), Editor and Publisher, Broadband Breakfast
Panelist resources:
- Reynaldo Gonzalez, et al., v. Google, Supreme Court Docket 21-1333
- Twitter v. Mehier Taamneh, et al., Supreme Court Docket 21-1496
- NetChoice v. Ken Paxton, Attorney General of Texas, Supreme Court Docket 22-555
- Ashley Moody, Attorney General of Florida, et al., v. NetChoice, Supreme Court Docket 22-277
- Free Speech, Platforms & The Fake News Problem, Marshall Van Alstyne, December 31, 2021
- The Big Tech Scorecard, Ranking Digital Rights, 2022
- “It’s the Business Model: How Big Tech’s Profit Machine is Distorting the Public Sphere and Threatening Democracy” Report Series, Ranking Digital Rights, 2020
- Supreme Court Seeks Biden Administration’s Input on Texas and Florida Social Media Laws, Broadband Breakfast, January 24, 2023
- Google Defends Section 230 in Supreme Court Terror Case, Broadband Breakfast, January 13, 2023
- Changing Section 230 Would Jeopardize Startups, Broadband Breakfast, January 6, 2023
- Amid Big Tech Controversies, Section 230’s Future is Uncertain, Broadband Breakfast’s 12 Days of Broadband, December 20, 2022
- Tech Groups, Free Expression Advocates Support Twitter in Landmark Content Moderation Case, Broadband Breakfast, December 8, 2022
Chris Marchese analyzes technology-related legislative and regulatory issues at both the federal and state level. His portfolio includes monitoring and analyzing proposals to amend Section 230 of the Communications Decency Act, antitrust enforcement, and potential barriers to free speech and free enterprise on the internet. Before joining NetChoice in 2019, Chris worked as a law clerk at the U.S. Chamber Litigation Center, where he analyzed legal issues relevant to the business community, including state-court decisions that threatened traditional liability rules.
Ari Cohn is Free Speech Counsel at TechFreedom. A nationally recognized expert in First Amendment law, he was previously the Director of the Individual Rights Defense Program at the Foundation for Individual Rights in Education (FIRE), and has worked in private practice at Mayer Brown LLP and as a solo practitioner, and was an attorney with the U.S. Department of Education’s Office for Civil Rights. Ari graduated cum laude from Cornell Law School, and earned his Bachelor of Arts degree from the University of Illinois at Urbana-Champaign.
Jessica Dheere is the director of Ranking Digital Rights, and co-authored RDR’s spring 2020 report “Getting to the Source of Infodemics: It’s the Business Model.” An affiliate at the Berkman Klein Center for Internet & Society, she is also founder, former executive director, and board member of the Arab digital rights organization SMEX, and in 2019, she launched the CYRILLA Collaborative, which catalogs global digital rights law and case law. She is a graduate of Princeton University and the New School.
Ali Sternburg is Vice President of Information Policy at the Computer & Communications Industry Association, where she focuses on intermediary liability, copyright, and other areas of intellectual property. Ali joined CCIA during law school in 2011, and previously served as Senior Policy Counsel, Policy Counsel, and Legal Fellow. She is also an Inaugural Fellow at the Internet Law & Policy Foundry.
Marshall Van Alstyne (@InfoEcon) is the Questrom Chair Professor at Boston University. His work explores how IT affects firms, innovation, and society with an emphasis on business platforms. He co-authored the international best seller Platform Revolution and his research influence ranks among the top 2% of all scientists globally.
Drew Clark (moderator) is CEO of Breakfast Media LLC. He has led the Broadband Breakfast community since 2008. An early proponent of better broadband, better lives, he initially founded the Broadband Census crowdsourcing campaign for broadband data. As Editor and Publisher, Clark presides over the leading media company advocating for higher-capacity internet everywhere through topical, timely and intelligent coverage. Clark also served as head of the Partnership for a Connected Illinois, a state broadband initiative.
As with all Broadband Breakfast Live Online events, the FREE webcasts will take place at 12 Noon ET on Wednesday.
SUBSCRIBE to the Broadband Breakfast YouTube channel. That way, you will be notified when events go live. Watch on YouTube, Twitter and Facebook.
See a complete list of upcoming and past Broadband Breakfast Live Online events.
Originally posted on January 31, 2023 @ 4:28 am