
Court cases test opposing views of online content moderation

Do social media companies do too much content moderation? Or too little?

Judiciary Committee member Amy Klobuchar, D-Minn., was one of four senators who proposed legislation in the last Congress that would require social media platforms to provide data to researchers on the companies’ decision-making processes. The bill didn't advance out of Judiciary. (Tom Williams/CQ Roll Call file photo)

Four court cases involving social media companies and online platforms will test the Supreme Court’s views on two competing claims: that the companies either engage in too much content moderation or do too little.

The high court has agreed to hear arguments in two of the cases and is weighing whether to take up the other two. Whatever its decisions, the outcomes could dramatically change how social media companies and online platforms operate, especially in light of a stalemate in Congress, where lawmakers of both parties want to amend online content moderation laws but can’t agree on how to do so. 

The court is hearing arguments this term in two cases asking whether social media companies can be held liable for third-party content on their websites. Those decisions could determine whether the companies can still count on a liability shield Congress provided in 1996 to nurture the fledgling internet. The cases test congressional intent in that law. 

The court has yet to decide whether to take two other cases involving Florida and Texas laws instructing social media companies not to remove posts based on political views. The justices’ request this month that the Biden administration weigh in has many experts predicting the court will take them up. Those cases test the limits of the First Amendment.

“I think it’s relevant to note that one cluster of cases suggests that companies are doing too little, and another that they are doing too much,” said Matt Schruers, president of the Computer and Communications Industry Association.

The stakes in the cases are high: The media companies could have to change the functionality that makes the platforms so easy to use; lawmakers could face new pressure to update legislation for an industry that is now entrenched, wealthy and divisive; and users could discover sites that no longer offer the content that underlies their popularity.

High court rulings that favor the companies in the content liability cases would affirm protections provided by Section 230 of the 1996 law against liability for content added by users, and do so for an industry that is many times larger and more powerful than it was more than 25 years ago. Both cases involve content related to terrorist attacks outside the U.S.

But rulings that the companies are liable could effectively mean they shut down discussions on topics such as racism, abortion, gun ownership and the Holocaust, some of the most hotly debated issues in U.S. politics, according to legal experts. 

Rulings on the Texas and Florida laws will tell the social media companies how far they can rely on First Amendment protection to restrict content. Because two appellate courts have come to different conclusions over the state laws, even a Supreme Court decision not to take the cases would present the companies with a challenge.

“The last word on the constitutional question, on what the First Amendment protects, that’s going to come from the court because the court has the power to interpret the Constitution,” said Caitlin Vogus, deputy director of the free expression project at the Center for Democracy and Technology, a group in Washington and Brussels that uses technology policy to promote democracy. “But what Section 230 should protect and all of that … the last word does have to come from Congress.”

Section 230

The court is hearing arguments this term in Gonzalez v. Google and Twitter v. Taamneh, cases that test the companies’ responsibility for terrorism-related content on their sites. Congress said in 1996 that online services shouldn’t be treated as publishers, thus shielding them from liability for content posted by users.

Google cites Section 230 of that law in a case that arose when the family of Nohemi Gonzalez, a 23-year-old student who died in a terrorist attack in Paris in 2015, alleged that YouTube’s recommendation system led users to terrorist recruitment videos and the company’s parent, Google, is therefore responsible for her death. 

The family contends that YouTube’s recommendation system is separate from its publishing function and is not covered by Section 230.

“We’re really worried about what the impact could be on internet users if the court were to say that Section 230 doesn’t apply to claims based on the recommendation of content,” said Vogus. “We could see platforms imposing categorical bans on certain topics or over-removing user speech, because of concerns about potential liability.” 

For Google search users, recommendations are at the heart of the product. Type in a search term and Google, by providing a list of links related to the term, is using its algorithm to make recommendations. Liability for harm resulting from those links could pose a major business risk to the company and could lead some searches to yield nothing at all.

“Because there is so much content available online, providers must make choices about what content to show users and what order to put it in,” Vogus said. “If the court holds that ‘recommendations’ are not shielded by Section 230, the choices that every provider has to make about how to display content to users could mean [the companies] lose immunity.”

“Some providers may prohibit all content related to terrorism for fear of liability, including news reports, documentation of terrorists’ human rights violations, or even anti-indoctrination materials,” she said. 

The examples are not just hypothetical, Vogus said. 

After Congress amended Section 230 in 2018 to make it easier to sue online providers for certain claims relating to sex trafficking, some platforms prohibited “content related to sex altogether, including things like sex education or sexual health information,” she said.  

CDT has filed a brief telling the court that it’s hard to separate online content from recommendations and urging it to hold that Section 230 protections apply to recommendation systems.

The Twitter case also involves liability for content, although it stems from an anti-terrorism law. The family of a Jordanian citizen shot and killed in a 2017 terror attack in Istanbul sued under that law, asserting that Twitter aided and abetted terrorism by allowing users with links to terrorist groups to use its site.

The 9th U.S. Circuit Court of Appeals denied Twitter’s motion to dismiss the case on grounds that it was not liable for user-generated content protected by Section 230. The same court said Google was protected in the Gonzalez case, prompting that family’s appeal to the Supreme Court.

The Section 230 protection is controversial among lawmakers of both parties. 

At the height of the COVID-19 pandemic, Sens. Amy Klobuchar, D-Minn., and Ben Ray Luján, D-N.M., introduced legislation that would have created an exception to the broad Section 230 immunity by removing such protections for health-related misinformation promoted by social media algorithms. 

A provision of the 1996 law intended to promote online speech and allow online services to grow “now distorts legal incentives for platforms to respond to digital misinformation on critical health issues, like COVID-19, and leaves people who suffer harm with little to no recourse,” Klobuchar and Luján said in a statement at the time.

And Sen. Josh Hawley, R-Mo., filed a brief with the court in the Google case saying the companies hide behind Section 230 when they distribute what he calls illegal content.

Florida and Texas

The Florida and Texas laws pose questions about the limits of the First Amendment. The states passed the laws after social media platforms banned former President Donald Trump following the Jan. 6, 2021, attack on the Capitol. The laws also call on social media platforms to explain decisions to take down posts or ban some individuals.

The Computer and Communications Industry Association, representing Amazon, Google, Meta, Twitter and other tech companies, sued to block the laws, but two appellate courts reached opposite conclusions, resulting in appeals to the Supreme Court. In this case, high court inaction — not taking the case — could itself have consequences for the companies.

“It would sow confusion across the nation,” Schruers said. “There were 200 state laws introduced in the past session that all seek more government control over online speech, [and] we would see a proliferation of laws across the union with some requiring digital services to remove harmful content and others compelling them to leave it up.”

The cases rest on whether the government may “interpose itself on private digital services” and “compel speech,” he said. “There are few clearer examples of a situation where the Supreme Court ought to weigh in than here.”

Schruers said the Supreme Court set a precedent for the cases in a 1974 ruling, Miami Herald Publishing Co. v. Tornillo, on a Florida law compelling a newspaper to print a political candidate’s response to critical editorials. The court found the law inconsistent with the First Amendment.

“You and I have that right, social media companies have that right … Home Depot has that right,” he said, adding that the court ought to apply the same standard in the current cases.

Schruers said the online platforms wouldn’t be shutting out conservatives even if they prevail in the cases against the Texas and Florida laws.

“Conservative voices are among the most effective at cultivating online audiences,” he said, adding that the court may discredit claims about bias but that wouldn’t end them. “They’re unlikely to fade away for as long as they make for good fundraising.”

But the cases could also result in narrow decisions that land on a middle ground between outright victory and defeat for either side.

Ramya Krishnan, a staff attorney at the Knight First Amendment Institute at Columbia University in New York, said the court could find the two state laws unconstitutional but still leave in place the provisions that call for greater transparency into social media platforms’ decision-making, an outcome she said would foster public discourse.

“Increasing transparency into the platforms actually serves an important First Amendment aim,” Krishnan said. “Transparency can further that goal.” 

If the court hears the cases, the arguments might also shed light on how far transparency requirements could go and what details companies must make public without burdening their decision-making, she said.

Congress

Any insight into that decision-making might also aid legislative efforts by lawmakers, who are hampered in part because they don’t know how the platforms operate.

Klobuchar and Sens. Chris Coons, D-Del., Bill Cassidy, R-La., and Rob Portman, R-Ohio, proposed legislation in the last Congress that would require social media platforms to provide data to researchers on the companies’ decision-making processes, including details of their algorithms. The bill did not advance out of the Senate Judiciary Committee.

“It is only if lawmakers understand what is happening on the platforms that they can carefully draft laws to address the problems the platforms are contributing to,” Krishnan said.
