Supreme Court blocks Texas social media law that tech companies warned would allow hateful content to run rampant


A person walks down the sidewalk near the U.S. Supreme Court building in Washington, D.C., February 16, 2022.
Jon Cherry | Reuters

The Supreme Court on Tuesday blocked a controversial Texas social media law from taking effect, after the tech industry and other opponents warned it could allow hateful content to run rampant online.

The decision does not rule on the merits of the law, known as HB20, but reimposes an injunction blocking it from taking effect while federal courts decide whether it can be enforced. The Supreme Court is likely to be asked to review the law's constitutionality in the future.

Five justices on the court voted to block the law for now. Justice Samuel Alito issued a written dissent from the decision, which was joined by two other conservative justices, Clarence Thomas and Neil Gorsuch. Justice Elena Kagan, a liberal, also voted to allow the law to remain in effect while a challenge to it is pending.

The law prohibits online platforms from moderating or removing content based on viewpoint. It stems from a common charge on the right that major California-based social media platforms like Facebook and Twitter are biased in their moderation strategies and disproportionately silence conservative voices. The platforms have said they apply their community guidelines evenly and that right-leaning users often rank among the highest in engagement.

Two industry groups that represent tech companies including Amazon, Facebook, Google and Twitter claimed in their emergency application to the court, “HB20 would compel platforms to disseminate all sorts of objectionable viewpoints, such as Russia’s propaganda claiming that its invasion of Ukraine is justified, ISIS propaganda claiming that extremism is warranted, neo-Nazi or KKK screeds denying or supporting the Holocaust, and encouraging children to engage in risky or unhealthy behavior like eating disorders.”

Texas Attorney General Ken Paxton, a Republican, has said this is not the case, writing in a response to the emergency application that the law does not “prohibit the platforms from removing entire categories of content.”

“So, for example,” the response says, “the platforms can decide to eliminate pornography without violating HB 20 … The platforms can also ban foreign government speech without violating HB 20, so they are not required to host Russia’s propaganda about Ukraine.”

Alito’s dissent opened by acknowledging the significance of the case for social media companies and for states that would regulate how those companies can control the content on their platforms.

“This application concerns issues of great importance that will plainly merit this Court’s review,” Alito wrote. “Social media platforms have transformed the way people communicate with each other and obtain news. At issue is a ground-breaking Texas law that addresses the power of dominant social media corporations to shape public discussion of the important issues of the day.”

Alito said he would have allowed the law to remain in effect as the case proceeds through federal courts. He emphasized he has “not formed a definitive view on the novel legal questions that arise from Texas’s decision to address the ‘changing social and economic’ conditions it perceives.”

“But precisely because of that, I am not comfortable intervening at this point in the proceedings,” he wrote. “While I can understand the Court’s apparent desire to delay enforcement of HB20 while the appeal is pending, the preliminary injunction entered by the District Court was itself a significant intrusion on state sovereignty, and Texas should not be required to seek preclearance from the federal courts before its laws go into effect.”

Where things stand now

The legislation was passed in September but blocked by a lower court, which granted a preliminary injunction keeping it from going into effect. That changed when the federal appeals court for the Fifth Circuit ruled in mid-May to stay the injunction pending a final decision on the case, meaning the law could take effect while the court deliberated on the broader dispute.

That prompted two tech industry groups, NetChoice and the Computer and Communications Industry Association (CCIA), to file an emergency petition with Alito, who handles emergency applications arising from that circuit.

NetChoice and CCIA asked the court to keep the law from going into effect, arguing that social media companies make editorial decisions about what content to distribute and display, and that the appeals court’s decision would strip them of that discretion and chill speech. The groups said the court should vacate the stay while the appeals court reviews the First Amendment issues central to the case.

“Texas’s HB 20 is a constitutional trainwreck — or, as the district court put it, an example of ‘burning the house to roast the pig,’” said Chris Marchese, counsel at NetChoice, in response to Tuesday’s ruling. “We are relieved that the First Amendment, open internet, and the users who rely on it remain protected from Texas’s unconstitutional overreach.”

“No online platform, website, or newspaper should be directed by government officials to carry certain speech,” said CCIA President Matt Schruers. “This has been a key tenet of our democracy for more than 200 years and the Supreme Court has upheld that.”

The Supreme Court’s decision has implications for other states that may consider legislation similar to that in Texas. Florida’s legislature has already passed a similar social media law, but it has so far been blocked by the courts.

Soon after the tech groups’ emergency appeal in the Texas case, the federal appeals court for the Eleventh Circuit upheld an injunction against a similar law in Florida, unanimously concluding that content moderation is protected by the Constitution. Florida’s attorney general had filed an amicus brief on behalf of her state and several others in the Texas case, urging the Supreme Court to allow the law to remain in effect and arguing that the industry had misinterpreted it and that states are within their rights to regulate businesses in this way.

Testing ground for Congress

The state laws serve as an early testing ground for the ways the U.S. Congress is considering reforming the legal liability shield tech platforms have relied on for years to moderate their services. That law, Section 230 of the Communications Decency Act, keeps online platforms from being held responsible for content users post to their services and also gives them the ability to moderate or remove posts in good faith.

The law has come under fire from both Democrats and Republicans, but for different reasons. Democrats seek to reform the law to give tech platforms more responsibility to moderate what they see as dangerous content, including misinformation. While Republicans agree certain types of content like terrorist recruitment or child sexual exploitation material should be removed, many seek to make it harder for platforms to engage in some other forms of moderation that they view as ideological censorship.

One of the authors of Section 230, former Rep. Christopher Cox, R-Calif., filed an amicus brief supporting the industry groups’ plea for the Supreme Court to reverse the stay. In the brief, Cox argues that HB20 “is in irreconcilable conflict” with Section 230, which should preempt the state law.

Still, at least one justice on the Supreme Court has already expressed interest in reviewing Section 230 itself.

In 2020, Thomas, a conservative, wrote that “in an appropriate case, we should consider whether the text of this increasingly important statute aligns with the current state of immunity enjoyed by Internet platforms.”

Last year, he suggested in a concurrence that online platforms may be “sufficiently akin to common carriers or places of accommodation to be regulated in this manner.”

–CNBC’s Dan Mangan contributed to this report.

