Supreme Court Declines to Rule on Social Media Laws in Florida and Texas

The Supreme Court sidestepped a definitive resolution on Monday in a pair of cases challenging state laws aimed at curbing the power of social media companies to moderate content. The ruling left in limbo an effort by Republicans who had promoted the legislation as a remedy to what they say is a bias against conservatives.

It was the most recent instance of the Supreme Court considering — and then dodging — a major decision on the parameters of speech on social media platforms.

The state laws differ in their details. Florida’s prevents the platforms from permanently barring candidates for political office in the state, while Texas’ prohibits the platforms from removing any content based on a user’s viewpoint.

The justices unanimously agreed to return the cases to lower courts for analysis. Justice Elena Kagan, writing for the majority, noted that neither lower appeals court had properly analyzed the First Amendment challenges to the Florida and Texas laws.

“In sum, there is much work to do below on both these cases,” Justice Kagan wrote, adding, “But that work must be done consistent with the First Amendment, which does not go on leave when social media are involved.”

Under the narrow ruling, the state laws remain intact, but lower court injunctions also remain in place, meaning both laws continue to be paused.

Although the justices voted 9-to-0 to return the cases to the lower courts, they splintered on the reasoning, with several writing separate concurrences to lay out their positions. Justice Kagan was joined by Chief Justice John G. Roberts Jr., along with Justices Sonia Sotomayor, Brett M. Kavanaugh and Amy Coney Barrett. Justice Ketanji Brown Jackson joined in part.

In a separate concurring opinion, Justice Barrett hinted at how lower courts might analyze the cases.

Justice Barrett wrote that the federal appeals court that examined the Florida case showed an “understanding of the First Amendment’s protection of editorial discretion” that “was generally correct,” while the appeals court that examined the Texas case did not.

A unanimous three-judge panel of the U.S. Court of Appeals for the 11th Circuit had largely upheld a preliminary injunction that temporarily blocked Florida’s law.

A divided three-judge panel of the Fifth Circuit, by contrast, had reversed a lower court’s order blocking the Texas law.

That the justices avoided making any major statement on the issue allowed both sides to declare victory.

Chris Marchese, the director of the litigation center at NetChoice, one of the trade groups that challenged the laws, said in a statement that the “Supreme Court agreed with all our First Amendment arguments.”

Ashley Moody, the Florida attorney general, suggested on social media that the outcome was to the state’s advantage. “While there are aspects of the decision we disagree with, we look forward to continuing to defend state law,” she said.

The Biden administration had supported the social media companies in both cases, Moody v. NetChoice, No. 22-277, and NetChoice v. Paxton, No. 22-555.

In the majority opinion, Justice Kagan remarked on how quickly the internet has evolved. Less than 30 years ago, she wrote, the justices still felt the need to define the internet in their opinions, describing it at the time as “an international network of interconnected computers.”

Today, she wrote, “Facebook and YouTube alone have over two billion users each.”

She described a flood of content that has prompted major platforms to “cull and organize” posts. The platforms sometimes remove messages entirely or add warnings or labels, often in accordance with community standards and guidelines that help the sites determine how to treat a variety of content.

Because such sites can “create unparalleled opportunities and unprecedented dangers,” she added, it is no surprise that lawmakers and government agencies struggle with how and whether to regulate them.

Government entities are typically better positioned to respond to these challenges, Justice Kagan noted, but courts still play an integral role with respect to the platforms themselves, “in protecting those entities’ rights of speech, as courts have historically protected traditional media’s rights.”

The laws at issue in these cases, statutes enacted in 2021 by Florida and Texas lawmakers, differ in what companies they cover and what activities they limit. However, Justice Kagan wrote, both restrict platforms’ choices about what user-generated content will be shown to the public. Both laws also require platforms to give reasons for their choices in moderating content.

Justice Kagan then provided a clue about how a majority of the justices may be thinking about how to apply the First Amendment to these types of laws.

Although it was too early for the court to come to conclusions in the cases, she wrote, the underlying record suggested that some platforms, at least some of the time, were engaged in expression.

“In constructing certain feeds, those platforms make choices about what third-party speech to display and how to display it,” Justice Kagan wrote. “They include and exclude, organize and prioritize — and in making millions of those decisions each day, produce their own distinctive compilations of expression.”

She added that although social media is a newer format, “the essence” is familiar. She analogized the platforms to traditional publishers and editors who select and shape others’ expressions.

“We have repeatedly held that laws curtailing their editorial choices must meet the First Amendment’s requirements,” Justice Kagan wrote. “The principle does not change because the curated compilation has gone from the physical to the virtual world.”

So far, however, the justices have avoided squarely defining the platforms’ responsibility for the content they host, even as they have continued to acknowledge the networks’ enormous power and reach.

Last year, the justices declined to hold technology platforms liable for user content in a pair of rulings — one involving Google and the other involving Twitter. Neither decision clarified the breadth of the law that protects the platforms from liability for those posts, Section 230 of the Communications Decency Act.

The Florida and Texas laws at issue on Monday were prompted in part by the decisions of some platforms to bar President Donald J. Trump after the Jan. 6, 2021, attack on the Capitol.

Supporters of the laws said they were an attempt to combat what they called Silicon Valley censorship. The laws, they added, fostered free speech, giving the public access to all points of view.

Opponents said the laws trampled on the platforms’ own First Amendment rights and would turn them into cesspools of filth, hate and lies.

A ruling that tech platforms have no editorial discretion to decide which posts to allow would have exposed users to a greater variety of viewpoints but almost certainly would also have amplified the ugliest aspects of the digital age, including hate speech and disinformation.

The two trade associations challenging the state laws — NetChoice and the Computer & Communications Industry Association — said that the actions that the Court of Appeals for the Fifth Circuit called censorship in upholding the Texas law were editorial judgments protected by the First Amendment.

The groups said that social media companies were entitled to the same constitutional protections enjoyed by newspapers, which are generally free to publish without government interference.

A majority of the justices were sharply critical of the Fifth Circuit’s decision to reverse a lower court’s order that had blocked the Texas law.

Justice Kagan wrote that the Texas law prevented social media platforms from using content-moderation standards “to remove, alter, organize, prioritize or disclaim posts in its news feed.” That legislation, she wrote, blocks precisely the types of editorial judgments that the Supreme Court has previously held to be protected by the First Amendment.

She said that particular application of the law was “unlikely to withstand First Amendment scrutiny.”

But in concurring opinions, Justices Jackson and Barrett acknowledged the difficulty of making sweeping pronouncements about how free speech protections should work online.

Justice Barrett offered a hypothetical: A social media platform could be protected by the First Amendment if it set rules for what content is allowed on its feed, and then used an algorithm to automate its enforcement of those policies. But she said it could be less clear that the First Amendment protected software that determined, on its own, what content was harmful.

“And what about A.I., which is rapidly evolving?” she wrote. “What if a platform’s owners hand the reins to an A.I. tool and ask it simply to remove ‘hateful’ content?”

Olivier Sylvain, a law professor at Fordham University, said that Monday’s ruling could open the door for the court or regulators to consider those more complicated issues. That could include how to handle commercial speech online, like platforms that amplify discriminatory advertising, rather than the political viewpoints at the heart of Monday’s ruling.

“Texas and Florida were taken by an ideological political spat that social media companies are biased against conservative viewpoints,” he said. “I’m hopeful, at least, that this has cabined that stuff out and we can start thinking about all the many questions that are far more interesting.”


