On February 26, 2024, the justices of the US Supreme Court heard Moody v. NetChoice and NetChoice v. Paxton, two cases that will determine what social media companies can and cannot do when running their platforms. The cases will test how powerful social media companies are and establish where the Tech Giants rank in the hierarchy of First Amendment protections in the United States. The question likely to be answered this spring: Do states have the right to restrict how social media companies moderate and run their platforms?

Here are five facts to know about the SCOTUS hearing on Moody v. NetChoice and NetChoice v. Paxton:

 

1. TEXAS AND FLORIDA HAVE LAWS ON THE BOOKS RESTRICTING HOW SOCIAL MEDIA PLATFORMS CAN ACT 

The cases before the Court come from two states: Texas and Florida. The Texas law prohibits social media platforms from removing content based on a user’s point of view. For example, if a user says the COVID-19 vaccine is not safe, a platform cannot remove that content because of the point of view it expresses. The Florida law fines social media platforms if they ban a candidate running for political office in the state, requires platforms to be upfront about how they moderate content, and forbids platforms from removing content from a “journalistic enterprise.”

These laws were largely viewed (and cited by some lawmakers) as a response to social media platforms’ attempts to control the spread of misinformation during the COVID-19 pandemic, as well as to former President Trump’s removal from some platforms after the January 6 events at the US Capitol. These laws, passed by conservative state legislatures, were framed as protecting individuals from censorship by social media platforms. While studies suggest that conservatives have not lost engagement on social platforms, most Americans in 2020 believed that social media platforms were censoring political viewpoints. Both laws have been stayed pending a SCOTUS decision.

 

2. SOCIAL MEDIA PLATFORMS DECIDE WHAT GOES ON THEIR PLATFORMS AND WHAT DOES NOT  

When you first sign up for an account on any social media platform, you agree to terms of service that give the platform the right to moderate content as it sees fit. Social media companies use these terms to secure users’ agreement that the posts and content they share will abide by platform rules, and to help foster a safe environment for advertisers and general audiences.

However, the government’s power to regulate speech online has limits. In 1997, in Reno v. American Civil Liberties Union, SCOTUS ruled that a law regulating indecent speech online was unconstitutional. Writing for the majority, Justice John Paul Stevens held that speech on the Internet is entitled to the highest level of First Amendment protection, similar to the protection afforded to books and newspapers. By contrast, the government can—and does—enforce decency standards on broadcast television and radio.

To keep platforms as welcoming as possible to a wide audience, Tech Giants have worked to police hate speech, harassment, bad actors, and election-related disinformation.

The Texas and Florida legislatures argue that content moderation by Meta, Google, and X is a form of censorship. This sentiment was among the reasons Elon Musk bought Twitter in 2022: to make it a “platform for free speech around the globe.”

 

3. THE POTENTIAL OUTCOMES TELL A BIGGER STORY  

The SCOTUS ruling, no matter the outcome, will continue the conversation over who can moderate speech on social media platforms, and how, when, and where.

  • If content moderation is allowed to continue as is: Critics of Big Tech will continue to claim that social media companies restrict speech based on viewpoints they disagree with. This could continue to drive distrust in social content and the platforms themselves, affecting individuals and brands on the platforms.
  • If the Florida and Texas laws are overturned: The states will likely refine their laws and take another pass at challenging social media companies’ ability to moderate content on their platforms, inevitably putting the topic back into the political conversation.

 

4. SECTION 230 PROTECTS BIG TECH FROM LIABILITY—FOR NOW 

Any conversation about Big Tech and its liability around content moderation has to include Section 230. Section 230 of the Communications Decency Act of 1996 shields Internet platforms from legal liability for most user content and protects social media companies’ decisions to moderate content in certain circumstances.

Social media platforms are caught between two sides.  

Activist groups want companies to take a tougher stand against harmful content and argue that Section 230 needs to be modified because it makes it impossible to hold social media companies accountable for the harm their platforms cause.

On the other side, free speech advocates argue that social media platforms cannot claim Section 230 immunity from liability for others’ speech and then turn around and assert full First Amendment rights as speakers over that same speech.

While neither of the NetChoice cases challenges the legality of Section 230, the Supreme Court in Gonzalez v. Google (2023) and Twitter v. Taamneh (2023) declined to hold YouTube or Twitter liable for the content challenged in those cases.

 

5. CONGRESS IS STILL ANGRY 

Tech Giants have managed to do something that’s rare in today’s political environment: They’ve made both Republicans and Democrats angry. Republicans insist platforms are censoring them, while Democrats argue that platforms are letting disinformation run rampant. Both parties worry the platforms are negatively impacting children.

Social media platforms, however, are also stuck in an uncomfortable place. They’ve asked the federal government for regulation so they have clear guidelines to ease the pressure from an angry public. Congress has taken its time providing those regulations, negotiating with one another and with the Tech Giants to come up with a solution that is not only practical but enforceable.

Social media is one of the top ways to meet consumers where they are, so businesses must pay attention to how cases like these net out. Changing rules shape the types of content seen online and affect where and how companies show up.