If you read or post in the comments sections accompanying your online news, the U.S. Supreme Court may soon give you a lot to comment about but fewer places to do it. The same diminishing access may apply if you debate with friends and family on Facebook, enjoy sharing memes on X (formerly known as Twitter), leave reviews on Yelp or even list products for sale on Etsy.

Many of the everyday online platforms we use to connect with friends and family, find information or run a business also host user-generated content. As a result, most platforms engage in some degree of content moderation, setting rules for what is or isn't allowed. But state laws in Texas and Florida could change how platforms handle such content, making them less likely to offer these opportunities for expression, a shift that would affect far more than just social media.

The good news for those who enjoy the opportunities user-generated content creates is that the Supreme Court is considering challenges to these laws and the possibility that they violate the First Amendment. The court on Monday heard oral arguments in NetChoice v. Paxton and Moody v. NetChoice, each of which tests the constitutionality of one of the two states' social media laws.

The law in Texas prohibits social media platforms with more than 50 million active U.S. users from censoring “a user, a user’s expression, or a user’s ability to receive the expression of another person” and would prevent online services from engaging in content moderation except in a few specific instances.

Florida's law bars large social media platforms, defined as those with "annual gross revenues in excess of $100 million" and "at least 100 million monthly individual platform participants globally," from willfully de-platforming a political candidate. Along with other provisions on public contracts, the law aims to categorize big social media platforms as "common carriers."

Similar social media laws, different legal conclusions

These cases arrived at the Supreme Court after lower courts came to different conclusions on the constitutionality of the two similar laws.

In Moody v. NetChoice, the 11th Circuit ruled that Florida’s provisions on content moderation and disclosure requirements limit platforms’ ability to exercise editorial judgment, therefore triggering First Amendment scrutiny.

Meanwhile, in NetChoice v. Paxton, the 5th Circuit ruled that large social media platforms do not have a First Amendment right to “censor” speech, finding a sufficient state interest to regulate the platforms’ conduct.

Now, the Supreme Court is considering the underlying question: Do the content moderation restrictions and transparency requirements placed by the Texas and Florida laws each violate the First Amendment rights of the large social media platforms these laws seek to regulate?

Though a great deal of emphasis will be placed on the platforms' First Amendment rights, the rulings on those rights will also shape the user experience.

It may be tempting to think that requiring platforms to carry certain users or content would help ensure that free speech continues to flourish, or that minority viewpoints are not crowded out. But these laws would create new concerns not only for platforms but also for users.

For example, under the Texas law, a platform serving the Jewish community that fits the law's provisions would not be able to remove lawful but awful antisemitic content. Under the Florida law, anyone could register to run for, say, local dogcatcher and proceed to share videos of animal cruelty, and platforms would be unable to respond.

What about a LinkedIn job fair? Or views on Etsy or Uber?

Monday's Supreme Court hearing raised important questions about what these laws would mean not only for traditional social media but also for a LinkedIn virtual job fair, or for Etsy if it wanted to limit discussion of politics on its platform.

As Justice Sonia Sotomayor noted to Florida’s solicitor general, “It seems like your law is covering just about every social media platform on the internet.”

Because these laws apply to the content moderation that user-generated content requires, they won't just affect social media but also review sites, comment sections on articles and even online listings. As the Supreme Court noted, while the Texas law is somewhat narrower, the Florida law could reach even platforms like Uber.

Some platforms might just choose not to carry user-generated content anymore, given the risk. Others might eliminate the opportunity to discuss certain topics or offer certain products that are perceived as having a viewpoint (although the internet often illustrates that almost anything can become a vehemently held belief and viewpoint).

While the Texas law is narrower in which platforms it covers, it remains vague about what constitutes a viewpoint, in a way that could chill any number of internet debates, serious and fanciful alike, on everything from the upcoming election to Taylor Swift's impact on the NFL.

While some will try to make the NetChoice cases only about “Big Tech” or social media, the reality is that the Supreme Court’s decision will impact not just the companies but also everyday internet users.