Can Florida and Texas Regulate Content Moderation and User Removal from Social Media Platforms?

Francesca Huth

Large social media platforms have historically had significant freedom to regulate the content disseminated on their sites. First Amendment lawsuits against social media companies are “routinely dismissed” because social media companies are private actors and their “platforms are not public forums.”[1] As a result, social media platforms can choose to filter out comments they categorize as false, harmful, or spam. In May 2021, however, Florida passed a law severely restraining the editorial discretion of social media platforms.

Florida law SB 7072 seeks to control how social media platforms moderate their content and to prohibit them from de-platforming political candidates.[2] The law also contains new transparency requirements that force social media platforms to disclose the specific mechanisms they use to identify offensive and spam content.[3] Not only does this law undermine social media platforms’ right to exclude certain content, but it also violates the “privacy interest that [they] have in their editorial source data.”[4]

The law was seemingly passed as a response to platforms like Twitter and Facebook banning former President Trump following the January 6th attack on the United States Capitol. The law ostensibly makes social media sites more open spaces where people with varying viewpoints can speak freely. In effect, though, it makes places like Facebook and Twitter more susceptible to vitriolic speech and false, harmful rhetoric. These social media platforms should not be forced to display the content of political extremists spewing hate speech.

NetChoice, a trade association of technology and internet-based businesses, sued Florida in federal court, arguing that SB 7072 violates social media platforms’ First Amendment right to select and filter the content hosted on their platforms.[5] The district court ruled in favor of NetChoice, and the Eleventh Circuit affirmed in part and reversed in part. While the Eleventh Circuit agreed with NetChoice that Florida cannot impose these types of content moderation restrictions, it upheld the law’s disclosure provisions.[6] Florida then appealed to the Supreme Court.

In September 2021, Texas enacted HB 20, which closely resembles Florida’s law by prohibiting social media platforms from de-platforming individuals based on their viewpoints.[7] The Fifth Circuit upheld Texas’ law, stating that it disagreed with the Eleventh Circuit’s assertion that the First Amendment has an independent category protecting “editorial discretion.”[8] The Fifth Circuit then went further, stating that even if there were such a category, it would “disagree with the Eleventh Circuit’s conclusion that the Platforms’ censorship is akin to the ‘editorial judgment’ that’s been mentioned in Supreme Court doctrine.”[9]

The circuit split between the Fifth and Eleventh Circuits increases the likelihood that the Supreme Court will grant certiorari in Florida’s case. The Eleventh Circuit cited a large body of case law to support its position that private entities have a right to exercise “editorial judgment” in how they disseminate speech. If the Supreme Court decides to rule on these matters, it will be an interesting test of the Court’s commitment to First Amendment rights when those rights are employed in ways that are at odds with the political leanings of a majority of the justices.


[1] Brett M. Pinkus, The Limits of Free Speech in Social Media, Accessible Law (Apr. 26, 2021), [] [].

[2] SB 7072, 2021 Leg. Sess. (Fla. 2021).

[3] Id.

[4] Clark Neily et al., NetChoice v. Attorney General, State of Florida, CATO Institute (Oct. 22, 2022), [] [].

[5] NetChoice, L.L.C. v. Att’y Gen., Fla., 34 F.4th 1196, 1208 (11th Cir. 2022).

[6] Clark Neily et al., supra note 4.

[7] Brian Fung, Federal Appeals Court Pauses Texas Social Media Law’s Enforcement Amid Looming Supreme Court Petition, CNN (Oct. 13, 2022), [] [].

[8] NetChoice, L.L.C. v. Paxton, No. 21-51178, at 82 (5th Cir. Sept. 16, 2022).

[9] Id. at 12.