Commentary
Breffni Neary is a second-year evening student at Fordham University School of Law. At Fordham, Breffni is a Staff Member on the Fordham Law Voting Rights and Democracy Forum, serves on the Board of Student Advisors for first-year evening students, and is the Evening Representative and Secretary for Fordham OUTLaws. Breffni graduated summa cum laude with a bachelor’s degree in Media Studies and Early Childhood Education from the Macaulay Honors College at the City University of New York, Hunter College.
By Breffni Neary
Feb 23, 2023, 1:45 PM
On January 25, 2023, Meta’s President of Global Affairs announced that former President Donald Trump would be welcomed back to Facebook and Instagram. Mr. Trump’s ban from the social media platforms expired on January 7, 2023, two years after the January 6th insurrection at the United States Capitol. The company stated that the public safety risk created by Trump in 2021 had sufficiently receded, and that “[d]emocracy is messy and people should be able to make their voices heard.” That said, Meta specified that Trump would be subject to “heightened penalties” should he violate Meta’s updated community standards by promoting civil unrest.
The reinstatement follows a December 2022 letter to Meta from Democratic lawmakers that addressed concerns over Trump’s “conspiratorial rhetoric” about the 2020 election on the social media platform Truth Social and the likelihood that he would bring the same messages to Facebook. In their letter, the lawmakers argued that re-platforming Trump would prevent Meta from maintaining a legitimate integrity policy.
Election law expert Richard L. Hasen holds a similar perspective. Calling Meta’s decision “lamentable and ill-advised,” Professor Hasen expressed concern about the millions of Americans who believe Trump’s false claims of voter fraud and the violence some have committed against the democratic election process. Professor Hasen also stated that Meta’s “heightened penalties” policy does not go far enough to protect free and fair elections. For instance, Professor Hasen cites January 2023 reporting detailing that Trump “is planning to make his return” to mainstream social media platforms “with posts about ‘rigged elections.’” Moreover, Meta’s own Oversight Board, which does not determine company policy, has called for Meta to be clearer about such policies and its definitions of violations.
In a similar vein, Princeton Politics Professor Jan-Werner Muller has stated that while the decision to reinstate Trump was wrong, it is not the end of democracy as we know it. Professor Muller posits that “democracy is based on the notion that no one is irredeemable,” and that people deserve second chances, even when granting that second chance is difficult. In cases like former President Trump’s, Professor Muller contends that those who previously “engaged in anti-democratic actions” must have the opportunity to convince skeptics “that they have changed their ways.” In Professor Muller’s view, the more significant issue is Meta’s hypocrisy and its focus on profits, seemingly placing its engagement numbers above creating a safe environment.
The House Select Committee to Investigate the January 6th Attack on the United States Capitol also criticized Meta in its leaked draft report. The Committee took the view that social media companies contributed to the January 6th riot by failing to cull the rampant misinformation being spread about the 2020 election. Referring to Meta specifically, the Committee said, “Facebook did not fail to grapple with election delegitimization after the election so much as it did not even try.” The Committee also noted that fear of reprisal and claims of censorship from conservatives influenced Meta’s decision-making.
This pressure from conservative lawmakers may explain why Meta’s new guidelines focus on future elections instead of prior ones. According to a Meta spokesperson, Trump will be allowed to continue attacking the 2020 election without facing consequences. Ultimately, as some legal scholars contend, election denialism may undermine the legitimacy of future elections. Trump, however, is presumably less capable of influencing future elections through Facebook. Meta’s policies changed during his ban, and the platform no longer maintains the same ad-targeting features that existed during Trump’s previous campaigns. As a result, political advertisers have shifted their focus to streaming services, as Facebook no longer offers a strong return on investment.
Regardless, Trump has yet to post on Facebook and Instagram—or Twitter, which lifted his ban in November 2022. For now, Trump continues to frequently post on Truth Social. But Trump’s return to mainstream social media may come soon. According to a Pew Research Center report, Trump’s Truth Social posts do not have the same impact as they would on Facebook—as only two percent of Americans check the site for news, compared to the thirty-one percent of Americans who use Facebook for news.
While the political world awaits Trump’s return to mainstream social media, several other legal concerns at the nexus of social media and free speech may affect democracy leading up to the 2024 election. Of chief concern is the House Republicans’ crusade against “Big Tech.” In a December 2022 letter to Mark Zuckerberg, U.S. Representative Jim Jordan (R-OH) claimed that Meta and other Big Tech companies were colluding with the Biden Administration to suppress freedom of speech online. Notably, however, in the 117th Congress, cracking down on Big Tech was a rare bipartisan desire—even if the impetus for the proposed changes differed.[1]
Section 230 of the Communications Decency Act[2] is especially relevant, as it allows social media companies to moderate their platforms as they see fit while largely escaping liability for their users’ posts.[3] On February 21 and 22, the Supreme Court heard two cases, Gonzalez v. Google LLC[4] and Twitter, Inc. v. Taamneh,[5] concerning the scope of protection provided by Section 230. The companion cases are specific to platforms’ responsibility for the removal of terrorist content. Still, any decision on online safety standards and post removal based on viewpoint could very well be relevant to Trump and his impact on democracy. At the same time, faced with both “economic headwinds and political and legal pressure,” Big Tech companies are reducing their teams of policy experts handling misinformation. David Brody, the Managing Attorney for the Digital Justice Initiative at the Lawyers’ Committee for Civil Rights Under Law, sums up the complexity of free speech: “[t]here’s a freedom to speak freely[] . . . [b]ut there’s also the freedom to be free from harassment, to be free from discrimination.” Accordingly, while frustration builds on both sides of the aisle in Congress, the Supreme Court will likely have the first say on Section 230.
References:
[1] For example, Senators Tom Cotton (R-AR) and Amy Klobuchar (D-MN) introduced the Platform Competition and Opportunity Act, S. 3197, 117th Cong. (2021). The legislation, among others introduced, was explicitly directed at suppressing Big Tech mergers and acquisitions.
[2] 47 U.S.C. § 230.
[3] § 230(c).
[4] 2 F.4th 871 (9th Cir. 2021), cert. granted, 214 L. Ed. 2d 12, 143 S. Ct. 80 (2022).
[5] 214 L. Ed. 2d 12, 143 S. Ct. 81 (2022).