Capitol Riots

Last week’s attack on the Capitol removed any doubts that the regulation of social media will continue to be a Congressional priority after the Trump administration ends. The president and his allies have argued for legislative changes to address alleged anti‐​conservative bias within the most popular social media companies. These complaints have often overshadowed concerns from Democratic lawmakers, who have expressed unease for years about the proliferation of extremist content online. Many of these lawmakers no doubt view last week’s tragedy as a vindication of their concerns, and they will act. We should expect renewed policy debates on Section 230 and encryption to center around political extremism.

Sadly, last week’s attack on the Capitol was not the first time Americans have witnessed domestic violence from those who have expressed extremist views online. In October 2018, a shooter murdered eleven congregants at the Tree of Life Synagogue in Pittsburgh. Shortly after, Sen. Mark Warner (D-VA) said, “I have serious concerns that the proliferation of extremist content — which has radicalized violent extremists ranging from Islamists to neo-Nazis — occurs in no small part because the largest social media platforms enjoy complete immunity for the content that their sites feature and that their algorithms promote.” The Pittsburgh shooter had been an active poster on Gab, a social media site popular with white nationalists and conspiracy theorists. He allegedly posted, “HIAS [Hebrew Immigrant Aid Society] likes to bring invaders in that kill our people. I can’t sit by and watch my people get slaughtered. Screw your optics, I’m going in.”

Sen. Warner’s mention of “complete immunity” is a reference to Section 230 of the Communications Decency Act. Under the law, interactive computer services, such as Facebook and Twitter, are not treated as the publishers of the vast majority of content posted by their users. Warner is incorrect when he describes Section 230 as providing “complete immunity”; the law includes exceptions for (among other things) content related to sex trafficking and content that violates intellectual property law. Nonetheless, Warner is correct to note that online content can radicalize social media users.

A few months after the shooting in Pittsburgh, a white supremacist murdered dozens of Muslims during a shooting at two mosques in Christchurch, New Zealand, and live-streamed the attack on Facebook. He had frequented sites such as 4chan’s /pol/ board, well known as a home for alt-right content. His descent into xenophobic ideology did not occur solely on the Internet; he went on a pilgrimage to Europe, visiting sites of Islamic terrorist attacks and meeting with identitarian leaders. After the shooting, Sen. Richard Blumenthal (D-CT) accused Facebook, YouTube, and Twitter of turning a “blind eye to hate & racism on their platforms.”

While the shootings in Pittsburgh and Christchurch prompted discussions about online extremist speech, it is safe to assume that last week’s storming of the Capitol will provoke a much larger backlash.

One of the most popular recent venues for extremist political speech and conspiracy theories was Parler. A social media network that portrayed itself as an online free speech zone, Parler became a popular destination for Trump supporters amid allegations of Silicon Valley anti-conservative bias. Parler users were among the rioters at the Capitol last week. Perhaps in anticipation of political backlash, Apple and Google removed Parler from their app stores. Amazon went further, suspending Parler’s Amazon Web Services (AWS) hosting and taking the site offline entirely. Parler is suing AWS, alleging that Amazon breached its contract. Since the AWS news, Parler has registered its domain with Epik, a domain registrar of last resort for the far right.

Since Google, Apple, and Amazon severed ties with Parler, a range of other online services have seen an influx of users. Tens of millions of people have signed up for Signal and Telegram, two encrypted messaging apps. Reporting from The New York Times reveals that at least one militia group is using Signal to organize its activities.

Parler enjoyed Section 230 protections, and political extremists have since moved to encrypted channels. Together, these facts suggest that upcoming debates on online speech and encryption will feature frequent references to extremist content.

Although much of the debate surrounding objectionable online content focuses on Section 230, we should not forget that it is the First Amendment, not Section 230, that protects a private company’s decision to remove content it finds objectionable. Section 230 is about liability, not freedom of association. Nonetheless, Section 230 remains crucial for any institution that allows users to post content on walls, message boards, review pages, and the like.

Last year, a bipartisan group of senators proposed the EARN IT Act. The bill, as its name implies, would require companies to “earn” Section 230 protections by making them contingent on adherence to a set of best practices developed by a commission aimed at tackling child sexual abuse material. Civil libertarians voiced concerns about the bill, which many consider a threat to encryption. After all, if the commission deemed the creation of a “back door” to encrypted content part of its best practices, interactive computer services would have to choose between undermining their users’ security and privacy and facing potentially crippling lawsuits.

As lawmakers see political extremists flock to end-to-end encrypted messaging services such as Signal, they may look to proposals such as the EARN IT Act and seek to pressure services into giving law enforcement access to encrypted content. It is true that criminals use encryption, but so do journalists, whistleblowers, dissidents, members of the military, Capitol Hill staff, and many others. There is no such thing as encryption that only works for the good guys. Weakening encryption may help law enforcement investigate crimes, but it will put the privacy and security of Americans at risk.

Any legislation to address political extremism will quickly run into a stubborn barrier: the First Amendment. Much of the content shared on Parler was vile, but it was not illegal. Under U.S. law, it is not illegal to say that the world would be better if the vice president were killed, or to spread conspiracy theories and racist content.

The list of speech not protected by the First Amendment is short, but it does include incitement to “imminent lawless action.” Although many commentators have described President Trump’s Jan. 6th remarks as “inciting” the mob to attack the Capitol, it is not obvious that those remarks clear the Supreme Court’s incitement test set out in Brandenburg v. Ohio (1969). Under Brandenburg, speech that is “directed to inciting or producing imminent lawless action and is likely to incite or produce such action” is not protected by the First Amendment. Legal scholars and commentators have come to different conclusions about whether Trump’s comments meet that standard.

Whether Trump’s comments are illegal could have a significant impact on online speech if Section 230 is amended. Boston University Law School’s Danielle Citron and the Brookings Institution’s Ben Wittes have proposed changing Section 230 so that it applies only to interactive computer services that take “reasonable steps to prevent or address unlawful uses of its services.”

If such an amendment were enacted, interactive computer services would have an incentive to embrace false positives, removing lawful content to ensure they don’t run afoul of Section 230. Awful but lawful speech could be stifled because sites hosting third-party content would moderate aggressively to avoid a tsunami of lawsuits that could bankrupt them.

Some might ask, “What’s wrong with services having an incentive to err on the side of caution when it comes to borderline illegal speech?” The answer is that such an environment is likely to be anti‐​competitive, with powerful market incumbents best positioned to adapt to how courts and lawmakers interpret “reasonable steps.” While concerns about online political extremism are likely to prompt lawmakers to seek carrots and sticks for social media companies, we should keep in mind that Section 230 amendments could ultimately entrench the companies so many are criticizing.

The attack on the Capitol last week will bring online political extremism to the center of debates about encryption and Section 230. Amid such debates, we should be wary of the unintended consequences of weakening encryption and amending Section 230.