Last week, conservatives once again cried “bias” after Facebook banned a slate of popular fringe pundits and conspiracy theorists. Meanwhile, the week’s most important content moderation story went largely unnoticed. Had conservatives paid more attention to the Wisconsin Supreme Court’s ruling in Daniel v. Armslist, they might feel differently about the utility of platform intermediary liability protections like Section 230 of the Communications Decency Act, the bedrock indemnity that shields internet platforms from liability for their users’ behavior. While usually understood merely as a shield for social media firms, it protects a wide variety of services that rely on user-generated content, such as classified advertising sites and individual websites’ comments sections.


Armslist is essentially a digital classified ads section for guns. Daniel, the daughter of a shooting victim, sought to hold Armslist liable for her mother’s murderer’s use of the platform. Her suit alleged that certain features of Armslist’s site were negligently designed, without regard for how they might be used by persons prohibited from buying firearms. According to the complaint, Armslist should have anticipated that its lack of user registration requirements, and its search option for in-state sales requiring no background check, would be misused by people barred from possessing firearms. But in most states, it is perfectly legal to sell a firearm privately to your neighbor without using the services of a federally licensed dealer. Imposing broad liability on services that help coordinate this lawful activity would burden, and perhaps preclude, Americans’ exercise of their right to buy and sell firearms.


The suit against Armslist represents a growing trend of attempts to circumvent CDA 230’s protections by suing platforms over features that enable certain kinds of harmful user behavior, rather than suing over the behavior itself. Snapchat was recently sued over its speed filter by plaintiffs who were struck by a Snapchat user driving at over a hundred miles an hour; the driver was speeding in pursuit of a high reading on the filter, hoping to impress her friends with her foolishness. Thankfully, in both the Snapchat case and now in Daniel v. Armslist, judges have understood that “no matter how artfully pled,” these suits attempt to hold platforms responsible for user behavior and are therefore precluded by CDA 230.


Digital classified ad services and speed filters are neutral tools. Chief Justice Patience D. Roggensack writes in Armslist: “All of these features can be used for lawful purposes, so the CDA immunizes interactive computer service providers from liability when these neutral tools are used for unlawful purposes.” So long as a given feature can be used lawfully, service providers cannot be held liable for its unlawful use: doing so would unreasonably burden the tool’s lawful users. That Uber can be used to summon a getaway vehicle after a heist does not make Uber a getaway car-hailing service. Had Armslist been decided differently, or were CDA 230’s protections limited or eliminated, these tools would not be commercially viable.


Conservative claims about social media bias may lead to legislative revisions to CDA 230. Those revisions could easily strip the protections that CDA 230 currently affords businesses like Armslist; indeed, some on the left would see removing such protections as a goal of revising the statute. The Armslist decision shows that the resulting harm to the Second Amendment would be real and lasting. Are the speculative gains of seizing control of Facebook’s content moderation really worth the risks to the Second Amendment? Won’t this be a case of unintended consequences of the sort conservatives used to warn us about so many, many years ago?