The Wall Street Journal reports that Facebook has consulted with conservative individuals and groups about its content moderation. Recently I suggested that social media managers would be inclined to give stakeholders a voice (though not a veto) on content moderation policies. Some on the left were well ahead in this game, proposing that the tech companies essentially turn over content moderation of “hate speech” to them. Giving voice to the right represents a kind of rebalancing of the play of political forces.
I argued earlier that looking to stakeholders had a flaw. These groups would be highly organized representatives of their members but not of most users of a platform. The infamous “special interests” of regular politics would thus come to dominate social media content moderation, which in turn would have trouble generating legitimacy with users and the larger world outside the internet.
But another possibility exists, which might be called “pluralism.” Both left and right are organized and thus are stakeholders. Social media managers recognize and seek advice from both sides about content moderation. But the managers retain the right to decide the “content” part of content moderation. The groups are not happy, but we settle into a stable equilibrium that over time becomes a de facto speech regime for social media.
A successful pluralism is possible. A lot will depend on the managers rapidly developing the political skills necessary to the task. They may be honing such skills. Facebook’s efforts with conservatives amount to more than hiring the usual suspects to get out of a jam. Twitter apparently followed conservative advice and verified a pro-gun Parkland survivor, an issue of considerable importance to conservative web pundits, given the extent of institutional support for the March for Our Lives movement. Note that I am not saying the right will win out, but rather that the companies may be able to manage a balanced system of oversight.
But there will be challenges for this model.
Spending decisions by Congress are often seen as a case of pluralist bargaining. Better organized or more skillful groups get more from the appropriations process; those who lose out can be placated with “side payments” to make legislation possible. Overall you get spending bills that no one completely likes, but everyone can live with until the next appropriations cycle. (I know that libertarians reject this sort of pluralism, but I am not discussing what should be but rather what is, as a way of understanding private content moderation.)


Here’s the challenge. The groups trying to affect social media content moderation are not bargaining over money. The left believes much of the rhetoric of the right has no place on any platform. The right notes that most social media employees lean left and wonders whether an effort to cleanse the platforms that begins with Alex Jones will end with Charles Murray (i.e., everyone on the right). The right is thus tempted to call in a fourth player in the pluralist game of content moderation: the federal government. Managing pluralist competition and bargaining is a lot harder in a time of culture wars, as Facebook and Google have discovered.

Transparency will not help matters. The Journal article mentioned earlier states:


For users frustrated by the lack of clarity around how these companies make decisions, the added voices have made matters even murkier. Meetings between companies and their unofficial advisers are rarely publicized, and some outside groups and individuals have to sign nondisclosure agreements.

Murkiness has its value! In this case, it allows candid discussions between the tech companies and various representatives of the left and the right. Those conversations might build trust between the companies and the groups from the left and the right, and maybe even among the groups themselves. The left might stop thinking democracy is threatened online, and the right might conclude it is not eventually going to be pushed off the platforms. We might end up with rules for online speech that no one completely likes and yet are better than all realistic alternatives.


Now imagine that everything about private content moderation is made public. For some, allowing speech on a platform will amount to compromising with “hate.” (Even if a group’s leaders don’t actually believe that, they would be required to say it for political reasons.) Suppressing harassment or threats will frighten others and foster calls for government intervention to protect speech online. Our culture wars will endlessly inform the politics of content moderation. That outcome is unlikely to be the best we can hope for in an era when most speech will be online.
