Facebook has set out a draft charter for an “Oversight Board for Content Decisions.” This document represents the most concrete step yet toward the “Supreme Court” for content moderation suggested by Mark Zuckerberg. The draft charter outlines the board itself and poses several related questions for interested parties. I will offer thoughts on those questions in upcoming blog posts. I begin here not with a question posed by Facebook, but with two values I think get too little attention in the charter: legitimacy and deliberation.
The draft charter mentions “legitimacy” once: “The public legitimacy of the board will grow from the transparent, independent decisions that the board makes.” Legitimacy is commonly defined as conforming to law or existing rules (see, for example, the American Heritage Dictionary). But Facebook is clearly thinking more broadly, and they are wise to do so. Those who remove content from Facebook (and the board that judges the propriety of those removals) have considerable power. The authors of banned content acquire at least a certain stigma and may incur broader social censure. Facebook has every legal right to remove the content, but they also need public acceptance of this power to impose costs on others. Absent that acceptance, the oversight board might become just another site of irreconcilable political differences or, worse, “the removed” will call in the government to make things right. The oversight board should achieve many goals, but its architects might think first about its legitimacy.
The term “deliberation” also gets one mention in the draft charter: “Members will commit themselves not to reveal private deliberations except as expressed in official board explanations and decisions.” So there will be deliberations, and they will not be public (more on this in later posts about transparency). The case for deliberation is strengthened by considering its absence.
The draft could have said “members will commit themselves not to reveal private voting….” In a pure stakeholder model of the board, members would accurately represent the Facebook community (that is, they would be diverse). Members would consider the case before them and vote to advance the interests of those they represent. No deliberation would be necessary, though talk among members might be permitted. And, of course, such voting could be both transparent and independent. But the decision would be a mere weighing of interests rather than a consideration of reasons.
Why would those disappointed by the decision nonetheless consider it legitimate? Facebook could say to the disappointed: The board has final say on appeals of content moderation (after all, it’s in the terms of service you signed), and this is their decision. Logically, that deduction might do the trick, but I think a somewhat different process might increase the legitimacy of the content moderation in the eyes of the disappointed.
Consider a deliberative model for the board. A subset of the board meets and discusses the case before them. Arguments are offered, values probed, and conclusions reached. The votes on the case would then be informed by the prior deliberation. Members would represent the larger community in its many facets, but the path from representation to voting would include a collective giving and taking of reasons. That difference, I think, makes the deliberative model more likely to gain legitimacy. Simply losing a vote can seem like an expression of power. Losing an argument is more acceptable, and later the argument might be renewed with a different outcome.
The importance of deliberation implicates other values in the charter, especially independence. The draft places great weight on the independence of the board from Facebook. That emphasis is understandable. Critics have said Facebook will turn a blind eye to dangerous speech because it attracts attention and, thereby, advertising dollars (Mark Zuckerberg has rebutted this criticism). The emphasis on independence from the business contains a truth: a board dedicated to maximizing Facebook’s quarterly returns might have a hard time gaining legitimacy. But the board’s deliberations should not be completely independent of Facebook. Facebook needs to make money to exist. Doing great harm to Facebook as a business cannot be part of the remit of the board.
Here, as so often in life, James Madison has something valuable to add. In Federalist 10, Madison argues that political institutions should be designed to protect the rights of citizens and to advance “the permanent and aggregate interests of the community.” Facebook is a community. The Community Standards (and the board’s interpretation of them) should serve the permanent and aggregate interests of that community. The prosperity of the company (though perhaps not at every margin) is surely in the interest of the community. The interests represented on the board are a starting point for understanding the interests of that community, but by themselves they are not enough. Deliberation might be the bridge between those interests and the “permanent and aggregate interests of the community.” Looked at that way, most users would have reason to believe in the legitimacy of a deliberative board as opposed to a board of stakeholders.
Facebook’s draft charter evinces hard work and thought. But it could benefit from more focus on the conditions for the legitimacy of the oversight board. Deliberation (rather than simple interest representation) is part of the answer to the legitimacy question. As deliberations go forward, the charter’s framers might give more attention to how institutional design can foster deliberation.