Recently I argued that Facebook needed to focus more on representation in creating its oversight board for appeals from its content moderation. If the Board is to be legitimate, it must be representative as well as law-like and independent. How might it be representative?


First: who should be represented on the Board? Facebook users create much of the content that generates the data used to improve both advertising and the company’s bottom line. They are the “community” often invoked by Facebook’s leaders. The Board should, among other things, represent users.


Initially Facebook officials will select Board members in part because they represent users. This means Board members will vary along familiar dimensions: race, gender, culture and so on. Of course, it will also mean the Board will be multi-national, since Facebook has over two billion users in many countries. Such representation is to be expected, and given the consultation process surrounding the creation of the Board, we might expect good results.


Other factors are encouraging. Facebook is a business, which means the company seeks to make as large a profit as possible. For that reason, Facebook wants as many users as possible; that desire means the company must satisfy as many people as possible with its Board. All things being equal, Facebook’s Board membership will look like its users for business reasons.


But there are risks ahead. The easiest way to make the Board appear representative would be to appoint well-known members. Both the members and what they “stand for” would be quickly recognized by users. But well-known people have many demands on their time. Making the Facebook Board a success will demand time and effort from its founding members. Will the work done by the Board take priority for the famous? If not, many of the actual decisions by the Board will be profoundly influenced by someone other than Board members. Who?


The staff of any organization tends to run it; hence the famous principal-agent problem, wherein the staff (the agent) acts in its own interests rather than the interests of Facebook users (the principal here). Of course, the Board’s staff could also be selected to be representative of Facebook users. But that selection will be much less public than the selection of Board members. Organized groups with an interest in what the Board does are likely to seek influence on staff selections. There is a risk here: the Board might end up appearing to represent almost all users through its well-known members while actually representing organized interests concerned about Facebook policies.


I see two ways for the concerns of the broader Facebook user community to inform Board deliberations. First, Facebook could survey users about their views on content moderation. For example, a survey could ask users to strike a balance between voice and safety in concrete cases similar to problems confronted by Facebook’s content moderators. Survey findings would not decide questions of content moderation. Rather, they could inform decisions about the Community Standards and their applications if the Board wished. In other words, surveys would represent user opinions about the questions posed to the Board. They would make the information available to the Board more representative even if the final decision contravened the preferences of most users.


Instead of surveying users, why not just allow users to elect Board members? Facebook has tried voting by users in the past, but poor turnout compromised the legitimacy of these efforts. Why risk another failure with turnout that might raise questions about the legitimacy of the Board? The Board itself is a remarkable innovation; trying to do too much at once, even in the name of representation, may limit its chances for success. However, if the Oversight Board surveyed samples of users to receive feedback, the views of users would be known rather precisely given the likely size of the samples.


But Facebook might consider other limited but intriguing innovations to foster representation. Selection of members and surveys are both forms of passive representation of users. Facebook officials choose representatives for the larger group of users. Board members select user opinions to inform their decisions. Might Facebook users be more active in these matters?


As mentioned in my previous post, juries (including grand juries) represent the broader public in the legal system. In fact, the Fifth, Sixth, and Seventh Amendments to the republican Constitution of the United States require juries for decisions about criminal and common law. It might be too much to expect that user juries would decide whether some future Alex Jones has violated Facebook’s rules. But juries might decide some cases where decisions have been appealed to the Board. I am thinking here of juries, not voters. We should not dismiss the possibility that jurors could deliberate about questions of content moderation. Deliberation is a social act, whether in the jury room or online. No doubt the structure and rules for such deliberation require much attention. But Facebook has enormous technical capacity and much experience with such deliberation within the company. It should experiment with a jury model, either privately or publicly.


Gaining legitimacy for Facebook’s Oversight Board will be hard. The Board needs to be both judicial and representative, two values that can conflict. In the United States such conflicts are fought out between the branches of government. At Facebook, the struggle will take place within this new Board. Yet the Board must be both judicial and representative if its judgments are to be accepted by Facebook’s users and by larger publics.