By the end of 2019, Facebook promises to establish an independent body to handle appeals of its content moderation decisions. That intention follows an earlier suggestion by Mark Zuckerberg that Facebook might establish a “Supreme Court” of content moderation. Like the real Supreme Court, Facebook’s board will presumably review the meaning and application of its Community Standards, which might be considered the basic law of the platform.
There are many questions about this new institution. This post looks at how its members might be selected.
To fix ideas, let’s begin with how members of the U.S. Supreme Court are selected. Appointments to the highest court are procedurally simple and normatively complex. Article II of the Constitution says the president “shall nominate, and by and with the Advice and Consent of the Senate, shall appoint…Judges of the Supreme Court…” Advice and Consent can mean a simple majority or a supermajority of senators. Whatever the rule, senators vote only once on a nominee. Thereafter justices “hold their Offices during good Behavior” (Article III). In practice that means justices continue serving however unpopular their decisions may be. Of course, justices may be impeached and removed from the Supreme Court. However, Congress has not removed justices or judges because of their decisions. It really does require bad behavior.
Clearly the framers of the U.S. Constitution valued judicial independence, especially a certain distance from the unfiltered will of the majority. A candidate for the presidency may promise voters to nominate a favored judge to the Court, but he lacks the power to seat anyone. Candidates for the Senate may promise to support or oppose a nominee to the Court, but no senator or group of senators can decide whom to nominate. Once seated, the “good Behavior” standard means a justice serves until retirement, death, or impeachment and removal. The first two are by far the most likely means of departing the Court. The justices need not fear their peers in the executive or legislative branches, or indeed the people themselves, since they may not be recalled by an angry electorate.
But justices are not free to exercise the judicial power of the United States as they wish. Presumably they are obligated to interpret and apply the words of the Constitution which both empowers and limits the government. The courts are not independent of “We, the People” understood over time as the will of majorities and supermajorities expressed as the text of and amendments to the Constitution. The presidents and senators who nominate and appoint justices also depend directly or indirectly on voters. In these ways, the selection of Supreme Court justices balances independence and representation, thereby fostering the legitimacy of the Court and its decisions.
Following legal and constitutional values might enhance the legitimacy of Facebook’s “appeals court.” Facebook clearly values independence in this new review board: Zuckerberg describes the board as “an independent body, whose decisions would be transparent and binding.” But he also says the board would “uphold the principle of giving people a voice.”
You might think Facebook’s Community Standards cannot support its review board the way the Constitution supports the Supreme Court. The Constitution gained consent through a deliberative process that led to approval of the document eventually in all states. Nothing like that happened at Facebook. Yet every user consents to abide by Facebook’s Community Standards when joining the platform. We might question the quality of that consent, but it seems similar to the consent given to the Constitution by those of us who joined the “platform” after 1789.
What about applying the Community Standards? Zuckerberg says the board should be independent of Facebook for three reasons:
First, it will prevent the concentration of too much decision-making within our teams. Second, it will create accountability and oversight. Third, it will provide assurance that these decisions are made in the best interests of our community and not for commercial reasons.
Members of the board might serve during “good behavior” like justices. This would create space for its members to interpret and apply Facebook’s Community Standards as they saw fit. Perhaps extended terms for members of the board could attain the same end. But remember: even Supreme Court justices can be removed for bad behavior.
Facebook’s board also needs to be independent in the sense of being free of both politics and commerce. Many people fear that Facebook’s content governance reflects the political commitments of its managers and employees. The interpretation of Facebook’s Community Standards also could become the plaything of national political forces. In both instances, concerns about independence reflect worries about misrepresentation. What should be for all turns out to reflect the will of a few. How might the selection of review board members better represent Facebook’s users?
Let’s begin with a straightforward idea of representation. Imagine Mark Zuckerberg appoints the members of the board. Zuckerberg is accountable to Facebook’s users because they can exit the platform and thereby harm or destroy his business. That constraint would mean his appointments represent the concerns of users, along with other matters important to the business. Zuckerberg is a faithful agent of his customers not because he wishes to be so but because he must be.
Facebook wants its content moderation to be accepted as legitimate by its users (and by others). Would users accept this market theory of representation for the board? Many people doubt that markets constrain business managers. Others will think of representation as direct voting rather than indirect responses to consumer desires. Partial acceptance of the market theory may not be adequate to legitimize the new body.
So instead of one person choosing, maybe every adult user should elect the members of the board. But direct election seems impossible. The institutions to make that happen do not exist and would take a long time to create. Even if they were created, the elections would likely have low turnout, with dire implications for the legitimacy of the board.
A decisionmaker faced with this situation tends to act on what might be called the stakeholder theory of representation. Facebook could determine which groups have a strong interest in its content moderation. It could then consult these organized interests about who should serve on the board and appoint their choices. The stakeholders would nominate while Facebook managers appoint the “justices” of the review board. Facebook might well see these appointments as representative of its users. The appointees would not work for Facebook and hence would be independent in a sense. More realistically, if these selections were done correctly, Facebook would give its critics (and supporters) a seat at the appeals court. Its critics might become more constructive or even supporters of Facebook’s content moderation.
But who would these appointed stakeholders represent? They would be suggested by groups with intense interests in Facebook’s content moderation. For them, as Mancur Olson argued in The Logic of Collective Action, the benefits of organizing to influence content moderation would outweigh its costs. For most Facebook users, the opposite would be true; the costs of organizing would outweigh its benefits. As a result, the appointees would likely have atypical views about the meaning and application of Facebook’s Community Standards. In other words, representatives of stakeholders are unlikely to be representative of Facebook’s users.
Turning to stakeholders to help with a political challenge is natural. Above all, they are there, and you know them. Stakeholders do indeed offer a measure of representation, and perhaps also some independence from forces outside any organization. Indeed, conflict among stakeholding members might enhance the board’s independence. But the representation they offer is flawed. And perhaps, at this stage of institutional design, Facebook might look for alternatives that offer more in the way of both independence and representation.