Anyone who has been paying attention to politics in the last year or so will know that there has been an increasing number of public policy fights across the country surrounding the transgender community. Perhaps the most notable are debates over who should be able to compete in women’s and girls’ sporting events or access gender-segregated bathrooms. Others include debates over “Don’t Say ‘Gay’ ” bills. What is less often discussed is how recognition technology fueled by Artificial Intelligence (AI) will affect members of the transgender community. As this technology proliferates among law enforcement and the private sector, we should expect questions about transgender people’s experience with it to become increasingly urgent.
This post takes no position on the metaphysics of gender identity. Whether you take an essentialist view of gender or believe that gender is a feature an individual can choose has no bearing on the disproportionate effect recognition technologies have on the transgender community.
Facial recognition is the most notable of these technologies. It is often used for identity verification: an individual is identified by comparing a newly captured image (a customs photo or a CCTV still, for example) with an image on file (usually a passport or driver’s license photo). Facial recognition technology automatically measures the physical attributes of the faces that appear in the images. A match confirms identity.
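To make the mechanics concrete, here is a minimal sketch of one-to-one verification using the open-source face_recognition Python library. The file names are placeholders, and commercial systems use proprietary models, but the basic shape is the same: reduce each face to a set of numeric measurements, then compare the measurements.

```python
# Minimal one-to-one face verification sketch using the open-source
# face_recognition library. File names are placeholders.
import face_recognition

# The image on file (e.g., a passport or driver's license photo).
reference = face_recognition.load_image_file("passport_photo.jpg")
reference_encoding = face_recognition.face_encodings(reference)[0]

# The newly captured image (e.g., a customs photo or CCTV still).
candidate = face_recognition.load_image_file("border_capture.jpg")
candidate_encoding = face_recognition.face_encodings(candidate)[0]

# compare_faces thresholds the distance between two 128-dimensional face
# encodings; a result of True is treated as confirmation of identity.
match = face_recognition.compare_faces([reference_encoding], candidate_encoding)[0]
distance = face_recognition.face_distance([reference_encoding], candidate_encoding)[0]
print(f"match: {match} (distance: {distance:.3f})")
```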
At first glance, it might seem that facial recognition does not pose issues for the transgender community. After all, when someone changes gender their eyes do not move closer together or farther apart. The distance between their lips and nose does not change. Absent cosmetic surgery, body modification, or accident, the kinds of measurements facial recognition makes will not change significantly enough to make identification infeasible.
But AI that can analyze facial features is not only used for identity verification. One study found that deep neural networks outperformed humans in predicting people’s sexual orientation. From the study:
We used deep neural networks to extract features from 35,326 facial images. These features were entered into a logistic regression aimed at classifying sexual orientation. Given a single facial image, a classifier could correctly distinguish between gay and heterosexual men in 81% of cases, and in 74% of cases for women. Human judges achieved much lower accuracy: 61% for men and 54% for women. The accuracy of the algorithm increased to 91% and 83%, respectively, given five facial images per person. Facial features employed by the classifier included both fixed (e.g., nose shape) and transient facial features (e.g., grooming style).
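The method is a standard two-stage pipeline: a deep network reduces each face image to a feature vector, and a simple logistic regression is then fit on those vectors. Below is a minimal sketch of that pipeline, with randomly generated placeholder features and labels standing in for the study’s data:

```python
# Sketch of the two-stage pipeline the study describes: a deep network
# turns each image into a feature vector, and a logistic regression is
# fit on the vectors. Features and labels here are random placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_images, n_features = 1000, 512              # e.g., embeddings from a face CNN
X = rng.normal(size=(n_images, n_features))   # placeholder for DNN-extracted features
y = rng.integers(0, 2, size=n_images)         # placeholder binary labels

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# On random data accuracy hovers near 50%; the study's claim is that real
# facial features carry enough signal to push this into the 80-90% range.
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```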
Automated facial analysis can also be used to target advertisements and identify diseases. One study suggests that such analysis could be used to diagnose autism in children.
There is a subset of facial analysis called Automated Gender Recognition (AGR), which informs users of the gender of someone in an image. As Os Keyes, a PhD candidate at the University of Washington, notes:
Automated Gender Recognition (AGR) isn’t something most people have heard of, but it’s remarkably common. A subsidiary technology to facial recognition, AGR attempts to infer the gender of the subject of a photo or video through machine learning. It’s integrated into the facial recognition services sold by big tech companies like Amazon and IBM, and has been used for academic research, access control to gendered facilities, and targeted advertising. It’s difficult to know all of the places where it’s currently deployed, but it’s a common feature of general facial recognition systems: anywhere you see facial recognition, AGR might well be present.
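Keyes’s point that AGR ships inside general-purpose facial analysis services is easy to see in practice. Amazon Rekognition’s DetectFaces API, for example, returns a binary Gender attribute with a confidence score for every face it detects. A minimal sketch, assuming configured AWS credentials and a placeholder file name:

```python
# Gender classification as a side effect of a general facial analysis API.
# Amazon Rekognition's DetectFaces returns a binary Gender attribute with a
# confidence score; there is no non-binary or transgender label at all.
import boto3

client = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:
    response = client.detect_faces(Image={"Bytes": f.read()}, Attributes=["ALL"])

for face in response["FaceDetails"]:
    gender = face["Gender"]
    print(f"predicted gender: {gender['Value']} ({gender['Confidence']:.1f}% confidence)")
```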
A study from the University of Colorado Boulder subjected 2,450 images from Instagram to gender recognition tools built by IBM, Amazon, Microsoft and Clarifai. The images were divided into seven groups of 350 according to hashtags (#women, #man, #transgenderwoman, #transgenderman, #agender, #agenderqueer, #nonbinary). The results were as follows:
On average, the systems were most accurate with photos of cisgender women (those born female and identifying as female), getting their gender right 98.3% of the time. They categorized cisgender men accurately 97.6% of the time.
But transgender men were wrongly identified as women up to 38% of the time. And those who identified as agender, genderqueer or nonbinary—indicating that they identify as neither male nor female—were mischaracterized 100% of the time.
Much of this discrepancy can be attributed to the fact that AGR systems do not have a transgender designation, instead embracing labels that conform to a binary understanding of gender. In sexually dimorphic species such as human beings, there are physical differences between the two sexes. Although human beings display a lower level of dimorphism than many other species, human males nonetheless tend to be taller, heavier, and hairier than females. Sexual dimorphism between males and females is also apparent in facial structure.
Debates on policies affecting the transgender community rest on disagreements over whether someone who asserts that their gender identity is at odds with the gender they were assigned at birth is making an accurate claim about their identity. Whatever your view on the matter, it remains the case that recognition systems rely on measurements of physical features, not mental content. How an AI system labels a person’s face says nothing about the truth or falsehood of what that person thinks.
AI tools labeling someone as a gender they do not identify with could have consequences ranging from the merely embarrassing to civil liberties violations.
Clearview AI is best known as a law enforcement facial recognition tool. The company scrapes billions of images from publicly available social media sites to build a facial recognition search engine. Although Clearview’s leadership insists that it does not plan to release a consumer-grade version of the technology, the company has secured a patent for recognition systems that would allow users to scan someone’s face to determine whether that person does drugs, is homeless, or suffers from mental illness. The same technology could also be used to determine someone’s education history or contact details.
Such technology could put members of the transgender community who wish to conceal their gender transition at risk of social sanction, public humiliation, and violence. It is not inconceivable that in the near future a company will release a dating app that includes a recognition function similar to the one outlined in the Clearview AI patent, along with gender recognition capability.
Online dating is not the only potential application of AGR. In 2019, Berlin’s public transport operator BVG chose to mark Equal Pay Day by giving women a 21 percent discount on their tickets. Customers could purchase the discounted tickets at kiosks that determined each customer’s gender via automated facial analysis. From the German newspaper B.Z.:
A camera built into the machine scans the customer’s face. Women get the reduced price; men get an error message. The reduced day ticket has the note “Frauenticket” (“women’s ticket”) printed on it. For the annual pass, the machine spits out a voucher that can be redeemed at the customer center. Payment is only possible by card.
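BVG has not published technical details of the kiosks, but the decision logic such a machine embodies is simple. The sketch below is entirely hypothetical (the prices, names, and classifier interface are invented for illustration); it shows that the binary AGR label, correct or not, determines the outcome, with no way for the customer to contest it:

```python
# Hypothetical reconstruction of the kiosk's decision logic. Prices, names,
# and the classifier interface are invented for illustration only.
from dataclasses import dataclass

FULL_PRICE = 7.00   # hypothetical full day-ticket price in euros
DISCOUNT = 0.21     # the 21 percent Equal Pay Day discount

@dataclass
class AgrResult:
    label: str          # the classifier only ever emits "male" or "female"
    confidence: float

def ticket_price(agr: AgrResult) -> float:
    # A transgender woman mislabeled "male" is refused the discount here,
    # and there is no path for her to correct the classifier.
    if agr.label == "female":
        return round(FULL_PRICE * (1 - DISCOUNT), 2)
    return FULL_PRICE

print(ticket_price(AgrResult("female", 0.93)))  # 5.53, the "Frauenticket"
print(ticket_price(AgrResult("male", 0.61)))    # 7.00, discount refused
```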
Given ongoing policy fights over who should get to access gender-segregated spaces such as bathrooms and changing rooms, private businesses, universities, and schools might choose to use AGR technology under the guise of enhancing security and protecting privacy.
It is not hard to imagine AGR technology at train stations, bathrooms, and changing rooms prompting encounters where a member of the transgender community would feel embarrassed, unsafe, humiliated, and exposed. Unfortunately, so-called “culture war” issues, such as gendered access to bathrooms, are becoming increasingly politicized. We should therefore expect policy discussions surrounding the transgender community to remain a staple of political campaigns and state legislative agendas, especially given that an increasing number of people identify as members of the LGBTQ community.
Libertarians are often conscientious objectors in “culture war” battles, embracing the principle that non-government institutions such as the media, academia, religious organizations, and charities are better venues than legislatures for debates on difficult topics such as gender identity. There is no libertarian inconsistency in arguing strongly in favor of allowing members of the transgender community to be free to identify as the gender of their choice while also arguing against a one-size-fits-all policy for every school, university, and corporation. Nonetheless, libertarians should not be shy about highlighting the potential civil liberties implications of legislation aimed at the transgender community.
This year alone, lawmakers across the country have introduced hundreds of bills addressing (among other things) transgender medical care and bathroom access. These bills risk worsening the transgender community’s already strained relationship with American law enforcement by increasing the chances of police interactions with members of the transgender community.
In a jurisdiction where a “bathroom bill” is enforced, a private company might choose to install AGR tools that label customers and employees as either “male” or “female.” Transgender customers and employees who visit the bathroom consistent with their gender identity, rather than the one associated with the gender they were assigned at birth, would violate the law and trigger an alert on the AGR tool. In many cases, this would result in police being called to the business. Police would have an incentive to determine the sex of the person who allegedly violated the “bathroom bill,” and might choose to treat the AGR alert as grounds for detention or arrest. Police could also choose to use their own AGR tools.
Police departments are not transparent enough about the surveillance tools they use. Too often, citizens learn about the new and emerging surveillance technology law enforcement is using from journalists rather than from local officials. At a time when lawmakers across the country are seeking to pass laws aimed at the transgender community, and when technology can identify sex and other traits via automated facial analysis, it is especially crucial that civil libertarians argue for more law enforcement transparency. I have argued before for policies that would require police to hold public hearings about new surveillance tools before they are used. Whatever your position on how members of the transgender community identify themselves, you should be concerned about police being able to scan citizens’ faces with tools that were never subjected to public oversight and comment.