ACADEMIC ARTICLE SUMMARY

The Gender Panopticon: Artificial Intelligence, Gender, and Design Justice

Article Source: UCLA Law Review, Vol. 68, pp. 692-785, 2021
Written By: Jessica Jung

ARTICLE SUMMARY

Artificial Intelligence (AI) surveillance systems often use binary male/female gender classifications, failing to recognize the complexity of LGBTQ+ identity formation.

POLICY RELEVANCE

Law and technology design should support gender self-determination.

KEY TAKEAWAYS

  • Transportation Security Administration systems do not allow for classifications other than male or female, resulting in distressing treatment for transgender travelers.
  • Jeremy Bentham described the "panopticon," a prison designed so that guards in a central tower can watch prisoners at any time, leaving the prisoners feeling they are always being watched; today's AI-based surveillance technologies, often directed at the LGBTQ+ community, function in a comparable way.
  • Automated biometric gender recognition researchers make harmful assumptions (see the illustrative sketch after this list):
    • They assume that gender is binary, limited to male or female.
    • They assume that gender is immutable.
    • They assume that gender can be identified from physical characteristics.
    • As a result, nonbinary persons are almost always misgendered.
  • Three types of harm flow from gender panopticism: distress, the censorship of LGBTQ+ expression, and the continued disparagement of minorities.
  • Some courts recognize that discrimination against transgender people is a type of sex discrimination; government identification systems that treat every applicant as male or female are inaccurate as applied to transgender or intersex individuals.
  • Limiting a transgender person's ability to declare their gender might infringe on First Amendment rights of free speech.
  • Facebook and Google allow users to choose among many genders, but limit users to three pronoun choices (them, her, or him); the user’s choice of pronoun supports the sale of data to advertisers displaying gender-targeted ads.
  • Gender self-determination should be coded into our technology.
    • The right to be forgotten could support LGBTQ+ rights, by allowing an individual to erase occurrences of their deadname.
    • Because gender is subjective, AI cannot identify gender with 100 percent accuracy.
    • Allowing individuals to choose a gender means that some will face prejudice, unless security personnel are properly trained.
    • AI systems could be trained to take cues about gender from pronoun usage.
  • The principle of "inclusive neutrality" calls for the creation of a public realm in which gender divisions are not reinforced or enforced, and all individuals may self-identify.
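
The binary-classification assumption described above can be made concrete with a brief, purely illustrative sketch (not drawn from the article; the classifier, labels, and profile fields below are hypothetical): a model whose output space contains only "male" and "female" misgenders nonbinary people by construction, while a self-declared profile field simply records the gender and pronouns a person states.

    # Hypothetical sketch, not from the article: a two-label output space
    # cannot represent nonbinary identities, so misgendering is structural.
    from dataclasses import dataclass

    BINARY_LABELS = ("male", "female")  # the binary assumption baked into many systems

    def binary_gender_classifier(face_features):
        # Stand-in for an automated gender recognition model: whatever the
        # input, the answer is always one of the two labels.
        score = sum(face_features)  # placeholder for a real model's decision score
        return BINARY_LABELS[0] if score >= 0 else BINARY_LABELS[1]

    @dataclass
    class UserProfile:
        # Design-justice alternative: gender and pronouns are self-declared,
        # user-controlled fields rather than model outputs.
        self_declared_gender: str
        pronouns: str

    print(binary_gender_classifier([0.2, -0.7, 0.1]))  # always "male" or "female"
    print(UserProfile(self_declared_gender="nonbinary", pronouns="they/them"))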

About Sonia Katyal

Sonia Katyal is the Roger J. Traynor Distinguished Professor of Law, Co-Director of the Berkeley Center for Law & Technology, and Associate Dean of Faculty Development and Research at the University of California, Berkeley School of Law. Professor Katyal’s work focuses on the intersection of technology, intellectual property, and civil rights (including antidiscrimination, privacy, and freedom of speech).