Evan Selinger: Cybersecurity Workers Need to Learn From Those They’re Trying to Protect
Publication Date: January 23, 2020
from “Cybersecurity Workers Need to Learn From Those They’re Trying to Protect” by Evan Selinger and Albert Fox Cahn
We can reinforce the lesson that every single one of us has an affirmative moral duty to remedy the discriminatory product design that benefits some privileged users at the price of ignoring countless others.
Philosophy professor Evan Selinger, of Rochester Institute of Technology, explores ethical issues concerning technology, science, and the law. He focuses primarily on the ethics of technology, particularly artificial intelligence and privacy.
One of Professor Selinger’s recent projects tackled the privacy challenges faced by victims of stalking and surveillance. To address the experience gap between the developers who design security software (see “See Big Tech’s Terrible Diversity Record, Visualized Using Its Logos”) and the vulnerable communities who rely on these products and tools, Professor Selinger and his colleague Albert Fox Cahn, Executive Director of the Surveillance Technology Oversight Project, devised a semester-long program intended to open the eyes of up-and-coming developers to the biases and presumptions they bring to their professional projects.
Professor Selinger and Mr. Cahn share key takeaways from this work in their OneZero (Medium) article, “Cybersecurity Workers Need to Learn From Those They’re Trying to Protect.” Below are a few excerpts:
We brought together students and a grassroots community group, exploring a new model of collaborative development that moved beyond general cyber hygiene strategies and paternalistic assumptions about how to help people whose privacy is being threatened.
The experiences of vulnerable communities, including women targeted by domestic abusers, need to be considered and represented in the design process, and more tools need to be made available to them.
The students were asked to create a surveillance training kit to better protect the privacy of the clients who work with TPNY [Turning Point for Women and Families, a nonprofit serving survivors of domestic violence in the Muslim, Arab, and South Asian communities]. The kit contains handouts and lesson plans on limiting stalker access to location data, protecting against unwanted calls and messages, strengthening passwords and access controls, checking for unauthorized account access, and limiting an online footprint. This frontline learning taught them far more about the surveillance of minority communities than a textbook ever could.
Students were repeatedly reminded that they should begin the design process by asking, “What do other users need?” This consciousness-raising is a central value of working with individuals who will help shape commercial software products for years to come. In this way, students are not only helping to undo the damage done by prior iterations of design-based discrimination, but they are learning an invaluable lesson about the biases and presumptions they will bring to their professional projects.
Read more: “Cybersecurity Workers Need to Learn From Those They’re Trying to Protect” by Evan Selinger and Albert Fox Cahn.
About Evan Selinger
Evan Selinger is a Professor of Philosophy at Rochester Institute of Technology and an Affiliate Scholar at Northeastern University’s Center for Law, Innovation, and Creativity. He is also a Senior Fellow at the Future of Privacy Forum. Professor Selinger’s research primarily addresses ethical issues concerning technology, science, and the law, with a focus on artificial intelligence and privacy.