Deirdre Mulligan
Associate Professor in the School of Information at UC Berkeley, faculty Director of the Berkeley Center for Law & Technology, and affiliated faculty at the Hewlett-funded Berkeley Center for Long-Term Cybersecurity
Saving Governance-By-Design
Abstract:
Governing through technology has proven irresistibly seductive. Technologists, system designers, advocates, and regulators increasingly seek to use the design of technological systems to advance public policy—to protect privacy, advance fairness, or ensure law enforcement access, among other goals. Designing technology to “bake in” values offers a seductively elegant and potentially effective means of control. Technology can harden fundamental norms into background architecture, and its global reach can circumvent jurisdictional constraints, sometimes out of public view. As technology reaches into the farthest corners of our public and private lives, its power to shape and control human behavior, often imperceptibly, makes it an important locus for public policy. Yet while “governance-by-design”—the purposeful effort to use technology to embed values—is becoming a central mode of policy-making, our existing regulatory systems are ill-equipped to prevent that phenomenon from subverting public governance. Far from being a panacea, governance-by-design has undermined important governance norms and chipped away at important rights. In administrative agencies, courts, Congress, and international policy bodies, public discussions about embedding values in design arise in a one-off, haphazard way, if at all. Constrained by their structural limitations, these traditional venues rarely explore the full range of values that design might affect; they often advance a single value, or occasionally pit one value against another. They seldom permit a meta-discussion about when and whether it is appropriate to enlist technology in the service of values at all. And policy discussions almost never include designers, engineers, and those who study the impact of socio-technical systems on values. When technology is designed to regulate without such discussion and participation, the effects can be insidious.
The resulting technology may advance private interests at the price of public interests, protect one right at the expense of another, and obscure government and corporate aims and the fundamental political decisions that have been made. This talk proposes a detailed framework for saving governance-by-design. It examines recent battles to embed policy in technology design to identify recurring dysfunctions of governance-by-design efforts in existing policy-making processes and institutions. It closes by offering a framework to guide governance-by-design that surfaces and resolves value disputes in technological design while preserving, rather than subverting, public governance and public values.
Bio:
Deirdre K. Mulligan is an Associate Professor in the School of Information at UC Berkeley, a faculty Director of the Berkeley Center for Law & Technology, and affiliated faculty at the Hewlett-funded Berkeley Center for Long-Term Cybersecurity. Mulligan’s research explores legal and technical means of protecting values such as privacy, freedom of expression, and fairness in emerging technical systems. Her book, Privacy on the Ground: Driving Corporate Behavior in the United States and Europe, a study of privacy practices in large corporations in five countries conducted with UC Berkeley Law Prof. Kenneth Bamberger, was recently published by MIT Press. Mulligan and Bamberger received the 2016 International Association of Privacy Professionals Leadership Award for their research contributions to the field of privacy protection. She is a member of the Defense Advanced Research Projects Agency's Information Science and Technology study group (ISAT) and of the National Academy of Sciences Forum on Cyber Resilience. She is Chair of the Board of Directors of the Center for Democracy & Technology, a leading advocacy organization protecting global online civil liberties and human rights; a founding member of the standing committee for the AI 100 project, a 100-year effort to study and anticipate how the effects of artificial intelligence will ripple through every aspect of how people work, live, and play; and a founding member of the Global Network Initiative, a multi-stakeholder initiative to protect and advance freedom of expression and privacy in the ICT sector, and in particular to resist government efforts to use the ICT sector to engage in censorship and surveillance in violation of international human rights standards. She is also a Commissioner on the Oakland Privacy Advisory Commission.
Mulligan chaired a series of interdisciplinary visioning workshops on Privacy by Design with the Computing Community Consortium to develop a shared interdisciplinary research agenda. Prior to joining the School of Information, she was a Clinical Professor of Law, founding Director of the Samuelson Law, Technology & Public Policy Clinic, and Director of Clinical Programs at the UC Berkeley School of Law.
Mulligan was the policy lead for the NSF-funded TRUST Science and Technology Center, which brought together researchers at UC Berkeley, Carnegie Mellon University, Cornell University, Stanford University, and Vanderbilt University, and a PI on the multi-institution NSF-funded ACCURATE center. In 2007 she was a member of an expert team charged by the California Secretary of State to conduct a top-to-bottom review of the voting systems certified for use in California elections. This review investigated the security, accuracy, reliability, and accessibility of electronic voting systems used in California. She was a member of the National Academy of Sciences Committee on Authentication Technology and Its Privacy Implications; the Federal Trade Commission's Federal Advisory Committee on Online Access and Security; and the National Task Force on Privacy, Technology, and Criminal Justice Information. She was a vice-chair of the California Bipartisan Commission on Internet Political Practices and chaired the Computers, Freedom, and Privacy (CFP) Conference in 2004. She co-chaired Microsoft's Trustworthy Computing Academic Advisory Board with Fred B. Schneider from 2003 to 2014. Prior to Berkeley, she served as staff counsel at the Center for Democracy & Technology in Washington, D.C.