Dear Public, We Added Noise to Your Data. Love, the Government

Too Long; Didn't Read

The 2020 U.S. Census adopted differential privacy to protect data but failed to gain public trust. This case study reveals how transparency and participation efforts must focus on values and be supported by expert guidance, not just technical openness.

Authors:

(1) AMINA A. ABDU, University of Michigan, USA;

(2) LAUREN M. CHAMBERS, University of California, Berkeley, USA;

(3) DEIRDRE K. MULLIGAN, University of California, Berkeley, USA;

(4) ABIGAIL Z. JACOBS, University of Michigan, USA.

Abstract and 1. Introduction

2. Related Work

3. Theoretical Lenses

3.1. Handoff Model

3.2. Boundary objects

4. Applying the Theoretical Lenses and 4.1 Handoff Triggers: New tech, new threats, new hype

4.2. Handoff Components: Shifting experts, techniques, and data

4.3. Handoff Modes: Abstraction and constrained expertise

4.4 Handoff Function: Interrogating the how and 4.5. Transparency artifacts at the boundaries: Spaghetti at the wall

5. Uncovering the Stakes of the Handoff

5.1. Confidentiality is the tip of the iceberg

5.2. Data Utility

5.3. Formalism

5.4. Transparency

5.5. Participation

6. Beyond the Census: Lessons for Transparency and Participation and 6.1 Lesson 1: The handoff lens is a critical tool for surfacing values

6.2 Lesson 2: Beware objects without experts

6.3 Lesson 3: Transparency and participation should center values and policy

7. Conclusion

8. Research Ethics and Social Impact

8.1. Ethical concerns

8.2. Positionality

8.3. Adverse impact statement

Acknowledgments and References


Emerging discussions on the responsible government use of algorithmic technologies propose transparency and public participation as key mechanisms for preserving accountability and trust. But in practice, the adoption and use of any technology shifts the social, organizational, and political context in which it is embedded. Translating transparency and participation efforts into meaningful, effective accountability must therefore take these shifts into account. We adopt two theoretical frames, Mulligan and Nissenbaum’s handoff model and Star and Griesemer’s boundary objects, to reveal such shifts during the U.S. Census Bureau’s adoption of differential privacy (DP) in its updated disclosure avoidance system (DAS) for the 2020 census. This update preserved (and arguably strengthened) the confidentiality protections that the Bureau is mandated to uphold, and the Bureau engaged in a range of activities to facilitate public understanding of and participation in the system design process. Using publicly available documents concerning the Census’ implementation of DP, this case study seeks to expand our understanding of how technical shifts implicate values, how such shifts can afford (or fail to afford) greater transparency and participation in system design, and the importance of localized expertise throughout. We present three lessons from this case study toward grounding understandings of algorithmic transparency and participation: (1) efforts toward transparency and participation in algorithmic governance must center values and policy decisions, not just technical design decisions; (2) the handoff model is a useful tool for revealing how such values may be cloaked beneath technical decisions; and (3) boundary objects alone cannot bridge distant communities without trusted experts traveling alongside to broker their adoption.

1 INTRODUCTION

Recent work on values in technology attempts to understand how technological changes can produce fairer, more accountable, and more trustworthy systems [38, 78, 110]. Transparency and participatory design are often proposed to advance these goals in both academic work and policy [16, 26, 27, 44, 67]. However, scholars, critics, and advocates have raised complications and limitations of transparency and participation, particularly when adopted uncritically [13, 30, 33, 112]. We argue that prevailing models of algorithmic transparency and participation stand to benefit from sociotechnical analysis of transparency and participation on the ground. As such, we focus on a single case study: the adoption of differential privacy in the 2020 Decennial Census. Differential privacy (DP) is a mathematical definition of privacy that leverages statistical uncertainty to provably limit leakage of any individual’s sensitive information; in other words, random noise is added to data to reduce the possibility of re-identification [43]. Under the leadership of Chief Scientist John Abowd, the Census Bureau implemented DP in its 2020 disclosure avoidance system (DAS), the mechanism used to manipulate census response data prior to publication to ensure confidentiality in accordance with Title 13.[1] Notably, the technical affordances of DP allowed the Bureau to make details of the DAS public for the first time without undermining confidentiality. The Bureau embraced this possibility, introducing many innovations in transparency and attempting to facilitate participation from a wide variety of experts and the public.
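The noise addition described above can be made concrete with the classic Laplace mechanism, the simplest way a count query can satisfy epsilon-DP. This is a minimal, illustrative stdlib-only sketch; the function names (`laplace_noise`, `dp_count`) are my own, and the Bureau’s production DAS is far more elaborate than a single noisy count.

```python
import math
import random

def laplace_noise(scale):
    """Draw one sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5                    # uniform on [-0.5, 0.5)
    mag = max(1.0 - 2.0 * abs(u), 1e-300)        # guard against log(0)
    return -scale * math.copysign(1.0, u) * math.log(mag)

def dp_count(true_count, epsilon, sensitivity=1.0):
    """Release a count under epsilon-differential privacy via the Laplace mechanism.

    Adding or removing a single person changes a population count by at most
    `sensitivity`, so noise drawn from Laplace(0, sensitivity / epsilon)
    provably bounds what the released value can reveal about any individual.
    """
    return true_count + laplace_noise(sensitivity / epsilon)

# Smaller epsilon -> more noise -> stronger privacy, lower accuracy.
noisy_block_population = dp_count(1250, epsilon=0.5)
```

The key trade-off the paper returns to is visible in the `epsilon` parameter: the privacy guarantee and the statistical utility of the published data are set by a single tuning knob, which is a policy decision dressed in technical clothing.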


Despite these significant efforts, this newfound transparency did not produce the accountability and trust the Bureau hoped to engender [59, 71, 95, 97]. The resulting controversy attracted the attention of critical scholars, who have attempted to adjudicate its history and implications [18, 19, 92]. We build upon this literature: by employing the handoff lens [90], we parse out how a seemingly technological transition, from the Bureau’s previous statistical disclosure limitation (SDL) methods to DP, in fact altered the very function of disclosure avoidance as the Bureau’s methods, experts, and values were reconfigured.


Drawing on this case study, we demonstrate the utility of the handoff model for addressing calls from the critical algorithmic transparency literature to examine transparency in context. In particular, we show that the handoff model makes visible where decisions about values are embedded within sociotechnical systems and identifies the configurations of human actors surrounding these decisions.


Our contribution is threefold. First, we shed new light on a case that has been of significant interest to both researchers and policymakers, providing an account of the specific values decisions at the core of the adoption of DP and of how participation in these decisions remained limited despite significant efforts. Second, we argue that values should be at the center of transparency and participation efforts and demonstrate the utility of the handoff model for eliciting these values. Finally, we highlight the need to understand the role of experts in transparency and participation processes. While the literature has focused on developing documentation artifacts for transparency and participation, the census case highlights the insufficiency of artifacts alone to facilitate meaningful participation. Trusted individuals with the requisite expertise must exist within stakeholder communities for such artifacts to be understood and adopted.


This paper is available on arxiv under CC BY-NC-SA 4.0 DEED license.

[1] The Census Bureau is obligated to protect the privacy and confidentiality of individual data according to Title 13 of the U.S. Code.
