CSJ Director Brandon Garrett has a new piece in the Harvard Data Science Review: a comment on “The Age of Secrecy and Unfairness in Recidivism Prediction,” by Cynthia Rudin, Caroline Wang, and Beau Coker. An excerpt:
I write to comment on “The Age of Secrecy and Unfairness in Recidivism Prediction,” a wonderful new piece by Cynthia Rudin, Caroline Wang, and Beau Coker (2020). The authors argue that secret algorithms should not be permitted to make important decisions, such as calculations of criminal sentences, and further, they partially reconstruct one such secret algorithm to show what should and should not trouble us about its use in practice. Those empirical contributions add important insights into debates about the use of COMPAS, an acronym for Correctional Offender Management Profiling for Alternative Sanctions, a case management tool developed by a company now called equivant (previously Northpointe). To be sure, many other commonly used risk assessment instruments are neither proprietary nor secret, nor are they particularly complex. Thus, the findings regarding this one particular (and much criticized) instrument, drawing on data from one county (Broward County, Florida), do not necessarily extend to other uses of risk assessment.
Unfortunately, though, the concerns the authors raise regarding COMPAS extend still more broadly to a range of algorithms used in other criminal justice settings. Algorithms have been put to uses in criminal justice ranging from facial recognition, to DNA mixture analysis, to biometric database searches. Nonalgorithmic forensic methods also continue to be used that rely on black box or unvalidated techniques. When one looks more broadly at the uses of unsound science in criminal cases, one can see that a regulatory scheme, perhaps along the lines of the federal legislation introduced, is urgently needed.