How machine learning veils human bias

The promise of introducing machine learning into public administration is that it can counteract human bias. The latent promise of bureaucracy can be realised by systems that won’t be upended by the messy imperfections of their human operators. However, as Virginia Eubanks makes clear in Automating Inequality, the reality is something much more worrying: the operation of machinic systems performs what Andrew Pickering calls ontological veiling, taking us on a detour away from certain aspects of reality and thereby rendering them unrepresentable. As Eubanks recalls on pg 166:

Human bias has been a problem in child welfare since the field’s inception. In its earliest days, Charles Loring Brace’s orphan trains carried away so many Catholic sons and daughters that the religious minority had to create an entirely parallel system of child welfare organizations. Scientific charity workers had religious biases that tended to skew their decision-making. They believed that the children of Protestants could be redeemed by their families, but Catholics were incorrigible and had to be sent to labor on (mostly Protestant) farms in the Midwest. Today, racial disproportionality shatters the bonds of too many Black and Native American families. Some of that disproportion can certainly be traced to human discretion in child welfare decision-making. But human bias is a built-in feature of the predictive risk model, too.

Compare this to the contemporary reality depicted on pg 167:

Once the big blue button is clicked and the AFST runs, it manifests a thousand invisible human choices. But it does so under a cloak of evidence-based objectivity and infallibility. Intake screeners reflect a variety of experiences and life paths, from the suburban white Penn State postgraduate to an African American Pittsburgh native, like Pat Gordon, with over a decade of experience. The automated discretion of predictive models is the discretion of the few. Human discretion is the discretion of the many. Flawed and fallible, yes. But also fixable.