Heterogeneity in Algorithm-Assisted Decision-Making: A Case Study in Child Abuse Hotline Screening

04/12/2022
by Lingwei Cheng, et al.

Algorithmic risk assessment tools are now commonplace in public sector domains such as criminal justice and human services. These tools are intended to help decision makers systematically use the rich and complex data captured in administrative systems. In this study we investigate sources of heterogeneity in the alignment between worker decisions and algorithmic risk scores in the context of a real-world child abuse hotline screening use case. Specifically, we focus on heterogeneity related to worker experience. We find that senior workers are far more likely to screen in referrals for investigation, even after we control for the observed algorithmic risk score and other case characteristics. We also observe that the decisions of less-experienced workers are more closely aligned with algorithmic risk scores than those of senior workers who had decision-making experience prior to the tool's introduction. While screening decisions vary across child race, we do not find evidence of racial differences in the relationship between worker experience and screening decisions. Our findings indicate that agencies and system designers should consider ways of preserving institutional knowledge when introducing algorithms into settings with high employee turnover, such as child welfare call screening.
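The kind of heterogeneity analysis described above is often operationalized with a regression that interacts the algorithmic risk score with a worker-experience indicator. The sketch below is purely illustrative and is not the authors' analysis: the variable names (risk_score, senior_worker, prior_referrals, screened_in), the simulated data, and the specific model form are all assumptions for demonstration.

```python
# Hypothetical sketch (not the paper's method): a logistic regression of the
# screen-in decision on the algorithmic risk score, a worker-seniority
# indicator, and their interaction. The interaction term captures whether
# alignment with the score differs by worker experience.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "risk_score": rng.integers(1, 21, n),      # assumed 1-20 risk scale
    "senior_worker": rng.integers(0, 2, n),    # 1 = experienced before tool rollout
    "prior_referrals": rng.poisson(1.5, n),    # example case characteristic
})

# Simulate decisions so that juniors track the score more tightly and
# seniors screen in more overall (mirroring the pattern the abstract reports).
logit = (-3.0
         + 0.25 * df.risk_score * (1 - df.senior_worker)
         + 0.15 * df.risk_score * df.senior_worker
         + 1.0 * df.senior_worker
         + 0.2 * df.prior_referrals)
df["screened_in"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Fit: screened_in ~ risk_score * senior_worker + prior_referrals
model = smf.logit(
    "screened_in ~ risk_score * senior_worker + prior_referrals", data=df
).fit()
print(model.summary())
```

In a specification like this, a positive coefficient on senior_worker would correspond to seniors screening in more at a given risk score, while a negative risk_score:senior_worker interaction would correspond to their decisions being less aligned with the score.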

