000 02965nab a22002777a 4500
999 _c7383
_d7383
005 20250625151601.0
008 211201s2019 -n|| |||| 00| 0 eng d
040 _aAFVC
100 _aKeddell, Emily
_94218
245 _aAlgorithmic justice in child protection :
_bstatistical fairness, social justice and the implications for practice /
_cEmily Keddell.
260 _bMDPI,
_c2019.
500 _aSocial Sciences, 2019, 8(10), 281
520 _aAlgorithmic tools are increasingly used in child protection decision-making. Fairness considerations of algorithmic tools usually focus on statistical fairness, but there are broader justice implications relating to the data used to construct source databases, and how algorithms are incorporated into complex sociotechnical decision-making contexts. This article explores how the data that inform child protection algorithms are produced, and relates this production to both traditional notions of statistical fairness and broader justice concepts. Predictive tools face a number of challenging problems in the child protection context, as the data they draw on do not represent child abuse incidence across the population, and child abuse itself is difficult to define, so the key decisions that become data are variable and subjective. Algorithms using these data have distorted feedback loops and can contain inequalities and biases. The challenge to justice concepts is that individual and group rights to non-discrimination become threatened as the algorithm itself becomes skewed, leading to inaccurate risk predictions that draw on spurious correlations. The right to be treated as an individual is threatened when statistical risk is based on a group categorisation, and the rights of families to understand and participate in the decisions made about them are difficult to uphold when they have not consented to data linkage and the function of the algorithm is obscured by its complexity. The use of uninterpretable algorithmic tools may create ‘moral crumple zones’, where practitioners are held responsible for decisions even when those decisions are partially determined by an algorithm. Many of these criticisms can also be levelled at human decision makers in the child protection system, but the reification of these processes within algorithms renders their articulation even more difficult and can diminish other important relational and ethical aims of social work practice. (Author's abstract). Record #7383
650 _aCHILD PROTECTION
_9118
650 _aCHILDREN'S RIGHTS
_9135
650 0 _aPREDICTIVE RISK MODELLING
_94928
650 4 _aRISK ASSESSMENT
_9504
650 _aSOCIAL JUSTICE
_910466
650 _aSOCIAL POLICY
_9551
650 4 _aSOCIAL SERVICES
_9555
651 4 _aNEW ZEALAND
_92588
773 0 _tSocial Sciences, 2019, 8(10), 281
830 _aSocial Sciences
_96421
856 _uhttps://doi.org/10.3390/socsci8100281
_zDOI: 10.3390/socsci8100281 (Open access)
942 _2ddc
_cARTICLE