Failing Our Youngest: On the Biases, Pitfalls, and Risks in a Decision Support Algorithm Used for Child Protection

Therese Moreau Hansen, Roberta Sinatra, Vedran Sekara

Publication: Conference article in proceedings — Research, peer-reviewed

Abstract

In recent years, Danish child protective services have come under increasing pressure, prompting the adoption of a decision-support algorithm, named Decision Support, to help caseworkers identify children at heightened risk of maltreatment. Despite its critical role, this algorithm has not undergone formal evaluation. Through a freedom of information request, we gained partial access to the algorithm and conducted an audit. We find that the algorithm has significant methodological flaws: it suffers from information leakage,
relies on inappropriate proxy variables for assessing maltreatment, generates inconsistent risk scores, and exhibits age-based discrimination. Given these serious issues, we strongly advise against the use of algorithms of this kind in local-government, municipal, and child-protection settings, and we call for rigorous evaluation of such tools before implementation and for continual post-deployment monitoring, listing a series of specific recommendations.
Original language: English
Title: FAccT '24: Proceedings of the 2024 ACM Conference on Fairness, Accountability, and Transparency
Publication date: 2024
Pages: 290-300
Status: Published - 2024

Keywords

  • Algorithmic audit
  • Technological fairness
  • Algorithmic decision making
  • Algorithmic discrimination

