When Algorithms Rule, Values Can Wither
Building responsible AI systems starts with recognizing that technology solutions implicitly prioritize efficiency over other values.
Interest in the possibilities afforded by algorithms and big data continues to blossom as early adopters gain benefits from AI systems that automate decisions as varied as making customer recommendations, screening job applicants, detecting fraud, and optimizing logistical routes.1 But when AI applications fail, they can do so quite spectacularly.2
Consider the recent example of Australia’s “robodebt” scandal.3 In 2015, the Australian government established its Income Compliance Program with the goal of clawing back unemployment and disability benefits that had been paid to recipients inappropriately. It set out to identify overpayments by analyzing discrepancies between the annual income that individuals reported and the income assessed by the Australian Taxation Office. Previously, a data-matching technique had been used to identify discrepancies, which government employees then investigated to determine whether the individuals had in fact received benefits to which they were not entitled. Aiming to scale up this process to increase recoveries and cut costs, the government developed a new, automated system that presumed every discrepancy reflected an overpayment. A notification letter demanding repayment was issued in every case, and the burden of proof fell on any individuals who wished to appeal. If someone did not respond to the letter, their case was automatically forwarded to an external debt collector. By 2019, the program was estimated to have identified more than 734,000 overpayments worth a total of 2 billion Australian dollars ($1.3 billion U.S.).4
The new system was designed to optimize efficiency, not to attend to the particulars of individual cases. The idea was that by eliminating human judgment, which is shaped by biases and personal values, the automated program would make better, fairer, and more rational decisions at much lower cost. Unfortunately, the designers’ choices about both the algorithm and the process built around it led the government to demand repayments from hundreds of thousands of people who had been entitled to the benefits they received. Some were compelled to prove that they had not illegitimately claimed benefits up to seven years earlier. The consequences for many individuals were dire.
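To see why that design choice matters, consider a minimal sketch of the two decision flows described above. The data structures, field names, and figures below are hypothetical illustrations, not the government’s actual system; the point is only that the data-matching step is the same in both flows, and what changed is that a flagged discrepancy now triggered an automatic repayment demand rather than a human investigation.

```python
from dataclasses import dataclass

@dataclass
class Case:
    """One benefit recipient's record (illustrative fields only)."""
    recipient_id: str
    reported_income: float   # annual income the individual reported
    assessed_income: float   # annual income assessed by the tax office

def discrepancy(case: Case) -> float:
    """Data-matching step: how far apart are the two income figures?"""
    return case.assessed_income - case.reported_income

def earlier_process(cases):
    """Pre-automation flow: a discrepancy only queues the case for a human
    investigator, who decides whether the person actually received benefits
    they were not entitled to."""
    return [c.recipient_id for c in cases if discrepancy(c) > 0]

def automated_process(cases):
    """Robodebt-style flow: every positive discrepancy is presumed to be an
    overpayment, a repayment demand is issued automatically, and the burden
    of proof shifts to the recipient (amounts here are purely illustrative)."""
    return [(c.recipient_id, discrepancy(c)) for c in cases if discrepancy(c) > 0]

cases = [
    Case("A-1001", reported_income=18_200.0, assessed_income=18_200.0),
    # A discrepancy a human reviewer might find legitimate, e.g., income
    # earned during months when no benefits were being claimed:
    Case("A-1002", reported_income=9_500.0, assessed_income=31_000.0),
]

print("queued for human review:", earlier_process(cases))
print("debt notices issued automatically:", automated_process(cases))
```

In this sketch, removing the investigation step does not make the flagging logic any more accurate; it simply converts every flag, legitimate or not, into a demand for repayment.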
Subsequent parliamentary reviews pointed to “a fundamental lack of procedural fairness” and called the program “incredibly disempowering to those people who had been affected, causing significant emotional trauma, stress, and shame.”
References
1. T.H. Davenport and R. Bean, “Becoming an ‘AI Powerhouse’ Means Going All In,” MIT Sloan Management Review, June 15, 2022, https://sloanreview.mit.edu.
2. C. O’Neil, “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy” (New York: Crown Publishers, 2016).
3. “Accountability and Justice: Why We Need a Royal Commission Into Robodebt,” PDF file (Canberra, Australia: Senate Community Affairs Reference Committee, May 2022), https://parlinfo.aph.gov.au.
4. “Centrelink’s Compliance Program: Second Interim Report,” PDF file (Canberra, Australia: Senate Community Affairs Reference Committee, September 2020), chap. 1, https://parlinfo.aph.gov.au.
5. “Centrelink’s Compliance Program,” chap. 2.
6. Ibid.
7. D. Lindebaum, C. Moser, M. Ashraf, et al., “Reading ‘The Technological Society’ to Understand the Mechanization of Values and Its Ontological Consequences,” Academy of Management Review, July 2022, https://journals.aom.org.
8. O’Neil, “Weapons of Math Destruction.”
9. M. Rokeach, “The Role of Values in Public Opinion Research,” Public Opinion Quarterly 32, no. 4 (winter 1968-1969): 550.
10. O’Neil, “Weapons of Math Destruction.”