Robodebt: The erosion of a social licence to operate through bureaucratic banality
Introduction
Humanity's tools know neither good nor evil; they are, in general, dual-use. How we choose to use them determines whether the act of using them is good or evil.
In 1963, after observing the 1961 trial of Adolf Eichmann, who claimed he was “just following orders”, political theorist Hannah Arendt controversially coined the phrase “the banality of evil” and posed the question, “Can one do evil without being evil?” (White, 2018).
Did Centrelink have moral agency? Could the tragic outcomes of Robodebt have been avoided if the people who instigated the program, and those who designed and managed the system, had been guided by a set of ethical principles rather than “just doing what they were instructed to do”?
The Australian government’s “Robodebt” scheme provides a powerful case study of how an absence of ethics in public service automation can undermine an institution’s social licence to operate. By blindly prioritising bureaucratic efficiency over people, the government of the day and Centrelink degraded their moral mandate.
Robodebt
From 2015 to 2020, Robodebt used an algorithm to match income data and automatically raise allegations of overpaid benefits. Lacking oversight, the opaque system recouped funds punitively through methods such as freezing bank accounts, threatening interest charges on debts, and engaging debt collectors. Recipients were presumed guilty until proven otherwise, forcing those with the greatest vulnerability and the fewest resources to try to dispute inflated debts.
The algorithm that powered Robodebt rested on a flawed assumption: that annual income reported to the Australian Taxation Office could simply be averaged across fortnights and treated as what a recipient actually earned in each fortnight. This blunt averaging produced many false debt assessments that welfare recipients struggled to appeal through incomprehensible automated processes. Public outcry eventually prompted a class action that resulted in $1.2 billion in compensation payments to aggrieved social security recipients and a determination that Robodebt was unlawful. While welcomed, the judgement did little to relieve the pain of the families who lost loved ones to suicide as a result of the scheme.
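To see how this averaging manufactures debts for anyone whose earnings were uneven, consider a minimal sketch. The payment rule, rates and income figures below are invented purely for illustration; they are not Centrelink’s actual rates, and this is not the real Robodebt calculation.

```python
# Illustrative sketch with made-up figures and a simplified payment rule;
# the free area, taper rate and maximum payment are assumptions, not
# Centrelink's actual rates or the real Robodebt code.

FORTNIGHTS_PER_YEAR = 26
FREE_AREA = 150      # hypothetical income a recipient may earn per fortnight
TAPER_RATE = 0.5     # hypothetical reduction per dollar earned above the free area
MAX_PAYMENT = 550    # hypothetical maximum fortnightly benefit


def fortnightly_entitlement(income: float) -> float:
    """Benefit payable for a single fortnight under the simplified rule."""
    reduction = max(0.0, income - FREE_AREA) * TAPER_RATE
    return max(0.0, MAX_PAYMENT - reduction)


# Someone unemployed for half the year, then in full-time work for the rest.
actual_fortnightly_income = [0.0] * 13 + [2600.0] * 13
annual_income = sum(actual_fortnightly_income)  # the figure the tax office sees

# What they were genuinely entitled to (and correctly paid), fortnight by fortnight.
amount_paid = sum(fortnightly_entitlement(i) for i in actual_fortnightly_income)

# The "blunt averaging" shortcut: smear the annual tax figure evenly across
# all 26 fortnights and recalculate as if income had been steady all year.
averaged_income = annual_income / FORTNIGHTS_PER_YEAR
recalculated = FORTNIGHTS_PER_YEAR * fortnightly_entitlement(averaged_income)

alleged_debt = amount_paid - recalculated
print(f"Actually paid (correctly):         ${amount_paid:,.2f}")
print(f"Entitlement under averaging:       ${recalculated:,.2f}")
print(f"'Debt' created by averaging alone: ${alleged_debt:,.2f}")
```

In this toy example every fortnightly payment was correct, yet spreading the annual tax figure across the whole year makes the recipient appear ineligible in the very fortnights they were lawfully paid, so the recalculation raises a sizeable “debt” out of nothing.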
Legal scholar Terry Carney delivered a scathing assessment, writing that Robodebt “unlawfully and unethically seeks to place the onus on supposed debtors to ‘disprove’ a data-match debt or face the prospects of the amount placed in the hands of debt collectors” (Carney, 2019).
Centrelink failed to live up to at least two elements of data ethics:
- The ethics of data collection: privacy, trust and transparency. Centrelink matched Australian Taxation Office data against its own social security records to build the algorithmic decision-making process used to determine debts. Social security recipients, while able to access their own data, did not have access to the full matched datasets, putting them at a disadvantage when disputing a debt the government claimed against them.
- The ethics of data analysis: responsibility and accountability. Little consideration was given to the design of the data-matching algorithm or to the consequences of Robodebt’s automated decision-making. Failure in this area was particularly egregious for two reasons.
- Firstly, because Robodebt’s algorithm rested on a simplistic averaging of annual income and did not account for the differing nature of the two datasets, it was biased in its decision-making against anyone with irregular earnings.
- Secondly, Centrelink established a system that removed its own accountability by placing the onus of proof on social security recipients.
Ethical frameworks (or the lack thereof)
Various ethical frameworks highlight the injustice. Utilitarianism focuses on maximising good, yet Robodebt caused widespread harm. Virtue ethics examines moral character, and the scheme reflected systemic cruelty rather than compassion. Deontology evaluates rights and duties, and the scheme degraded due process and the social contract. Viewed through Rawls’ “veil of ignorance”, even its creators would likely have rejected its harshness had they not known on which side of a debt notice they might stand.
Above all, it inverted the purpose of a civil society. Instead of uplifting vulnerable welfare recipients, it put bureaucratic efficiency and revenue first. An apathetic, automated bureaucracy gradually normalised excessive recoupment, with no human decency as a check.
Welfare recipients suffered under inhumane systems they could neither understand nor dispute. But opaque technology does not absolve agencies of their moral duties. They owe society transparency, accountability and the avoidance of unnecessary harm, not the automated accumulation of debts against vulnerable people.
The lessons are clear: while technology can assist administration, unrestrained automation risks dehumanisation. Efficiency means little next to upholding rights and social welfare. Welfare recipients deserve explanations, not an opaque techno-bureaucracy. Ethics must guide governance systems, or moral failures creep in.
Robodebt valued fiscal targets over welfare recipients’ humanity. It inflicted gross indignity and anguish through its debt recovery crusade. Losing sight of social uplift in the pursuit of statistics points to profound institutional failure. Such normalised cruelty serves as a dire warning about ethics and technology.
Most importantly, Robodebt eroded the social licence between the government and the governed. That licence depends on trust that institutions act in society’s best interests. Callously automating welfare into a blunt revenue-extraction machine degraded this public faith.
Conclusion
Society expects better than algorithmic indifference to suffering. Public service technology devoid of moral guardrails risks trampling basic decency. Institutions must uphold welfare and justice over efficiency. Failing in that duty of care undermines the social licence and threatens an organisation’s legitimacy.
Recovering from this reputational damage requires reaffirming moral priorities. Public trust hinges on proving that bureaucratic systems can once again serve welfare recipients with compassion and dignity. Institutions can only redeem their social licence to operate by elevating people over procedures. It begins with leaders reasserting the core human values on which civilisation depends.
Robodebt provides a stark example of what happens when public service bureaucracies prize speed over societal obligations. Efficiency means little if achieving it through cruelty betrays basic decency. All complex systems need moral guardrails to prevent the dehumanising banality of bureaucratic evil.
Ethical foundations matter most of all.
#robodebt #sociallicense #ethics #moralagency #Utilitarianism #Virtueethics #Deontology #artificialintelligence #ai #centrelink #socialwelfare #stakeholderengagement