A failure of artificial intelligence – or bureaucratic bastardry?

By Professor Adam Graycar and Dr Adam Masters.

Automation in public administration is inevitable and can bring great benefits. The broadly accepted first law of robotics is that a robot may not injure a human being.

In an attempt to reduce welfare costs in 2016, the commonwealth government engaged in an unlawful debt recovery process. The bureaucratic process was malign and was meant, either directly or collaterally, to harm and stigmatise welfare recipients. The Online Compliance Intervention – or OCI, but more commonly known as robodebt – used algorithms to average out the incomes of welfare recipients by matching annual ATO income data against the income recipients had self-reported to Centrelink. Incomes fluctuated over time, making recipients eligible for welfare payments in low-earning periods, yet many were issued with debt notices. The onus of proof was firmly placed on welfare recipients to prove their innocence.
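The averaging flaw can be sketched in a few lines. This is an illustrative model only, not the actual Centrelink system: the payment rate, income-free area, taper rate and incomes below are hypothetical round numbers, chosen just to show how smearing an annual income evenly across fortnights attributes earnings to periods when a person had none.

```python
# Hypothetical parameters — not the real Centrelink rates.
FORTNIGHTS_PER_YEAR = 26
BASE_PAYMENT = 550       # fortnightly payment rate
INCOME_FREE_AREA = 300   # income a recipient may earn before payments reduce
TAPER_RATE = 0.5         # payment reduced 50c per dollar earned over the threshold

def entitlement(fortnightly_income):
    """Payment due for one fortnight, given income actually earned that fortnight."""
    excess = max(0, fortnightly_income - INCOME_FREE_AREA)
    return max(0, BASE_PAYMENT - TAPER_RATE * excess)

# A person claims benefits only while unemployed (13 fortnights, zero income)
# and works at $1,200/fortnight for the other 13, claiming nothing.
claim_fortnights = [0] * 13
annual_income = 1200 * 13            # $15,600 — the single figure the ATO reports

# Correct assessment: the claimed fortnights genuinely had zero income.
paid = sum(entitlement(i) for i in claim_fortnights)              # $7,150, correctly paid

# Robodebt-style assessment: smear the annual income evenly over 26 fortnights,
# so each claimed fortnight falsely appears to have $600 of income.
averaged = annual_income / FORTNIGHTS_PER_YEAR
assessed = sum(entitlement(averaged) for _ in claim_fortnights)   # $5,200

phantom_debt = paid - assessed       # $1,950 "owed" despite accurate reporting
```

Under these assumed numbers the recipient was paid exactly what the rules entitled them to, yet the averaged calculation alleges a $1,950 overpayment, which is why the 'debts' had no basis in law.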

A class action involving 430,000 people was heard in the Federal Court, leading in 2020 to an agreed settlement of $1.2 billion to be paid by the commonwealth. The commonwealth paid, but did not admit liability.

Commenting on this matter, ANU Professor of Public Policy Peter Whiteford wrote:
Robodebt resembles a 'policy fiasco', as the outcomes could have been foreseen at the inception of the initiative. But it differs from other examples of policy failures in that it was intentional, and not the result of mistakes in design or implementation. (Whiteford, 2021, p.340)

Was this an artificial intelligence (AI) failure or bureaucratic bastardry? AI expert Professor Anton van den Hengel wrote in an email to the authors:

Automation of some administrative social security functions is a very good idea, and inevitable. The problem with Robodebt was the policy, not the technology. The technology did what it was asked very effectively. The problem is that it was asked to do something daft.

The policy failed tests of lawfulness, impartiality and integrity. It undermined trust between government and the people through its inability to establish a system that correctly identified and reviewed debts owed to the government. The policy was not even cost-effective.

Services Australia’s own calculations showed that between 2016 and 2019, its Income Compliance Program identified about $2 billion in debt, of which $785 million was recovered and repayment arrangements were made for $725 million. It also continued to pursue approximately $500 million outstanding – all at a cost of $606 million (Services Australia, 2019). Add to this the $1.2 billion settlement outlined above, plus ongoing costs associated with various inquiries, and the program did not yield the expected financial result. In fact, it was a financial disaster for the government.
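A back-of-envelope tally makes the point. This sketch uses only the figures cited above (Services Australia, 2019, plus the $1.2 billion settlement), treats the $785 million recovered as a gross inflow, and excludes inquiry costs, for which no figure is given.

```python
# Rough tally of the program's cash position, in millions of dollars,
# using the figures cited in the text. Inquiry costs are excluded.
M = 1_000_000
recovered  = 785 * M    # debts actually recovered, 2016–2019
admin_cost = 606 * M    # cost of running the Income Compliance Program
settlement = 1_200 * M  # agreed Federal Court settlement, 2020

net = recovered - admin_cost - settlement
print(f"Net position: ${net / M:,.0f} million")  # roughly -$1,021 million
```

Even before counting the cost of the inquiries, the scheme is over a billion dollars in the red.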

Given the policy failure that has been widely documented, how could such an act of bureaucratic bastardry occur in a sophisticated government system?

  • Was there evil intent from the outset?
  • Were ministers and bureaucrats pursuing different objectives?
  • Was there too much faith in artificial intelligence?

First, even though the policy was malign, there is no evidence of evil intent from the outset. From a policy perspective, the results clearly illustrate an ill-conceived scheme reliant on flawed interpretations of the law, which led to an unprecedented and systemic application of bureaucratic bastardry upon the most vulnerable Australians.

Second, it appears both government ministers and bureaucrats were pursuing the same objectives – recovery of revenue – despite many of the ‘debts’ being nothing more than a mathematical fantasy with no basis in law.

Third, blaming the machine ignores the well-established principle of GIGO – garbage in, garbage out. Artificial intelligence can undertake complex tasks, and even independently develop better ways of doing such tasks, but this was not the case with robodebt. The computers were tasked with performing complex calculations, but they were tasked outside the legal framework, flying in the face of good-governance values.

This article was originally published on 23 September 2021.
