A Law of Averages?

2023-03-01

Robodebt and Algorithmic Accountability

By Mark Brogan, volunteer with The Greens (WA) and formerly of ECU. He advocates on environment and heritage, and made a submission to the Robodebt Royal Commission.

Background

Douglas Adams’s Hitchhiker’s Guide to the Galaxy begins with the destruction of planet Earth. The demolition was the work of an interstellar ‘constructor fleet’ commanded by an alien race known as the Vogons. Despised as “one of the most unpleasant races in the galaxy, but not actually evil”, the Vogons were notorious for their bad temper, bureaucracy, officiousness, and lack of empathy.[1]

They were also the bureaucrats of the galaxy, a managerial class responsible for planning and works. Earth had to be destroyed to make way for a hyperspace expressway. Any Earthling aggrieved by the Earth’s planned destruction could appeal the approval at a local planning department in the nearby star system of Alpha Centauri, more than four light years from Earth. The Guide notes that humans were allowed fifty years in which to lodge a complaint. But no complaint was ever made.

The Vogons would have appreciated, and may even have been a little envious of, the Australian Government’s automated scheme to recover welfare debt, colloquially known as Robodebt. Operating in various configurations from 2015 to 2019, Robodebt was the Turnbull and Morrison Governments’ program to recover welfare overpayments and to defeat welfare fraud. Public revelations of the human consequences of wrongly calculated debts led Greens Senator Rachel Siewert to call for a Royal Commission into the scheme in 2020.[2] A Royal Commission into the Robodebt Scheme was eventually established by the newly elected Albanese Government in August 2022.[3] The final report of the Commission is scheduled to be delivered to the Governor-General by 30 June 2023.

The Robodebt Royal Commission is pioneering in many ways, not least because its proceedings can be watched as a livestream from the Commission’s website. Having begun in September 2022, hearings are now in their final phase. They are revelatory in terms of the antics of the politicians and bureaucrats who conceived and maintained the Robodebt automated debt recovery system.

As Social Services Minister in 2015, Scott Morrison had been responsible for steering Robodebt through the Parliamentary budget process.[4] The budget measure provided $204.9 million over five years to the Department of Human Services (DHS) to improve its capacity to detect and deter welfare fraud and non-compliance.[5] Together with improvements in the capacity of other agencies such as the Office of the Director of Public Prosecutions, the Government expected this investment to produce net savings of $1.5 billion over four years.[6]

The remainder of 2015 was taken up with a Robodebt pilot program targeting debts of selected welfare recipients accrued between 2011 and 2013. In September 2016, the system moved to full online implementation with automated decision making, based on an algorithm that identified prospective cases of overpayment using averaged PAYG income data sourced from the Australian Tax Office (ATO).[7] This data was compared with the income beneficiaries had declared for each benefit period, and a discrepancy was reported as a debt where payments exceeded the entitlement assessed from averaged income. In this iteration, the system was known as the Online Compliance Intervention, or OCI. In December 2016 the Minister for Social Services, Christian Porter, announced that the system was capable of issuing debt notices at the rate of 20,000 per week.[8]
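To make the averaging problem concrete, the sketch below illustrates the core of the OCI logic as described above: annual ATO income is spread evenly across fortnights and compared with the income the recipient actually declared in each benefit period. The figures, function names and the entitlement rule are hypothetical, not drawn from the actual system; the point is only to show how averaging can manufacture a ‘debt’ for someone whose income was lumpy across the year.

```python
# Hypothetical sketch of income-averaging debt calculation (not the actual OCI code).
# Assumption: annual PAYG income is smeared evenly across 26 fortnights and compared
# with what the recipient declared in each fortnight they received a benefit.

FORTNIGHTS_PER_YEAR = 26

def averaged_fortnightly_income(annual_payg_income: float) -> float:
    """Spread annual ATO income evenly across the year (the flawed assumption)."""
    return annual_payg_income / FORTNIGHTS_PER_YEAR

def benefit_payable(fortnightly_income: float) -> float:
    """Toy entitlement rule (hypothetical): $550 base, reduced 50c per dollar earned over $150."""
    reduction = max(0.0, fortnightly_income - 150.0) * 0.5
    return max(0.0, 550.0 - reduction)

def raise_debt(annual_payg_income: float,
               declared_income_by_fortnight: list[float],
               benefit_paid_by_fortnight: list[float]) -> float:
    """Recalculate entitlement using averaged income and report any shortfall as a 'debt'."""
    avg = averaged_fortnightly_income(annual_payg_income)
    debt = 0.0
    for declared, paid in zip(declared_income_by_fortnight, benefit_paid_by_fortnight):
        entitled = benefit_payable(avg)      # uses the average, not the declared amount
        debt += max(0.0, paid - entitled)    # any overpayment relative to the average
    return round(debt, 2)

# Example: someone who earned $26,000 in the first half of the year, then nothing while
# on benefits for 13 fortnights. Averaging implies $1,000 per fortnight all year, so the
# algorithm wrongly reduces their entitlement for the period in which they earned nothing.
if __name__ == "__main__":
    declared = [0.0] * 13          # income actually declared during the benefit period
    paid = [550.0] * 13            # benefit actually (and correctly) paid
    print(raise_debt(26_000.0, declared, paid))  # prints a sizeable, spurious debt
```

On these made-up numbers the recipient was paid correctly in every fortnight, yet the averaged calculation asserts a debt of several thousand dollars; the burden then falls on the recipient to disprove it.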

A law of averages?

In the absence of other data, Robodebt derived benefit period income from averaged annual ATO income data. The work of the Royal Commission has shown that in early 2015, before Robodebt came into operation at scale, it was an open question whether an algorithm that identified debt liability on the basis of income averaging was consistent with legal obligations under the Commonwealth’s Social Security Act. But it would take four years and many damaged lives before the program was halted. Checks and balances in the form of appeals to the Administrative Appeals Tribunal, and an investigation by the Commonwealth Ombudsman, could not stop it.

In the end, in 2019, a death blow to Robodebt was delivered by the Federal Court of Australia in the case of Amato v the Commonwealth.[9] The Court found that the averaging process using ATO income data to calculate debts was indeed unlawful. A bombshell revelation from the work of the Royal Commission is that the Department of Social Services (DSS) was in possession of legal advice as early as 2014 that questioned the lawfulness, under the Social Security Act, of relying on averaged PAYG income data for debt calculation. Its position in relation to this advice has been shown by the Commission to have involved concealment, obfuscation, and an as yet unexplained communications failure.

By the time of its discontinuation, the Commonwealth Government claimed to have asserted debts of $1.7 billion against 453,000 Australians.[10] Many of these debts were incorrectly calculated and distressing to the recipients of debt notices. Some recipients are claimed to have committed suicide wholly or in part as a consequence of receiving debt notices.[11] Colleen Taylor, a former compliance officer with Centrelink called as a witness by the Royal Commission, gave evidence that because many debts were wrongly raised against welfare recipients, the system should be regarded as stealing from them.[12]

The questions of how and why this Vogon-inspired mess happened, and why it was allowed to continue for so long, have exercised the Royal Commission and form the main focus of its investigation. Some of the explanation for why it happened is contextual. The election of the Abbott Government in 2013 came on the back of a zeitgeist in conservative politics around deficits, taxation, and welfare spending. By 2015, Morrison was building electoral capital around the suppression of suspected welfare fraud. Radio shock jocks were feeding community outrage with selected case studies. Enter, stage right, Scott Morrison with his welfare ‘cop on the beat’.[13] In his testimony to the Royal Commission in December 2022, Morrison did not resile from the tough stance taken.

Process failures and systems design mistakes plagued Robodebt from its genesis as a concept in late 2014 to its abandonment years later. Many have been revealed in testimony to the Royal Commission. The DSS practice of treating legal advice with which it was uncomfortable as ‘draft’, and hence not ‘finalized’, was used by senior DSS bureaucrats as cover for not communicating unwelcome advice to Ministers. Records of important interactions between senior DSS and DHS executives, ministerial staff and Ministers appear not to have been kept.

The design of the system itself was not human-centric, displayed poor usability and shifted the costs of compliance to benefit recipients. A debt in favour of the Commonwealth was raised automatically where averaged income was inconsistent with income reported for the benefit period, unless recipients could submit pay slips or other financial records showing that the averaged estimate of income from ATO PAYG data was wrong. Retrospectivity extended across the entire period of receiving welfare, potentially years. The documentation cost burden imposed on recipients was huge. Who keeps every payslip that they have ever received?

Steeped in process automation, anonymity and a burden of proof that was onerous to users, Robodebt tilted the scales of justice in the direction of presumed guilt. It also displayed a total lack of transparency in how debt liability was arrived at. Welfare recipients would be advised in a form letter of income discrepancies. There was no help line and no explanation of how the discrepancy had been calculated. This embedded unfairness was amplified by instructions given to compliance officers. Taylor told the Commission that compliance officers were instructed not to waste time investigating evidence held on file, such as scanned copies of income statements previously supplied by recipients. Also in evidence, Taylor described her efforts in 2017 to bring defects in Robodebt to the attention of senior management, including the Head of the Department of Human Services, Ms Kathryn Campbell AO. These efforts were unsuccessful, with some internal DHS documents suggesting that Taylor was ‘overly sympathetic’ to welfare recipients. In June 2020, with Robodebt fatally wounded by the Federal Court and in the throes of abandonment by the Morrison Government, the Greens and then Labor called for a Royal Commission.[14]

Whatever the Royal Commission concludes about the Vogon-like behaviour of the Ministers and bureaucrats behind the Robodebt saga, the Commission’s legacy will be measured in terms of the effectiveness and durability of the reforms it recommends. This is what will remain after the curtain is called on the livestream theatre of King’s Counsel, squirming Ministers, and bureaucrats. Royal Commissions mostly have a checkered history in terms of durable, meaningful reform. In Robodebt, the failings of Ministers and senior bureaucrats, blame shifting, and the corruption of the public service by Executive power are all writ large in the evidence gathered by the Commission. But these areas are notoriously difficult when it comes to durable reform. The Royal Commission may need to look elsewhere if it wants to leave a lasting legacy.

Techno-solutionism and algorithms

Robodebt is an Australian case study in techno-solutionism based on algorithms, data sets and automated decision making. While Robodebt is being treated by the media as unique, algorithmic abuse of vulnerable people is a global phenomenon. Across a variety of application contexts, poorly designed algorithms and automated decision making have been shown to expose individuals to the risk of violation of their rights. Facial recognition systems have attracted the most attention. A study of the London Metropolitan Police’s suspect recognition system conducted by the University of Essex in 2019 described instances of false positives leading to the wrongful identification of innocent individuals as suspects in crowds. In June 2020, the New York City Chapter of the Association for Computing Machinery (ACM) urged a suspension of private and government use of facial recognition technology because of “clear bias based on ethnic, racial, gender and other human characteristics.” Calo and Citron (2020) recount another case study in Arkansas’ Department of Human Services where:

As a result of the automated system’s dysfunction, severely disabled Medicaid recipients were left alone without access to food, toilet, and medicine for hours on end. Nearly half of Arkansas Medicaid recipients were negatively affected. Obtaining relief from the software-based outcome was all but impossible.[15]

Social media platforms are also taking heat on algorithms. In the wash-up from the January 6, 2021 riots in Washington, Facebook’s news feed algorithm has attracted particular attention for fanning hyper-partisanship, assisting with the propagation of conspiracy theories and “incentivizing politicians to take more divisive stands”.[16]

A new front in the quest to make companies and governments more accountable for their use of algorithms has opened up with the rise of Artificial Intelligence (AI) and machine learning. These developments are transforming the design of automated decision-making systems. Machine learning is a field of inquiry devoted to understanding and building methods that ‘learn’, that is, methods that leverage data to improve performance on some set of tasks. Machine learning algorithms build a model based on sample data, known as training data, to make predictions or decisions without being explicitly programmed to do so.[17] Machine learning systems have become ubiquitous in the last decade, being most commonly embedded in the architecture of social media applications. Increasingly, machine learning is at the heart of automated decision making that affects almost every facet of daily life, including financial, medical and legal decisions.
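As a minimal illustration of the training-data, model and prediction loop described above, the hypothetical sketch below fits a simple classifier to labelled examples and then uses it to score new cases. The data, features and labels are invented for illustration only; nothing here reflects any real compliance system, and the example deliberately shows how a model inherits whatever bias sits in its historical training decisions.

```python
# Minimal, hypothetical illustration of "training data -> model -> prediction".
# Requires scikit-learn. All figures and labels are invented for illustration.
from sklearn.linear_model import LogisticRegression

# Each row: [declared fortnightly income, averaged ATO fortnightly income]
X_train = [
    [0, 1000], [200, 900], [550, 560], [300, 310],
    [0, 50],   [400, 420], [100, 800], [250, 260],
]
# Label assigned by past (possibly flawed) decisions: 1 = "debt raised", 0 = "no debt"
y_train = [1, 1, 0, 0, 0, 0, 1, 0]

model = LogisticRegression().fit(X_train, y_train)   # "learning" from sample data

# The model now scores new cases it was never explicitly programmed to handle.
new_cases = [[0, 950], [500, 510]]
print(model.predict(new_cases))          # predicted decisions, e.g. [1 0]
print(model.predict_proba(new_cases))    # probabilities behind those decisions
```

The point of the sketch is that the decision rule is induced from historical examples rather than written down anywhere, which is precisely why the transparency and audit requirements discussed below matter.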

Might Robodebt have been re-engineered as a machine learning system? Evidence given to the Royal Commission suggests that, as Robodebt took on water in 2017 from the increasing visibility of wrongly calculated debts and customer usability issues, moving the system to a machine learning architecture was under active consideration.[18]

With a growing pile of case studies and calls for action, legislators around the world have been working to address the problems of algorithmic accountability and automated decision making. In the United States in 2022, Senator Ron Wyden and colleagues introduced the Algorithmic Accountability Act.[19] The case for the Bill is described in an explanatory memorandum in the following terms:

When algorithms determine who goes to college, who gets healthcare, who gets a home, and even who goes to prison, algorithmic discrimination must be treated as the highly significant issue that it is. These large and impactful decisions, which have become increasingly void of human input, are forming the foundation of our American society that generations to come will build upon. And yet, they are subject to a wide range of flaws from programing bias to faulty datasets that can reinforce broader societal discrimination, particularly against women and people of color. It is long past time Congress act to hold companies and software developers accountable for their discrimination by automation.[20]

In Europe, recent initiatives have targeted AI and robotics. In 2019 the European Parliament adopted, by an overwhelming vote, a comprehensive European industrial policy on artificial intelligence and robotics.[21] This included recommendations dealing with ethics and governance. The EU’s Digital Services Act (2022), which focuses on services that connect consumers with goods, services and content, includes provisions that relate to algorithmic transparency.[22] An example of action that specifically addresses government use of algorithms is the French Law for a Digital Republic, enacted in 2016. This law extended the principle of information to algorithmic processing:

any person who is the subject of an individual administrative decision taken on the basis of an algorithm must be informed of the fact and may demand access to the algorithm’s main operational rules (its contribution, data used, etc.).[23]

Technology vendors themselves have not been entirely aloof from growing public disquiet over the new frontier of automated decision making with AI and machine learning. Amazon, Microsoft and Google each offer design guidance, software services and tools aimed at promoting accountability and transparency. But without a regulatory framework, abuse of vulnerable people is set to continue.

Conclusion

In its Letters Patent, the Royal Commission is provided with the power to make any recommendations it considers appropriate. Addressing government use of algorithms, AI and automated decision-making systems that affect vulnerable people is clearly within its reporting mandate. However, to arrive at an understanding of the need for such recommendations, the Royal Commission will need to look beyond the predictable discourse on organizational and human failure that has characterized its work so far. This work is important, but the techno-solutionism behind Robodebt also needs to be investigated. Recommendations for reform are needed that address bias, transparency and usability in the use of technologies that are key enablers of automated decision making. If the Royal Commission succeeds at this, an important step will have been taken towards avoiding a recurrence of Robodebt. An algorithm ‘cop on the beat’ could prove an unexpected, but valuable, legacy.

[Opinions expressed are those of the author and not official policy of Greens WA]

ENDNOTES

[1] Wikipedia. (2022). The Hitchhiker's Guide to the Galaxy.  Retrieved from: https://en.wikipedia.org/wiki/The_Hitchhiker%27s_Guide_to_the_Galaxy

[2] Gomes, L. (2020). Calls for royal commission into robodebt and apology from Morrison government. The Guardian, 2 June 2020. Retrieved from: https://www.theguardian.com/australia-news/2020/jun/02/calls-for-royal-…

[3] Commonwealth of Australia. Prime Minister of Australia. (2022). Establishment of a Royal Commission into Robodebt. Retrieved from: https://www.pm.gov.au/media/establishment-royal-commission-robodebt

[4] Commonwealth of Australia. House of Representatives. (2015). Parliamentary Debates. Appropriation Bill (No. 1) 2015-2016. Speech.

[5] Arthur, D. (2015). Changes to welfare system compliance and ICT systems.  Budget Review 2015-16 Index.  Retrieved from: https://www.aph.gov.au/About_Parliament/Parliamentary_departments/Parli…

[6] Commonwealth of Australia. (2015). Budget 2015: Fairness in Tax and Benefits.  p.15. Retrieved from: https://archive.budget.gov.au/2015-16/glossy/Tax-and-Benefits.pdf

[7] Commonwealth Ombudsman. (2017). Centrelink’s automated debt raising and recovery system: A report about the Department of Human Services Online Compliance Intervention system for debt raising and recovery. p. 6. Retrieved from: https://www.ombudsman.gov.au/__data/assets/pdf_file/0022/43528/Report-C…

[8] Martin, S. (2016). Welfare debt squad hunts for $4bn. The Australian, 5 December 2016. Archived from the original on 9 February 2021. Retrieved 30 May 2020.

[9] Legal Aid Victoria. (2020). Robo-debts.  Retrieved from: https://web.archive.org/web/20210209080156/https://www.legalaid.vic.gov…

[10] Royal Commission into the Robodebt Scheme. (2022). Public Hearing Day 1 Transcript. p. 11. Retrieved from: https://robodebt.royalcommission.gov.au/system/files/2022-11/transcript…

[11] Gomes, L. (2022). Robodebt Royal Commission to review ‘untold harm’ caused by Coalition’s botched scheme. The Guardian, 25 August 2022. Retrieved from: https://www.theguardian.com/australia-news/2022/aug/25/robodebt-royal-c…

[12] Gomes, L. (2022). Centrelink worker recounts ‘callous indifference’ from superiors after raising alarm about Robodebt.  The Guardian, 14 December 2022. Retrieved from: https://www.theguardian.com/australia-news/2022/dec/14/centrelink-worke…

[13] Gomes, L. (2022). How Morrison launched Australia’s ‘strong welfare cop’ – and the pain robodebt left in its wake.  The Guardian, 17 December 2022. Retrieved from: https://www.theguardian.com/australia-news/2022/dec/17/how-morrison-lau…

[14] Gomes, L. (2020). Calls for royal commission into robodebt and apology from Morrison government.  The Guardian, June 2, 2020. Retrieved from: https://www.theguardian.com/australia-news/2020/jun/02/calls-for-royal-…

[15] Citron, K. and Calo, R. (2020).  The Automated Administrative State: A Crisis of Legitimacy.  Retrieved from: https://scholarship.law.bu.edu/cgi/viewcontent.cgi?article=1835&context…

[16] Oremus, W., Alcantara, C., Merrill, J. and Galotcha, A. (2021). How Facebook shapes your feed. The Washington Post, October 26, 2021. Retrieved from: https://www.washingtonpost.com/technology/interactive/2021/how-facebook…

[17] Wikipedia. (n.d.). Machine learning. Retrieved from: https://en.wikipedia.org/wiki/Machine_learning

[18] Royal Commission into the Robodebt Scheme. (2022). Exhibit 1688.

[19] Wyden, R. (2022). Wyden, Colleagues Renew Request for FDA to Address Concerns about Dangerous Pulse Oximeter Inaccuracies Affecting Communities of Colour.  Retrieved from: https://www.wyden.senate.gov/news/press-releases/wyden-colleagues-renew…

[20] Chu, K. (2022). Wyden, Booker, and Clarke Introduce Algorithmic Accountability Act of 2022 to Require New Transparency and Accountability for Automated Decision Systems.  Retrieved from:  https://www.wyden.senate.gov/news/press-releases/wyden-booker-and-clark…

[21] European Parliament. Legislative Observatory. (2019). 2018/2088(INI), 12/02/2019. Retrieved from: https://oeil.secure.europarl.europa.eu/oeil/popups/summary.do?id=157336…

[22] European Commission. (2022). Questions and Answers: Digital Services Act.  Retrieved from: https://ec.europa.eu/commission/presscorner/detail/en/QANDA_20_2348

[23] Open Government Partnership. (n.d.). Transparency of Public Algorithms (FR0035).  Retrieved from https://www.opengovpartnership.org/members/france/commitments/FR0035/