Developing an Audit Framework for Algorithms

by Esther Meijer-van Leijsen PhD, Justin Verhulst MSc, Pieter Oosterwijk PhD, and Miranda Pirkovski MSc RA EMITA, Netherlands Court of Audit

The steady increase in the Dutch public sector’s use of algorithms—sets of rules and instructions that a computer follows to solve problems or answer questions—has generated a corresponding demand for oversight. Public debate, media coverage, and discussions in parliament have intensified, especially after a district court in The Hague agreed with several non-governmental organizations (NGOs) that SyRI (an algorithm-based fraud detection instrument) was in breach of provisions of European law.

The growing use of algorithms for managing operations and delivering services within government poses challenges for Supreme Audit Institutions (SAIs) around the world. Algorithms can make government operations opaque, or a “black box.” With limited guidelines available, how can SAIs assess whether governments are using algorithms in a responsible and lawful manner?

With its January 2021 report “Understanding Algorithms,” the Netherlands Court of Audit (NCA) made a practical contribution to the debate about the opportunities and risks associated with the use of algorithms by government. The report: 1) provided insights into the extent to which Dutch government entities were aware of and in control of the algorithms used within their departments; 2) presented an audit framework that can be used to assess concrete risks; and 3) tested the usefulness of the audit framework.

The ultimate aim of NCA’s project was to lay the groundwork for the responsible use of algorithms by government. This article describes the report’s findings and impact.

Mostly Simple Algorithms Currently in Use

NCA found that the predictive and prescriptive algorithms the Dutch government currently uses to make decisions affecting citizens and businesses are relatively simple. The government uses these algorithms to manage operations and provide services, such as automated mass mailings and the initial selection of benefit applications. Some of the algorithms, however, are more innovative and involve artificial intelligence. On request, ministries and implementing organizations such as the Employee Insurance Agency and the Social Insurance Bank provided NCA with dozens of algorithms for its audit. Among these, NCA did not find any fully self-learning algorithms, i.e., algorithms that implement policy without human intervention.
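To give a rough sense of what such a relatively simple, rule-based algorithm might look like, the sketch below triages hypothetical benefit applications and only routes them for human handling; the field names, thresholds, and routing labels are illustrative assumptions, not drawn from any actual government system.

```python
# Hypothetical sketch of a simple, rule-based ("prescriptive") algorithm of the
# kind described above: it only flags benefit applications for routing and never
# takes a final decision itself. All field names and thresholds are illustrative
# assumptions, not taken from any actual Dutch government system.

def triage_application(application: dict) -> str:
    """Return a routing label; a human case worker always makes the final decision."""
    if not application["documents_complete"]:
        return "return-to-applicant"      # missing paperwork
    if application["declared_income"] > 40_000:
        return "manual-review"            # income above an illustrative threshold
    if application["previous_rejections"] >= 2:
        return "manual-review"            # history warrants a closer look
    return "fast-track"                   # straightforward case, still approved by a person


if __name__ == "__main__":
    example = {
        "documents_complete": True,
        "declared_income": 23_500,
        "previous_rejections": 0,
    }
    print(triage_application(example))    # -> "fast-track"
```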

One challenge NCA faced in conducting this audit was to define its terms clearly. Terms like “black box,” “bias,” and even “algorithm” can be defined in substantially different ways by individuals with a background in law, governance, information technology (IT), or data science. NCA did not wish to delve too deeply into academic debates about definitions. However, it was important to have a clear and concrete understanding of what was being audited and what the quality standards were. To that end, NCA organized brainstorming sessions in which professionals from all levels of government, as well as academia and audit organizations, reached a shared understanding of these terms.

Audit Framework for Algorithms

NCA has developed an audit framework that government and private-sector organizations can use to assess whether their algorithms meet specified quality criteria and whether the attendant risks have been properly identified and mitigated. NCA intends the audit framework to serve as a practical tool and a starting point for addressing the challenges auditors face in assessing algorithms. The framework is described in the report and available in Excel format, allowing users to filter for questions that relate to specific categories and principles, such as fairness and accountability.
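As a rough illustration of how such filtering might work in practice, the sketch below tags a few invented framework questions by pillar and principle and selects those related to fairness; the questions and tags are assumptions made for illustration and are not taken from the actual framework.

```python
# Minimal sketch of filtering a question list by pillar and principle, in the
# spirit of the Excel framework described above. The questions, pillar names,
# and principle tags below are illustrative assumptions, not the framework's
# actual contents.

FRAMEWORK = [
    {"pillar": "Governance and accountability",
     "principle": "accountability",
     "question": "Is ownership of the algorithm formally assigned?"},
    {"pillar": "Model and data",
     "principle": "fairness",
     "question": "Has the training data been checked for bias?"},
    {"pillar": "Ethics",
     "principle": "fairness",
     "question": "Can affected citizens contest an outcome?"},
]

def filter_questions(pillar=None, principle=None):
    """Return framework questions matching the requested pillar and/or principle."""
    return [row for row in FRAMEWORK
            if (pillar is None or row["pillar"] == pillar)
            and (principle is None or row["principle"] == principle)]

for row in filter_questions(principle="fairness"):
    print(f'{row["pillar"]}: {row["question"]}')
```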

To develop the framework, NCA incorporated input from experts and drew on existing audit frameworks. For example, the General Data Protection Regulation (GDPR)—a European Union (EU) law—already provides a framework for handling sensitive personal data, and governance and IT general control (ITGC) frameworks are also available. The framework assesses an algorithm on the following “pillars”:

  • Governance and accountability
  • Model and data
  • Privacy
  • Quality of ITGC, such as access rights and back-up controls
  • Ethics

Developing the “model and data” pillar proved to be the most difficult task, as most of the innovations in the use of algorithms fall within this category. The model and data criteria deal with questions about data quality and the development, use, and maintenance of the model underlying the algorithm.
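By way of illustration only, the sketch below shows the kind of elementary data quality checks such questions point to, such as completeness of fields and the distribution of outcomes; the record structure and field names are assumptions and are not part of NCA's framework.

```python
# Illustrative sketch (not part of NCA's framework) of basic data quality checks
# of the kind the "model and data" pillar asks about: completeness of fields and
# a rough view of outcome balance. Field names and records are assumptions.

from collections import Counter

def basic_data_quality_report(records, outcome_field="decision"):
    """Summarize missing values and outcome distribution for a list of dict records."""
    total = len(records)
    missing = Counter()
    outcomes = Counter()
    for row in records:
        for key, value in row.items():
            if value is None:
                missing[key] += 1
        outcomes[row.get(outcome_field)] += 1
    return {
        "records": total,
        "missing_per_field": dict(missing),
        "outcome_distribution": {k: v / total for k, v in outcomes.items()},
    }

sample = [
    {"age": 34, "income": 21000, "decision": "granted"},
    {"age": None, "income": 55000, "decision": "rejected"},
    {"age": 47, "income": 18000, "decision": "granted"},
]
print(basic_data_quality_report(sample))
```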

There is legitimate public concern about the unethical use of algorithms—such as biased autonomous decision-making—but this issue is often discussed in a theoretical manner. For its framework, NCA drew on the European Commission’s Ethics Guidelines for Trustworthy AI, formulated by the High-Level Expert Group on Artificial Intelligence. NCA linked these guidelines to the concrete risks articulated in the other pillars.

Practical Test on Three Algorithms

To determine the usefulness of its new audit framework, NCA tested it on three algorithms the government was using at the time. For this exercise, NCA selected algorithms that had a substantial impact on citizens, raised ethical questions, and varied in technical complexity, from relatively simple (decision trees) to complex (neural networks for facial recognition). Although the algorithms were all quite different, NCA was able to assess their risks using the audit framework. One valuable lesson NCA learned was that an algorithm is not isolated from other IT processes within government entities; rather, a collection of algorithms is used by different people at various stages of a process. NCA therefore recommends that teams auditing algorithms be multidisciplinary, so they can divide up tasks and obtain a complete picture.

Impact of the Report

NCA was surprised by the national and international publicity its report garnered. Two days after publishing the report, NCA organized a webinar to present the audit’s main findings and start a discussion on the responsible use of algorithms by government. The webinar, whose speakers included the National Ombudsman and representatives from the Council of State, Radiocommunications Agency, and Central Government Audit Service (ADR), attracted over 300 participants.

NCA is now leading discussions on creating checks and balances for algorithms. NCA is working with other audit organizations—such as ADR, the Royal Netherlands Institute of Chartered Accountants (NBA), and NOREA, the professional association for IT auditors in the Netherlands—that are eager to start auditing algorithms but have struggled with the lack of methodologies and guidelines. NCA’s framework has been well received by these organizations as a valuable starting point. NCA has also been invited by government ministries and policy organizations to discuss checks and balances for the responsible use of algorithms, and has presented its research to auditors from other SAIs. NCA is looking forward to continuing these valuable discussions.

Feedback

NCA invites other SAIs to use the framework to audit algorithms and share their experiences and feedback with the authors at algoritmes@rekenkamer.nl.
