SUMMARY
Governments worldwide are increasingly relying on algorithmic tools to determine eligibility for benefits, detect fraud, and allocate resources. In the UK, government departments use algorithms in tax, welfare, criminal justice, immigration, and social care. The UK now requires central government departments to disclose their use of algorithmic tools, promoting greater transparency and aiming to build public trust in automated decision-making.
Governments worldwide are increasingly relying on algorithmic tools to determine eligibility for benefits, detect fraud, and allocate resources. Algorithmic tools have the potential to enhance fairness, efficiency, and effectiveness in governance. However, the growing complexity and opacity of algorithmic systems have raised concerns about transparency and accountability, particularly in the public sector.
However, high-profile controversies, such as the 2020 A-level grading algorithm, have exposed the risks of algorithmic tools and the consequences of losing public trust.
Widespread use of algorithmic tools
In the UK, government departments use algorithms in tax, welfare, criminal justice, immigration, and social care. Algorithmic tools have the potential to revolutionize public service delivery, driving greater efficiency and more streamlined processes.
Rethinking Policy for Algorithmic Transparency
As algorithmic tools become more prevalent, so too must scrutiny of their implementation and outcomes. This growing concern has prompted the UK government to review algorithmic decision-making within its departments and agencies.
To enhance algorithmic transparency and build trust, the government developed policy frameworks aimed at promoting public understanding of how these tools are used. A key development was the Centre for Data Ethics and Innovation's (CDEI) [1] Review into Bias in Algorithmic Decision-Making, which made recommendations across several sectors, including a call for greater transparency from public sector organizations.
In line with its overall open government ambitions, the government committed to algorithmic transparency and accountability in its fifth Open Government Partnership (OGP) national action plan (2021–2023).
The result? The Government Digital Service (GDS) launched the Algorithmic Transparency Recording Standard (ATRS) in November 2021, creating a standardized way of recording and sharing information about how the public sector uses algorithmic tools.
Developing the ATRS
The UK government followed a multi-stakeholder approach when developing the ATRS, involving consultations with external experts and government professionals, research and engagement with other countries that have developed similar standards, and the collection of insights from the public to explore their understanding of algorithmic transparency.
Insights revealed that while the public may not actively seek out information on algorithmic transparency, they responded positively to the availability of publicly recorded information. Members of the public felt that such information should be accessible to more specialized audiences who could scrutinize it on their behalf.
Gavin Freeguard, Policy Associate at Connected by Data, explains that “even if members of the public are not directly reading algorithmic transparency records themselves, civil society organisations and others can use them to understand where and how algorithmic tools are being used to support decisions, and communicate that to the public.”
Capturing Data: The ATRS in Practice
Following the extensive research period, the government created an ATRS template for public officials to report on an algorithm’s use, how it works, and its impact on decision-making.
The Algorithmic Transparency Lead at the Department for Science, Innovation and Technology explains that the template records two tiers of data in a central algorithmic transparency register.
She said, “Tier one is a concise, high-level description of what the algorithmic tool does and why it is used. Tier two then goes into much more detail around the technical specifications, the data used, risks and mitigations and procurement details.” In other words, the tiered structure makes it easier for members of the public to find the level of detail they want.
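The two-tier structure described above can be sketched as a simple data model. The following is an illustrative sketch only; the field names are hypothetical and do not reproduce the official ATRS schema.

```python
from dataclasses import dataclass, field, asdict


@dataclass
class TierOne:
    """Concise, high-level description: what the tool does and why it is used."""
    tool_name: str
    description: str
    purpose: str


@dataclass
class TierTwo:
    """Detailed information: technical specs, data, risks, procurement."""
    technical_specification: str
    data_used: list[str] = field(default_factory=list)
    risks_and_mitigations: list[str] = field(default_factory=list)
    procurement_details: str = ""


@dataclass
class TransparencyRecord:
    tier_one: TierOne
    tier_two: TierTwo

    def public_summary(self) -> dict:
        """Return only the high-level tier, aimed at general readers."""
        return asdict(self.tier_one)


# Example entry for a hypothetical (invented) tool
record = TransparencyRecord(
    tier_one=TierOne(
        tool_name="Benefit Eligibility Checker",
        description="Rules-based tool that flags applications for manual review.",
        purpose="Speed up routine eligibility checks.",
    ),
    tier_two=TierTwo(
        technical_specification="Deterministic rules engine, no machine learning.",
        data_used=["application form fields"],
        risks_and_mitigations=["False flags; every flag is reviewed by a caseworker."],
    ),
)

print(record.public_summary()["tool_name"])
```

Separating the two tiers in the model mirrors the design intent: a reader can stop at the summary, while specialists can drill into the detailed tier.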
Getting Transparency Right: Challenges for Integrating the ATRS Hub across Central Government
While the ATRS began life as a voluntary standard, that changed in 2024. In its response to the consultation on the 2023 AI White Paper [2], the government made the ATRS mandatory, requiring all central government departments in England to register their algorithmic tools on the ATRS Hub.
Elena Hess-Rheingans, Data Ethics Lead in the Transparency and Data Ethics Team at the Government Digital Service, which oversees the ATRS, was responsible for ensuring departments could adopt and use the standard effectively.
“The government response to the AI White Paper consultation secured ministerial backing across departments and all levels of the civil service,” she explains. “However, just because something is agreed in a meeting doesn’t mean teams will immediately submit their ATRS records. So, we developed an engagement plan, breaking it into phases. We first targeted ministerial and non-ministerial departments and are now prioritizing central government and high-priority arms-length [3] bodies.”
Yet, as Hess-Rheingans acknowledges, a mandate alone doesn’t drive immediate action. Departments and arms-length bodies must sift through vast amounts of data to ensure compliance—a process requiring support, cultural shifts, and time. This is reflected in the fact that only 59 algorithmic tools have been entered into the public register at the time of writing.
For the UK to achieve true transparency in the public sector, civil servants must recognize their responsibility for data recording, accountability in decision-making, and the ethical use of AI and algorithmic data to serve taxpayers and citizens effectively.
To facilitate this, the team is collaborating with chief data officers to implement the Standard by mapping algorithmic tools, completing required entries, and identifying new tools for inclusion.
Future Goals for Rollout
At present, the ATRS mandate does not cover local government in England, although some councils have chosen to submit records for their algorithmic tools. The devolved nations of Scotland, Wales, and Northern Ireland are responsible for their own policies in this area, with Scotland committing to a similar initiative in its current OGP action plan.
Algorithmic tools could make public service delivery markedly more efficient and streamlined. However, maintaining public trust throughout this transformation is crucial.
Freeguard welcomes the use of the ATRS, but he notes that “it is not sufficient in earning trust in the government’s use of AI.” He explains how “politicians and public servants need to demystify what they mean when they talk about ‘AI’ and how they think particular tools will help solve particular problems. Governments need to engage different communities to understand how they feel about the use of AI, and their limits on how it should be used.”
The ATRS gives the UK government a proactive way to stay ahead of emerging technologies and developments. In January 2025, the new government launched the AI Opportunities Action Plan, a set of 50 recommendations intended to put the UK on the path to being a world leader in AI and an attractive destination for investment. Achieving that ambition, however, will require genuine transparency, including in the public sector.
[1] Following changes in the UK government since the start of the project, the CDEI has become the Responsible Data and AI team at the Government Digital Service in the Department for Science, Innovation and Technology.
[2] This was published under Prime Minister Sunak’s Conservative government, which ran from 2022 to 2024.
[3] In the UK, arms-length bodies are public bodies that operate with a degree of autonomy from the central government, but are still accountable to UK government ministers.