
Strengthening Algorithmic Transparency in the United Kingdom

Natalia Domagala

This piece is part of OGP’s “Open Algorithms Blog Series.” Read other posts in the series here.

Algorithmic transparency means different things to different people, but it can be broadly understood as openness about the purpose, structure, and underlying actions of algorithms used to process information. In the United Kingdom (UK), our current algorithmic transparency work builds on the long-standing efforts of the open data movement, including the “open by default” policy introduced in 2012, as well as developments in data ethics such as the Data Ethics Framework and the Guide to using AI in the Public Sector. However, there has been no consistent, systematic approach to algorithmic transparency mechanisms, which is what we decided to address at the national level in the UK.

When the Central Digital and Data Office in the UK (where I work) set out to explore which methods of algorithmic transparency would be most appropriate for the UK, we knew that we wanted to develop this work as collaboratively as possible. We began by asking external experts from civil society and academia what information on the use of algorithm-assisted decisions in the public sector they would like to see published, and in what format. We then ran an interactive workshop with colleagues from across government to scope whether the information experts would like to see is currently being collected and could be published as part of our algorithmic transparency measures. Once we had a shortlist of the categories we could make available, we considered how to balance the trade-off between depth and accessibility. We wanted to provide as much detail as possible, whilst ensuring that any algorithmic transparency mechanism we designed was practical and increased public understanding of how algorithmic tools are used in the public sector.

We decided to run a deliberative public engagement exercise with the Centre for Data Ethics and Innovation (CDEI) and BritainThinks to understand public attitudes towards algorithmic transparency in the public sector. We chose a deliberative process after initial CDEI surveys revealed low levels of public awareness of the subject; the approach gradually builds up participants’ understanding and knowledge of how algorithms are used in the public sector. We focused on three use cases chosen to test a range of emotive responses: policing, parking, and recruitment. For this study, 36 members of the public were recruited from across the UK, reflecting a mix of ages, genders, ethnicities, and socioeconomic groups, varying levels of trust in institutions and digital literacy, and experience of one of the three use cases in the last six months. We spent over five hours engaging with them over a three-week period. The final stage was an in-depth co-design session, where participants worked collaboratively to create prototypes of an approach to algorithmic transparency that reflected their needs and expectations.

The current version of our Algorithmic Transparency Standard is based directly on these engagement exercises with colleagues working on data and AI across the UK government, experts in civil society and academia, and the public. One of the recommendations from the deliberative public engagement study was to divide the standard into two tiers: tier one with basic information, aimed at non-expert audiences, and tier two with more detail for those who want to learn more. We implemented this recommendation. The current version of the standard begins with tier one, which includes a short and simple explanation of how and why the algorithmic tool is being used, as well as instructions on how to find out more information. Tier two is divided into five categories, illustrated in the sketch after this list:

  • First, information about ownership of and responsibility for the tool, including contact details for the team managing the tool and senior responsible owners, as well as information on any external suppliers.
  • Second, the scope of the tool, technical specifications, justification for its use, and details on what it’s been designed for and what it’s not intended for. 
  • Third, an overview of how the tool affects decision making, through an explanation of how it is integrated into the process and what influence it has on decisions.
  • Fourth, a list and description of datasets used to train the model and datasets the model is or will be deployed on, with additional information on the data collection process, data sharing agreements, and details on who has access to data. 
  • Fifth, the impact assessments conducted in the process of developing the tool, as well as a detailed description of common risks for the tool and steps taken to mitigate them. 
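To make the shape of the standard easier to picture, below is a minimal sketch, assuming a Python-style representation, of how a single transparency record might be organised around the two tiers and five categories described above. The class and field names are illustrative assumptions for this post, not the official field names of the standard.

```python
# Illustrative sketch only: these classes and field names are assumptions
# made for this blog post, not the official schema of the UK Algorithmic
# Transparency Standard.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class TierOne:
    """Short, plain-language summary aimed at non-expert audiences."""
    how_the_tool_works: str
    why_the_tool_is_used: str
    where_to_find_more_information: str


@dataclass
class TierTwo:
    """More detailed information, grouped into the five categories above."""
    # 1. Ownership and responsibility
    owning_team_contact: str
    senior_responsible_owner: str
    external_suppliers: List[str] = field(default_factory=list)
    # 2. Scope, technical specification, and justification
    scope_and_justification: str = ""
    intended_and_out_of_scope_uses: str = ""
    # 3. How the tool affects decision making
    integration_into_decision_process: str = ""
    influence_on_decisions: str = ""
    # 4. Datasets used for training and deployment
    training_datasets: List[str] = field(default_factory=list)
    deployment_datasets: List[str] = field(default_factory=list)
    data_collection_sharing_and_access: str = ""
    # 5. Impact assessments, risks, and mitigations
    impact_assessments: List[str] = field(default_factory=list)
    risks_and_mitigations: str = ""


@dataclass
class TransparencyRecord:
    """One published record: tier one is always present, tier two adds depth."""
    tool_name: str
    tier_one: TierOne
    tier_two: Optional[TierTwo] = None


# Hypothetical example record, for illustration only.
record = TransparencyRecord(
    tool_name="Parking permit triage tool",
    tier_one=TierOne(
        how_the_tool_works="Ranks permit applications for manual review.",
        why_the_tool_is_used="To process applications more quickly and consistently.",
        where_to_find_more_information="Contact details are listed in tier two.",
    ),
)
```

The split mirrors the recommendation from the deliberative study: tier one stays short enough to be readable on its own by a non-expert audience, while tier two holds the detail that expert reviewers asked for.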

Throughout the process of developing the standard, I have been a member of the Open Government Partnership’s Open Algorithms Network. This unique opportunity to discuss the latest developments and challenges in algorithmic transparency with colleagues working on the same issues in governments around the world has been invaluable, as have our honest and open conversations about common problems and possible ways forward.

We are currently finalising our fifth national action plan, and one of the working groups set up to deliberate on the next set of commitments has been focusing on open algorithms. Jointly with representatives from civil society, we developed a draft commitment exploring actions that would increase the accountability of decisions made with algorithmic tools. We will continue to engage with civil society and develop the next steps for this work beyond the initial publication of the national action plan.
