
Five Stars? Developing a National Action Plan Review Tool

Tim Hughes

Last week, civil society advocates gathered in The Hague to review and conclude the pilot phase of the Civil Society National Action Plan Review tool. The project's primary aim is to equip national OGP actors with an advocacy tool that helps them push for a stronger partnership and more ambitious plans by assessing the OGP process through a civil society lens. The combined results can also help improve the overall OGP mechanism. Tim Hughes from the UK-based CSO Involve summarizes the main findings of the event below.

The central mechanism of the Open Government Partnership is, of course, the National Action Plan. It presents the advocacy opportunity for reformers inside and outside governments to secure reforms. And it is against the action plan that a government can be held accountable for the implementation of its commitments.

If we’re to increase the impact of the OGP, the obvious place to start is with the quality of those plans. More specifically, creating incentives to develop action plans in an open and inclusive way, and disincentives to make vague, irrelevant and/or unambitious commitments.

The Independent Reporting Mechanism has a clear role to play in increasing this quality, but so too does civil society. CSOs in a number of countries have already produced their own assessments of their government’s approach to the OGP, the commitments that have been made, and the extent to which they’ve been fulfilled.

For the past 18 months we’ve been exploring whether it would be possible to complement these national initiatives by developing a common methodology and framework for assessing action plans, owned and conducted by civil society.

We’ve had in mind two objectives:

  • To provide an advocacy tool to support the development of a dialogue and partnership process between government and civil society.
  • To inform the improvement of the overall OGP mechanism.

Early last year we developed an alpha version of just such a tool. The tool consists of a range of questions to assess how an action plan was developed, and the quality and ambition of the commitments it contains. Together these questions are intended to provide an immediate and detailed picture of the quality of an action plan, assessed against a common framework. But just as importantly, they set out the building blocks needed for a strong action plan process and commitments.

The majority of questions use a four-point rating scale, with guidance on choosing the appropriate rating, along the lines of:

1. To what extent was the information that the government made available prior to the start of the public consultation process sufficient to understand how it would work?

  • To a large extent (the information provided was of good detail and quality)
  • To a moderate extent (the information provided was of average detail and quality)
  • To some extent (the information provided was of poor detail and quality)
  • Not at all (the government provided no information to explain the public consultation process)

In the original design, a lead civil society organization conducts a preliminary review, which is shared with other engaged CSOs for comment. The review is then shared with the relevant OGP lead for comments from the government. After each comment period the lead CSO revises the review based on the comments received, but has discretion over what to take on board. The final version of the review is published, with comments from CSOs and government included.

The standard framework allows for ratings to be presented for a government’s performance on specific aspects of the review, and an overall score calculated. Scores for each question in the review are weighted according to both how challenging and how important they are to complete.
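To make that concrete, here is a minimal sketch in Python of how a weighted overall score could be calculated. The 0–3 mapping of the rating scale, the question weights and the percentage formula are illustrative assumptions of mine, not the tool's actual methodology.

```python
# A minimal sketch of weighted scoring for the review tool.
# The numeric values and weights below are illustrative assumptions.

# Map the four-point rating scale to numeric scores (assumed 0-3 scale).
RATING_SCORES = {
    "To a large extent": 3,
    "To a moderate extent": 2,
    "To some extent": 1,
    "Not at all": 0,
}

def overall_score(responses):
    """Compute a weighted overall score as a percentage of the maximum.

    `responses` is a list of (rating, weight) pairs, one per question,
    where the weight reflects how challenging and how important the
    question is to complete.
    """
    max_rating = max(RATING_SCORES.values())
    weighted_total = sum(RATING_SCORES[rating] * weight for rating, weight in responses)
    weighted_max = sum(max_rating * weight for _, weight in responses)
    return 100 * weighted_total / weighted_max

# Example: two questions, the second weighted more heavily.
print(overall_score([("To a moderate extent", 1.0), ("To a large extent", 1.5)]))
```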

This, at least, was our initial thinking. To test it, CSOs in a number of countries have piloted the tool during the last round of action plans. Last week, participants met at the Hivos Head Office to discuss their experience of using the tool and consider how it can be improved.

There was a lot of in-depth discussion about how the tool can be improved, which we'll be poring over in the coming weeks. But for now, here are my five immediate takeaways:

  1. The primary purpose of the tool should be to support national level advocacy. The standard framework across countries is an important part of this, but we must be careful about the comparisons we draw between countries based on the results. Comparison over time, within countries, may be more effective.
  2. It’s important for the tool to be owned and led by civil society, but some degree of endorsement by the OGP would give it added strength. Backing from international civil society and the OGP for the tool will support national level advocacy and encourage engagement from governments.
  3. The process for completing the review should emphasise collaboration between the lead CSO, other CSOs and the government. Combining the government and CSO comment period would quicken the process and, perhaps more importantly, allow for more dialogue to take place.
  4. The current scoring and weighting methodology needs to be simplified and more work is needed to test the weightings assigned to questions. Building the scoring structure into the review would help to make it more transparent.
  5. We’re on to something. Everyone agreed that the tool had been useful in assessing their country’s action plan, and could see the opportunity it provides for advocacy. So, over the next couple of months we’ll make some adaptations to the tool based on the experience of the pilots, and then roll out version 2.0 in the next action planning cycle.

Watch this space for updates.

