Natalia Chivite-Matthews and Arianna Rossi

Why does a new digital evaluator role in the capability framework matter?
As the UK Government embarks on the ambitious digital transformation set out in the Blueprint for Modern Digital Government, we need robust evaluation mechanisms in place to ensure accountability and demonstrate value for public money. The Performance Review of Digital Spend by HM Treasury and the Department for Science, Innovation and Technology (DSIT) has shown that the way digital projects are funded, managed, and monitored needs to change.
It highlights the importance of strong evaluation plans and outcome metrics to provide a stronger evidence base for decision-making.
In response, the Government Digital and Data profession has recently introduced a new digital evaluator role in the capability framework. It draws on the experience of our team at the Department for Business and Trade (DBT). Digital evaluators are embedded in agile teams, supporting products through continuous learning. They assess digital products and services for their impact on society, their efficiency, and any unexpected results.
Digital teams provide one of the most rewarding environments for evaluators. They actively welcome insights and feedback to support continuous improvement, as this is part of the ethos of agile working. They also treat iterative learning as a necessary part of continuous innovation, because it frees resources to move on to new, hopefully better, projects that best meet users' needs and provide value for money.
Why digital evaluation is important
Digital evaluators play a vital role in demonstrating the impact of digital tools and contributing insights on progress. Their insights help teams "test, learn, adapt", providing clear pointers for decision-making and ensuring accountability and learning across the wider team. Ultimately, their expertise ensures that digital products and services are continuously improving, securing benefits for end users of government services and providing value for money. Digital evaluation also ensures that products and services are aligned with best practice in spending public money.
Methods and processes used by digital evaluators
Digital evaluators employ a range of evaluation methods, processes and frameworks as set out in the HMT Magenta Book. These include process, impact and economic evaluation methods, and a range of quantitative and qualitative research techniques. They also understand the principles of the HMT Green Book and how to measure value for money. This comprehensive approach ensures that digital products and services are thoroughly evaluated to understand what works, what doesn't and why, and also examines their social impact, efficiency gains, and potential unintended consequences.
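To make the value-for-money point concrete, here is a minimal sketch of a Green Book-style appraisal calculation: discounting benefits and costs to a net present value and a benefit-cost ratio. The figures and the function name are illustrative assumptions, not DBT's actual appraisal model; 3.5% is the Green Book's standard social time preference discount rate.

```python
# Illustrative sketch: discount annual benefits and costs to today's values,
# then compare them as a net present value and a benefit-cost ratio.
# All figures are hypothetical; 3.5% is the standard Green Book discount rate.

DISCOUNT_RATE = 0.035

def present_value(cash_flows, rate=DISCOUNT_RATE):
    """Discount a list of annual values (year 0 first) back to today."""
    return sum(value / (1 + rate) ** year for year, value in enumerate(cash_flows))

# Hypothetical annual benefits and costs (£) over a four-year appraisal period
benefits = [0, 120_000, 150_000, 150_000]
costs = [200_000, 40_000, 40_000, 40_000]

pv_benefits = present_value(benefits)
pv_costs = present_value(costs)

print(f"Net present value: £{pv_benefits - pv_costs:,.0f}")
print(f"Benefit-cost ratio: {pv_benefits / pv_costs:.2f}")
```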
Skills required for digital evaluators
Evaluators work closely with Performance Analysts (PAs) to develop key performance indicators and support the development of dashboards for monitoring. They work with user researchers to share insight, and often collaborate on qualitative and quantitative studies so that the work covers both evaluation and user research needs.
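As an illustration of the kind of indicator evaluators and performance analysts might track together, here is a minimal sketch that computes a completion rate and a median completion time from raw service events. The event records and field names are hypothetical, not taken from any real DBT service or dashboard.

```python
# Minimal sketch of two common service KPIs: completion rate and median
# time-to-complete, calculated from raw (hypothetical) service events.

from statistics import median

events = [
    {"user": "a", "started": True, "completed": True,  "seconds": 310},
    {"user": "b", "started": True, "completed": False, "seconds": None},
    {"user": "c", "started": True, "completed": True,  "seconds": 245},
    {"user": "d", "started": True, "completed": True,  "seconds": 420},
]

started = [e for e in events if e["started"]]
completed = [e for e in events if e["completed"]]

completion_rate = len(completed) / len(started)
median_time = median(e["seconds"] for e in completed)

print(f"Completion rate: {completion_rate:.0%}")        # 75% for this sample
print(f"Median time to complete: {median_time:.0f}s")   # 310s for this sample
```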
The digital environment demands specific skills from evaluators, including the ability to work in agile teams, understand product development stages, and undertake evaluation as part of product delivery. Digital evaluator skills cover:
- evaluation planning and strategy
- evaluation delivery
- product and service monitoring (working closely with PA colleagues)
- monitoring and evaluation across the product life cycle

How does this role fit into the wider Government Digital and Data Profession capability framework?
Given the wide range of quantitative and qualitative research methods covered by evaluation, the evaluator role sits within the wider data role family. It shares many of the key skills of data roles, namely:
- quality assurance of data and analysis
- data ethics and privacy
- communicating analysis and insight

How is a digital evaluator role different from evaluators in other parts of government?
Traditional evaluators typically operate outside the programme area to secure an objective assessment, and government evaluations are often commissioned out to the private sector. Their methods focus on long-term outcomes after implementation, providing valuable insights into the effectiveness of interventions over time. In contrast, digital evaluators are agile and hands-on, starting their evaluation in the Discovery or Alpha stages and keeping it continuous and dynamic. They develop evaluation plans in-house and have direct experience of applying evaluation methodologies. Digital evaluation is more akin to action evaluation: because digital services are developed iteratively, traditional methods are not always applicable or effective. In digital, evaluation is crucial to securing continuous improvement.
Learning from experience
In DBT we have published our Digital Evaluation Strategy and Playbook. Evaluation has enabled the department to stop funding projects or products that do not provide good value for money and has provided the evidence to continue funding those that work well.
Our experience has demonstrated that the role of a digital evaluator is distinct from traditional evaluators in several ways. Their integration within agile teams, hands-on approach, and methods and processes make them indispensable in the digital age. By continuously feeding insights into the development process, digital evaluators ensure that digital products and services deliver value for the public and society.
Where can I find out more?
If you want to learn about the role or are interested in becoming a digital evaluator, check out the digital evaluator role, our blog on Understanding the evaluation's role in measuring the impact of AI interventions, and DBT's Digital Evaluation and Performance Analysis Strategy.