From Roadmap to Reality: Insights from Industry on Advancing AI Assurance

Categories: Algorithms, Artificial intelligence, Assurance

As set out in the Government’s National AI Strategy, the UK aims to establish the most trusted and pro-innovation system for AI governance in the world. A key component of getting this light-touch, pro-innovation governance right is delivering on the Strategy’s aim to build a world-leading AI assurance ecosystem, as set out in the CDEI’s Roadmap to an effective AI assurance ecosystem. AI assurance will help to drive the adoption of AI by building justified trust in AI systems: reliably evaluating, and then communicating, whether or not these systems are trustworthy.

Since publication of the Roadmap, we’ve focused on engaging with industry stakeholders from startups, SMEs, and multinationals to understand current attitudes towards, and engagement with, AI assurance, and to determine where the CDEI is best placed to support industry and drive the adoption of AI assurance across the economy.

Today we are publishing our “Industry Temperature Check: Barriers and Enablers to AI Assurance”. This report highlights key barriers to, and enablers of, AI assurance within the HR and recruitment, finance, and connected and automated vehicle (CAV) sectors. 

The findings from this report draw on engagements including: 

  • Ministerial roundtables with AI developers and AI assurance service providers
  • The CDEI x techUK AI assurance symposium
  • Semi-structured interviews with industry stakeholders
  • An online survey targeted at organisations in the HR and recruitment, finance, and CAV sectors, to identify sector-specific barriers and enablers to AI assurance.

Developing our sector-specific focus 

We chose to adopt a sector-based approach to align with the UK’s decentralised, context-based approach to AI regulation, as outlined in “Establishing a pro-innovation approach to AI regulation”. AI raises different risks depending on its context of use, and sectors vary in their readiness and skills for its implementation and governance. As such, the risks and appropriate regulatory responses must be considered in the relevant context. Our research tracked industry engagement with AI assurance across three discrete sectors:

  • HR and recruitment
  • Finance
  • Connected and automated vehicles (CAV) 

These sectors were selected to reflect the variety of risks introduced by increased AI adoption. The use of AI in HR and recruitment primarily introduces risks of discriminatory bias, requiring system fairness; the use of AI in finance raises risks of cyberattack and financial fraud, requiring technical security and robustness; and the use of AI in CAVs poses risks to human life, requiring safety and routes to redress. By focusing on a range of sectors with distinct risks, the report aims to identify a broad set of barriers and enablers that other sectors across the wider economy are likely to face.

Next steps 

This report will inform the work of the CDEI as we continue to encourage industry adoption of AI assurance techniques and standards. We will now turn towards practical support to ensure that we continue to drive forward the vision of the Roadmap. To this end, the CDEI will be collaborating with techUK to develop a portfolio of case studies of AI assurance good practice, addressing the notable demand for signposted guidance identified in the Industry Temperature Check.

These case studies will demonstrate how organisations are using different AI assurance techniques, such as impact assessments and performance testing, in practice. This is a valuable opportunity for companies at the cutting edge of AI assurance to showcase their practical approaches to measuring, evaluating and communicating the trustworthiness of AI systems. Submissions are still open, and we encourage industry to submit case studies of AI assurance techniques for inclusion.

If you have any questions or would like to find out more about this report, please contact us at

Sharing and comments

  1. Comment by Teknik Industri posted on

    What practical support and guidance does the CDEI plan to provide to industry stakeholders to drive forward the vision of the AI assurance roadmap, particularly regarding the collaboration with techUK to develop case studies of AI assurance good practice?

    • Replies to Teknik Industri

      Comment by Nuala Polo posted on

      Following publication of the roadmap, we engaged with industry to gauge attitudes towards and engagement with AI assurance techniques and standards. The outcome of this engagement was reported in our Industry Temperature Check which looked at the barriers and enablers to AI assurance, and set out clear interventions that the Government and others can make to overcome these barriers. The CDEI is working closely with industry in three sectors that face very different challenges from increased AI adoption - Finance, Recruitment, and Connected and Automated Vehicles - to provide tools that improve trust in AI systems and grow market demand in a variety of environments.

      We're also supporting industry in a number of other ways, including through our Data-driven tools in recruitment guidance, which supports the responsible procurement and use of AI and data-driven tools in the HR and recruitment sector, and our Portfolio of AI Assurance Techniques, a regularly updated best-practice resource. We have also recently launched the Fairness Innovation Challenge, a grant challenge that aims to drive the development of novel solutions and assurance mechanisms to address bias and discrimination in AI systems.

      We are currently scoping additional opportunities to provide industry and regulators with practical support for implementing the recommendations outlined in the Assurance roadmap.

      Please see below for links to the resources mentioned:

      Industry Temperature Check:
      Data driven tools in recruitment guidance:
      Portfolio of AI Assurance techniques:
      Fairness Innovation Challenge:


