
Helping recruiters to innovate responsibly with data-driven tools

Categories: Algorithms, Data-driven technology, Trustworthy innovation

The use of data-driven tools is rising across the recruitment sector. The COVID-19 pandemic has heightened the need for effective and efficient digital tools in hiring as recruiters search for the highest calibre candidates in an increasingly virtual world. Companies are using data-driven tools across the recruitment cycle: from sourcing high-quality applicants through targeted advertising on social media, to sifting candidates through CV screening and evaluation software.

These technologies present opportunities to make the recruitment process more effective: data-driven tools can help standardise elements of assessment and enable organisations to reach a wider and more diverse pool of candidates. But they are not without risk. Poorly designed or implemented tools risk perpetuating existing biases or creating new barriers to fair recruitment. Getting decision-making wrong in the recruitment sector negatively impacts candidates and employers alike: with people's job opportunities and futures on the line, it is imperative that organisations make decisions that are well-informed and fair.

A first step in providing practical, sector-specific guidance

The CDEI’s review into bias in algorithmic decision-making highlighted these issues and identified a demand from recruiters for guidance on how to use data-driven technologies responsibly. Others, including a recent report from the All-Party Parliamentary Group on the Future of Work, highlighted that a key barrier to trusting AI tools in the workplace is uncertainty around how to hold developers to account.

Responding to this need, the CDEI worked with the Recruitment and Employment Confederation (REC) to develop a bespoke set of industry-led guidance. We combined the CDEI’s expertise on data-driven technology and associated ethical and policy issues, with the REC’s deep knowledge of the recruitment industry, drawn from their broad industry membership. The resulting guidance, published today, aims to help recruiters be discerning buyers and confident, responsible users of a range of data-driven technologies in recruitment.

This guidance brings together law, standards and principles relevant to the use of data-driven tools in recruitment, with the aim of helping recruiters to take a holistic approach. This includes legal frameworks, such as UK data protection law and equalities law, as well as laws and industry guidelines around recruitment, including the REC’s own code of practice. Both the Information Commissioner’s Office (ICO) and the Equality and Human Rights Commission (EHRC) have also indicated that they plan to update their regulatory guidance in this area.

Facilitating responsible procurement of data-driven tools 

Improved knowledge and awareness of the opportunities are crucial for the sector, and this work sets out, in a clear and accessible way, some of the technological options open to recruiters at each stage of the recruitment funnel.

Our guidance recommends practical steps for recruiters looking to procure a data-driven tool. It seeks to equip them with the mechanisms to effectively evaluate and responsibly deploy the tool, taking appropriate steps to mitigate risks and maximise opportunities to improve the recruitment process. Centrally, the guidance empowers recruiters to ask the right questions of vendors, ultimately driving the supply chain to consider these issues properly. It also covers concrete steps to help evaluate and ensure high standards, both from the vendor and internally, around functionality, accuracy, bias and discrimination, accessibility, data protection and transparency.

The guidance proposes key actions at four stages of the purchasing and deployment cycle for a tool:

  1. Before purchasing: Articulating the business’s objectives and how the tool will be used in the wider recruitment process, and setting a baseline for equalities testing of the tool.
  2. During purchasing: Seeking information from the vendor about the tool and how it has been tested, and verifying legal compliance.
  3. Before use: Running a pilot and putting transparency processes in place.
  4. During and after use: Monitoring and assessing the tool’s performance, both on accuracy and bias. 
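To make stage 4 concrete: the guidance does not prescribe a particular statistical test, but one common heuristic for monitoring a tool's outcomes for potential bias is the "four-fifths rule", which flags any candidate group whose selection rate falls below 80% of the highest group's rate. The sketch below is illustrative only; the group names and figures are invented, and a flag is a prompt for further investigation, not proof of discrimination.

```python
def selection_rates(outcomes):
    """outcomes: {group: (selected, total)} -> {group: selection rate}"""
    return {g: selected / total for g, (selected, total) in outcomes.items()}

def adverse_impact_flags(outcomes, threshold=0.8):
    """Flag groups whose selection rate is below `threshold` (default 4/5)
    of the best-performing group's rate."""
    rates = selection_rates(outcomes)
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}

# Illustrative monitoring data from a hypothetical sifting tool:
# (candidates passed through, candidates assessed)
outcomes = {
    "group_a": (60, 100),  # 60% selection rate
    "group_b": (40, 100),  # 40% rate; 0.40 / 0.60 ≈ 0.67, below 0.8
}
print(adverse_impact_flags(outcomes))
# → {'group_a': False, 'group_b': True}: group_b warrants investigation
```

The same calculation can serve at stage 1, where a baseline is set using the existing (pre-tool) recruitment process, so that the tool's outcomes have something meaningful to be compared against.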

Finally, the guidance encourages recruiters to provide clear information to candidates about how the tool and their data are used in the decision-making process.

Longer term, we see an increasing need for AI assurance services that are able to provide a level of independent validation that tools meet appropriate standards. This is particularly critical in recruitment, where unfair tools can have a significant impact on individuals, and recruitment teams often lack the appropriate in-house specialist knowledge to assess the performance of new technology (although in some cases, such as psychometric testing, the issues are similar to those with more traditional recruitment tools). Our forthcoming AI assurance ecosystem roadmap will set out the steps required to develop a thriving ecosystem of assurance tools and services that can provide this validation.

Next steps

If you are interested in piloting the guidance, or would like to discuss the content, please get in touch with us at

This guidance will be supplemented by a series of short guides for specific types of recruitment tools, to be published iteratively over the coming weeks. We welcome comments, suggestions and interventions as we develop this work. 
