
Automated Sifting Tools

Practical guides

This publication forms the third of three short guides focusing on the responsible procurement and use of specific data-driven recruitment tools. It is intended to be read in conjunction with the general AI recruitment guidance published in December 2021, which provides detailed steps to determine whether a tool is fit for purpose.

This guide focuses on considerations specific to automated sifting tools.

The guidance has been developed jointly by the Recruitment and Employment Confederation (REC) and the Centre for Data Ethics and Innovation (CDEI). The CDEI leads the UK Government’s work to enable trustworthy innovation using data and AI. More information about the CDEI can be found on the CDEI web page or by contacting cdei@cdei.gov.uk.

Automated sifting tools

Automated sifting tools, also known as candidate screening software, are a set of data-driven technologies that assess the applications received for a particular role. They often use natural language processing to evaluate CVs and personal statements and assign each a score. This score is then typically used by the recruiter or firm to prioritise candidates for invitation to interview. In most cases, the tool scores or ranks candidates using keyword searches based on criteria defined by the employer; for example, matching keywords from the job overview or candidate specification.
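To make this mechanism concrete, the sketch below shows the simplest form of keyword scoring against an employer-defined list. The keywords, weights, and CV snippets are invented for illustration; real products use far richer natural language processing.

```python
# Minimal sketch of keyword-based CV scoring. The criteria, weights, and CV
# snippets are illustrative assumptions, not taken from any specific product.
import re

# Hypothetical criteria drawn from a job advert (keyword -> weight).
CRITERIA = {"python": 3, "stakeholder management": 2, "sql": 2, "agile": 1}

def score_cv(cv_text: str, criteria: dict[str, int]) -> int:
    """Return a weighted count of the criteria keywords found in the CV."""
    text = cv_text.lower()
    score = 0
    for keyword, weight in criteria.items():
        # Whole-phrase match only: a synonym ("Scrum" for "agile") scores zero,
        # which is exactly the false-negative risk discussed in this guide.
        if re.search(r"\b" + re.escape(keyword) + r"\b", text):
            score += weight
    return score

applications = {
    "candidate_a": "Led agile delivery teams; strong SQL and Python skills.",
    "candidate_b": "Experienced Scrum practitioner; expert in data querying.",
}
ranked = sorted(applications, key=lambda c: score_cv(applications[c], CRITERIA), reverse=True)
print(ranked)  # candidate_a outranks candidate_b despite similar experience
```

Note that the second candidate, who describes comparable skills in different words, scores zero: this exact-match behaviour is the source of the false-negative risk described below.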

For recruiters and firms, the big opportunity that automated sifting tools can offer is time and cost savings. They can help relieve a large portion of the manual work associated with reviewing applications, freeing up resource to focus on other aspects of the hiring process. This can be a more efficient way for recruiters to identify the most suitable candidates, particularly if there is a high volume of applications for a role, such as on graduate recruitment schemes.

There is also the potential for this technology to remove the human bias inherent in a traditional recruitment process, by applying the same standardised assessment to every application. Vendors are developing tools which actively seek to improve the diversity of recruitment pools. One example is the Contextual Recruitment System developed by Rare: a tool that seeks to level the playing field for candidates from a lower socioeconomic background. It captures thirteen different markers of disadvantage (e.g. whether a candidate has been a young carer, or whether their parents went to university) and allows employers to understand how far a candidate has outperformed their school average, contextualising their ability to perform in the role.1

However, if a tool is trained on biased data, it is extremely likely to perpetuate existing workforce biases. This was evidenced when Amazon’s pilot algorithm apparently downgraded the applications of candidates who attended women-only universities, having been trained on 10 years of historical employment data.2 This algorithm was developed as part of an experiment and was not used in a real-world context.

There are additional risks around the functionality of these tools. Firstly, they are prescriptive by design and therefore cannot account for the many different ways candidates might articulate their suitability. For example, a candidate may use a synonym in place of an exact keyword and be unnecessarily rejected from the process, meaning recruiters may miss out on suitable talent. Similarly, if a candidate uses non-standard formatting on their CV (e.g. the CV includes graphics), the tool may downrank an application that otherwise meets the job criteria.

Finally, it is sometimes possible for candidates to manipulate the tool to gain a higher score or ranking, resulting in an unfair process. For example, a recent BBC documentary showed how keywords can be included on a CV in white text, invisible to the human eye but picked up by a data-driven tool. This risk reinforces the case that it is good practice to keep a human reviewer in the loop. In practice, this means recruiters periodically checking the results of the tool by comparing a sample of its output with a human reviewer’s assessment of the same applications. Having a human ‘in the loop’ does not mean requiring an individual to review every decision of the automated sifting tool; the focus should be on monitoring whether or not the system is working as intended.
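As an illustration of the kind of spot check a reviewer could run, the sketch below flags pure-white text in a .docx CV. It assumes the third-party python-docx package and a hypothetical file name, and it is only a heuristic: near-white colours, PDFs, and other manipulation tricks would need separate handling.

```python
# Heuristic check for "white text" keyword stuffing in a .docx CV.
# Assumes the python-docx package (pip install python-docx); the file
# name is hypothetical. Treat this as a spot check, not a complete defence.
from docx import Document
from docx.shared import RGBColor

WHITE = RGBColor(0xFF, 0xFF, 0xFF)

def find_hidden_text(path: str) -> list[str]:
    """Return text runs whose font colour is pure white (invisible on a white page)."""
    hidden = []
    for paragraph in Document(path).paragraphs:
        for run in paragraph.runs:
            try:
                colour = run.font.color.rgb  # None when no explicit colour is set
            except AttributeError:           # theme colours may expose no RGB value
                continue
            if colour == WHITE and run.text.strip():
                hidden.append(run.text.strip())
    return hidden

suspicious = find_hidden_text("candidate_cv.docx")
if suspicious:
    print("Possible hidden keywords:", suspicious)
```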

 

1 https://contextualrecruitment.co.uk/

2 https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G

 

1. Evaluation of value-add and effectiveness

Sifting tools can use a technology called natural language processing to analyse the text in a CV or personal statement. In the case of an automated CV sifting tool, the algorithm may have been trained on a bank of existing CVs and, having been told which ones score highly and which do not, the tool will learn patterns and features to look for in assessing new CVs.
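As a simplified illustration of that training step, the sketch below fits a basic text classifier to a handful of labelled historical decisions using scikit-learn. The CV snippets and labels are invented; the point is that the model learns whatever patterns, including any biases, are present in the past decisions it is shown.

```python
# Toy illustration of training a CV classifier on historical decisions,
# assuming scikit-learn is installed. The CV snippets and labels are
# invented; a production tool would be far more sophisticated.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

past_cvs = [
    "Ten years of Python development and team leadership.",
    "Recent graduate, strong coursework in statistics and SQL.",
    "Retail experience, excellent customer service record.",
    "Data analyst with dashboarding and SQL reporting experience.",
]
past_decisions = [1, 1, 0, 1]  # 1 = shortlisted by past reviewers, 0 = rejected

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(past_cvs, past_decisions)

# Score a new application: the output reflects the historical data, good or bad.
new_cv = ["Career break for caring responsibilities, then SQL analyst role."]
print(model.predict_proba(new_cv)[0, 1])  # estimated probability of shortlisting
```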

It is important to establish whether the tool you are considering is fit for purpose and suitable for your recruitment process and, if so, how it will fit into that wider process. As part of your consideration, you should evaluate the tool against your existing processes and any alternative tools you are considering. For example, can you articulate how an automated sifting tool will deliver on your objectives more effectively than, say, a game-based or task-based assessment?

➔     Discuss with the vendor how you would like the tool to function and ensure that it will meet your requirements in terms of how it prioritises and scores CVs.

  • Other than keyword search, ask how the tool will score CVs. E.g. will it take into consideration: educational background; gaps in employment; formal and informal training?
  • Ensure that the keyword search is aligned with the specific job advert, to allow candidates to accurately tailor their application.

➔     Decide how this tool will fit within your decision-making process and how the applicants’ score or ranking will be used and weighted.

For example, good practice is for a human to use the score as part of the assessment rather than relying entirely on the assessment provided by the data-driven tool.


2. Managing key risks

2.1 Accuracy

It is important to ensure that the scoring assigned by the product is based on factors relevant to the application process and the role.

➔     Ask the supplier for a demo of the tool for a specific role, to assess whether the candidates it recommends are suitable for the role and have the appropriate skillset.

➔     Ask to see a sample of the CVs discarded by the tool to check for false negatives.

2.2 Bias and discrimination

There are notable examples of this kind of tool using discriminatory factors when judging CVs, such as the Amazon experiment referenced in the introduction.

The particular risk with a sifting tool is that it embeds learnt bias. This is where data-driven tools “learn” how to make hiring decisions from existing employment data. If there are patterns of bias in that data, the data-driven tool is likely to repeat those patterns, which can lead to unfairness. For example, if a disproportionate percentage of your current and past workforce has been male, the tool is likely to repeat this pattern and recommend fewer women for recruitment. This discrimination can occur directly or indirectly, because much of the information on a CV can act as a strong proxy for protected characteristics. For example, while the number of years in paid employment may reflect a candidate’s experience, it could also be a proxy for age.

➔     Consider what competencies and skills are relevant to assess as part of the CV sift.

  • Outline the potential side effects of using your chosen competencies and skills and, where possible, mitigate these side effects. (E.g. if the tool takes into account gaps in employment, consider how this might disproportionately affect candidates with caring responsibilities).

➔     Consider the reputation and history of the automated sifting tool provider.

➔     Understand whether there is unfairness:

  • Collecting information about the candidates who were scored highly by the tool, compared to the wider pool of candidates who applied, may indicate whether there is a bias in the way the tool is scoring candidates.

➔     Consider running tests for bias yourself, both in pilots of the tool and throughout its use. To assess any potential bias in relation to your actual hiring process, test the system using existing applications on file from candidates with a diverse range of backgrounds (a minimal sketch of one such test follows this list). Note that:

  • Collecting and analysing demographic data for the purpose of testing fairness is permitted within UK data protection law where explicit consent has been provided.
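One way to operationalise such a test, sketched below with invented data, is to compare the tool’s shortlisting rates across demographic groups, for example using the ‘four-fifths’ adverse-impact ratio familiar from US hiring practice. The 0.8 threshold and the groups you monitor are assumptions to adapt with your own legal advice.

```python
# Illustrative bias test: compare the tool's shortlisting rates across
# demographic groups using the "four-fifths" adverse-impact rule of thumb.
# The records below are invented; demographic data must be collected with
# explicit consent and kept separate from the decision-making process.
from collections import defaultdict

# (self-reported group, whether the tool shortlisted the candidate)
results = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

shortlisted = defaultdict(int)
totals = defaultdict(int)
for group, passed in results:
    totals[group] += 1
    shortlisted[group] += passed

rates = {g: shortlisted[g] / totals[g] for g in totals}
best = max(rates.values())
for group, rate in rates.items():
    ratio = rate / best
    flag = "REVIEW" if ratio < 0.8 else "ok"  # four-fifths rule of thumb
    print(f"{group}: rate={rate:.0%}, impact ratio={ratio:.2f} [{flag}]")
```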

An Equality Impact Assessment may be a useful tool for considering the important issues relating to discrimination and is strongly recommended for tools that are likely to have a significant impact on decision making, particularly where a large number of applicants are involved.

2.3 Transparency and due process

Both candidates and recruiters may be nervous about the reduced role of humans in the recruitment process: clear communication and transparency may help to mitigate some of these concerns.

Steps to build transparency into the process:

➔     Understand whether the tool is making decisions based solely on automated processing (i.e. whether Article 22 of UK GDPR applies): have a good understanding, within your organisation, of how the data-driven system works internally, of the basis on which decisions are made, and of the level of human oversight.

  • If this provision does apply, ensure you have a lawful basis and provide routes for human intervention, the individual to express their point of view, and human review where requested.
  • Take appropriate measures to ensure that GDPR duties are met, including conducting a Data Protection Impact Assessment (DPIA). It is good practice to check that the vendor has also undertaken a DPIA of the tool. More information on fulfilling Article 22 duties can be found on the ICO website.

➔     Clearly explain to candidates that an automated sifting tool will be used as part of the process. Include information on:

  • What the criteria for success are.
  • How the score from the tool will contribute to the overall assessment of the candidate.
  • How candidates can get in touch to appeal or challenge a recruitment decision that has involved an automated CV sifting tool.
  • How candidates can provide feedback about their experience.

2.4 Data protection

Automated sifting tools engage data protection law because they involve personal data (relating to an identifiable individual). It is advisable to seek assurance that your chosen vendor’s practices comply with UK GDPR requirements.

Your data protection responsibilities will be determined by the nature of your relationship with the technology vendor. As the recruiter, you are likely to be a data controller, since you determine the purposes for which the data are processed and the means of processing. If the vendor also uses candidates’ data from your recruitment rounds to train their algorithm, they too become a controller, as they are then determining how and why the data are processed, and so acquire the corresponding responsibilities.

Many other data-driven recruitment tools (e.g. targeted online advertising) allow an element of control over the data you access as a recruiter. With an automated sifting tool processing CVs and personal statements, however, you have less control over the data a candidate includes, and there is a high chance that at least some candidates will include sensitive personal data, including special category data under UK GDPR. This brings an additional level of responsibility for both the data controller and the data processor. You should therefore consider carefully what steps you need to take to ensure that your data processing complies with GDPR requirements, and that the tool is not making direct use of protected characteristics as a basis for decisions about candidates (which could constitute direct discrimination under the Equality Act). Depending on the level of explainability of the underlying model, this may not be straightforward to verify.

Steps to check for compliance with data protection law:

➔      Agree who has the role of data controller (vendor, recruiter, or both) before a procurement contract is signed: controllers are responsible for complying with, and demonstrating compliance with, UK GDPR.

➔     Ensure that the vendor’s practices comply with UK GDPR requirements, paying particular attention to the guidance that applies to special category data. This includes:

  • Understanding how your candidates' data will be used in the future. For example, if it is used to train the vendor’s model, consent will be needed from candidates before they interact with the tool.
  • Ensuring robust processes are in place for any data storage and mechanisms are in place to delete data.

➔      Complete a Data Protection Impact Assessment (DPIA) - you may want to complete this with the vendor.

  • It is good practice to publish the DPIA so it is accessible to applicants and workers.

➔     Where sensitive personal data is being collected for equalities and fairness monitoring, ensure that:

  • Candidates are given appropriate notice, are provided with a clear explanation of how data will be used, and are also given an option to opt out;
  • This data does not accidentally “leak” into the decision-making process, and is kept ring-fenced and separate (one way to structure this is sketched below).
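One way to structure this separation, assuming a simple in-memory model for illustration, is sketched below: the scoring step can only read the CV store, and only a distinct audit step may join outcomes to demographics.

```python
# Sketch of keeping equalities monitoring data ring-fenced from sifting,
# assuming a simple in-memory model. The point is structural: the scoring
# function never sees the monitoring store; only a separate audit step
# joins the two, and only for fairness analysis.
import uuid

applications: dict[str, str] = {}  # application_id -> CV text (sifting sees this)
monitoring: dict[str, dict] = {}   # application_id -> demographics (audit only)

def submit_application(cv_text: str, demographics: dict | None = None) -> str:
    """Store the CV; store opt-in demographics separately under the same ID."""
    app_id = str(uuid.uuid4())
    applications[app_id] = cv_text
    if demographics is not None:   # candidates may opt out entirely
        monitoring[app_id] = demographics
    return app_id

def sift(app_id: str) -> int:
    """Scoring has access to the CV store only, never the monitoring store."""
    return len(applications[app_id])  # placeholder for the real scoring logic

def fairness_audit() -> dict:
    """Only this audit step joins outcomes to demographics."""
    return {app_id: (sift(app_id), monitoring.get(app_id)) for app_id in applications}
```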

3. Communication and building trust

Proactive communication and transparency about how a recruitment round is being conducted are important for inclusivity. This is particularly true when a data-driven tool is involved in the process, as candidates may feel unclear about how decisions are being made.

➔     Practically, this could involve making a document available to applicants which outlines:

  • The justification for the use of an automated sifting tool.
  • How the tool will form part of the decision-making process.
  • The criteria for success for the specific role.
  • How candidates can get in touch to appeal a recruitment decision that has involved an automated sifting tool.
  • How candidates can provide feedback about their experience.

➔     Once the tool is in use, continually monitor and evaluate its impact, in line with the key considerations listed above.

A human reviewer should assess a sample of high-scoring and low-scoring CVs to understand the tool’s decisions. If the tool’s scores are not aligned with the scores that the human would have given, revisit the criteria the tool is scoring against.
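One simple way to run this comparison, sketched below with invented scores and assuming SciPy is available, is to compute the rank correlation between the tool’s scores and a human reviewer’s scores for the same sample of CVs.

```python
# Minimal monitoring sketch: compare tool scores with human reviewer scores
# on the same sample of CVs. Assumes SciPy is installed; the scores are
# invented. Low rank agreement suggests the tool's criteria need revisiting.
from scipy.stats import spearmanr

# Scores for the same ten sampled CVs (order matters: CV i in both lists).
tool_scores  = [88, 75, 91, 60, 55, 82, 40, 70, 65, 95]
human_scores = [80, 78, 85, 62, 58, 79, 45, 50, 90, 92]

correlation, p_value = spearmanr(tool_scores, human_scores)
print(f"Rank agreement (Spearman rho): {correlation:.2f}")

# The 0.7 threshold is an assumption for illustration, not a standard.
if correlation < 0.7:
    print("Tool and human rankings diverge; revisit the scoring criteria.")
```

What counts as acceptable agreement is a judgement for your own process, to be set alongside the other checks in this guide.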

