It is no surprise that many organisations are looking to Artificial Intelligence (AI) to streamline their processes, recruitment among them. AI can improve efficiency, summarise key documents such as CVs and score applicants based on the information provided. On the face of it, it appears a no-brainer for recruiting employers. However, the Information Commissioner's Office (ICO) notes that such AI use can create risks for individuals' privacy and information rights. It has therefore released guidance, and shared key considerations in an audit outcomes report, to assist employers looking to use AI tools for recruitment purposes.
The six key questions organisations should ask themselves before using AI tools for recruitment purposes are set out below:
1. Have you completed a Data Protection Impact Assessment?
A Data Protection Impact Assessment (DPIA) helps you identify, record and minimise the data protection risks arising from the use of AI tools in recruitment. A DPIA must be completed before any AI recruitment tool is implemented, and it should be kept up to date thereafter. This step is a fundamental part of your accountability obligations under data protection laws.
2. What is your lawful basis for processing information?
When processing personal information such as the names and e-mail addresses on applications or CVs, organisations must identify an appropriate lawful basis for that use, such as consent, contract or legitimate interests. No single lawful basis takes precedence over the others; the right one will depend on the purpose of the processing and your relationship with the individual. Where you envisage processing sensitive special category data, such as racial or ethnic origin or health data (including where such inferences can be drawn), you must also ensure you have an appropriate lawful condition for doing so.
3. Have clear responsibilities and processing instructions been documented?
Both recruiters intending to use AI tools and AI providers have data protection compliance responsibilities. A contract must identify the controller (the recruiter) and the processor (the AI provider) of personal information. Where the AI provider acts as processor, the recruiter must set out comprehensive and explicit written instructions for the provider to follow, and should ensure the provider complies with them at all times. It may also be appropriate to set out performance measures such as statistical accuracy and bias targets.
4. Have you made sure the AI provider has mitigated bias?
The ICO audit report revealed that AI tools were not always processing personal information fairly; for instance, some allowed recruiters to filter out candidates with certain protected characteristics. All processing of personal information must be fair, which means monitoring the tools and their outputs for potential or actual fairness, accuracy or bias issues. The AI provider should give clear assurances that it has mitigated bias and should be able to evidence this.
5. Is the AI tool being used transparently?
Personal data must be handled transparently. You must ensure candidates are informed about how the AI tool will process their personal data. Clear privacy information must be given to candidates explaining how and why the tool is being used, and the logic it applies when making predictions and producing outputs that may affect them. Candidates should also be told how they can challenge any automated decision-making.
6. How will you limit unnecessary processing?
Only the minimum amount of personal information necessary to achieve the desired purpose should be collected. Personal data must be adequate, relevant and limited to what is necessary, and it should be retained only for as long as is necessary for the purposes for which it is processed. The ICO's audit identified that some AI tools gathered considerably more personal information than necessary and retained it indefinitely in databases of potential candidates without their knowledge.
What organisations need to do
Because AI recruitment tools pose risks to individuals' privacy rights, there are important steps your organisation must take before using them. Personal data must always be processed in accordance with data protection laws, and such tools should be used with caution. Non-compliance can result in the ICO imposing financial penalties or taking enforcement action, as well as reputational damage and separate legal claims brought by candidates subject to the AI tools.
For further detail, the ICO has provided nearly 300 clear and actionable recommendations in its AI in Recruitment Outcomes Report to help AI providers and recruiters improve their compliance. The ICO is also hosting a webinar on Wednesday 22 January 2025 to discuss the findings further, which can be registered for here.
Please get in contact with any member of the MFMac Commercial Team to assist with implementing policies and strategies relating to the use of AI tools for recruitment purposes.
This article was written by Arina Yazdi, a Solicitor in our Commercial team.