Wednesday, November 6, 2024

ICO intervention into AI recruitment tools leads to better data protection for job seekers

We have today issued a series of recommendations to AI developers and providers to ensure they are better protecting job seekers’ information rights.

AI is increasingly being used in the recruitment process to save time and money, helping to source potential candidates, summarise CVs and score applicants. If not developed lawfully, these tools may negatively impact jobseekers who could be unfairly excluded from roles or have their privacy compromised.  

We audited several providers and developers of AI tools for recruitment and made almost 300 recommendations, such as ensuring personal information is processed fairly and kept to a minimum, and clearly explaining to candidates how their information will be used by the AI tool. The companies accepted or partially accepted all recommendations.

Published today, the audit outcomes report summarises the key findings and recommendations from the consensual audits, as well as providing examples of good practice, case studies and lessons learned for both AI developers and recruiters.

The regulator also published key questions for organisations looking to procure AI tools for recruitment, so they can seek clear assurances from developers and providers.

Ian Hulme, ICO Director of Assurance, said:

“AI can bring real benefits to the hiring process, but it also introduces new risks that may cause harm to jobseekers if it is not used lawfully and fairly. Our intervention has led to positive changes by the providers of these AI tools to ensure they are respecting people’s information rights.

“Our report signals our expectations for the use of AI in recruitment, and we’re calling on other developers and providers to also action our recommendations as a priority. That’s so they can innovate responsibly while building trust in their tools from both recruiters and jobseekers.”  

About the audits

Our audits revealed that some AI tools were not processing personal information fairly – for example, by allowing recruiters to filter out candidates with certain protected characteristics. Others were inferring characteristics, including gender and ethnicity, from a candidate’s name instead of asking for this information. We instructed these providers to collect accurate information directly from candidates and ensure that regular checks are in place to monitor and mitigate potential discrimination.

The regulator was also concerned that some AI tools collected far more personal information than necessary and retained it indefinitely to build large databases of potential candidates without their knowledge. We recommended that all candidates be provided with transparent privacy information, including a clear retention period.

We have followed up with the organisations and confirmed that our recommended actions were implemented.

One of the organisations audited said:

“We are actively working to implement the specific actions agreed with the ICO in our audit plan. For example, we are making sure to provide the relevant information regarding the use of AI in our privacy policies and evaluating the steps taken to minimise bias when training and testing our AI tools.”

Supporting the responsible development and adoption of AI  

Following these audits, we intend to work with the organisations using these AI tools to build on our understanding of the privacy risks and potential harms of using AI to aid recruitment.

We will be delivering a webinar on Wednesday 22 January 2025 for AI developers and recruiters so they can learn more about the findings and how they can be applied. Register for the webinar here.

Last year, we issued updated guidance to educate AI developers on ensuring their algorithms treat people and their information fairly.


Notes to editors

When an organisation is audited by the ICO, the regulator assesses whether it is compliant with data protection law and provides a report with recommendations on how to improve. Find out more here.

1. The Information Commissioner’s Office (ICO) is the UK’s independent regulator for data protection and information rights law, upholding information rights in the public interest, promoting openness by public bodies and data privacy for individuals.
2. The ICO has specific responsibilities set out in the Data Protection Act 2018 (DPA2018), the United Kingdom General Data Protection Regulation (UK GDPR), the Freedom of Information Act 2000 (FOIA), Environmental Information Regulations 2004 (EIR), Privacy and Electronic Communications Regulations 2003 (PECR) and a further five acts and regulations.
3. The ICO can take action to address and change the behaviour of organisations and individuals that collect, use, and keep personal information. This includes criminal prosecution, civil enforcement and audit.
4. To report a concern to the ICO, call our helpline on 0303 123 1113, or go to ico.org.uk/concerns.

 

