
You’re Hired: The ICO’s Update on Responsible Automated Decision Making in Recruitment


A renewed focus on fair and lawful use of ADM in recruitment

The Information Commissioner’s Office (ICO) has issued an update on the fair and responsible use of automated decision‑making (ADM) in recruitment. As employers increasingly adopt digital tools and AI‑driven processes, the ICO is emphasising the need to apply data protection law correctly and consistently. 

Since the provisions of the Data (Use and Access) Act 2025 took effect in February 2026, organisations have been able to make solely automated decisions with legal or similarly significant effects on individuals, provided they apply the ADM safeguards set out in the new Article 22C of the UK GDPR. See our summary of the new rules here.

The regulator has made clear that it intends to scrutinise how major employers and recruitment platforms deploy ADM, with a particular focus on transparency, discrimination risks and potential misuse. Findings and regulatory expectations will be published, and organisations can expect enforcement action where people’s information rights are not respected.

To inform its update, the ICO gathered evidence from more than 30 employers across different sectors. The research explored how automation is being used to make or support recruitment decisions, from initial screening to final selection.

Bias and discrimination

The ICO recognises that automated tools can help employers process large volumes of applications quickly and consistently, improving efficiency for both organisations and candidates. The regulator’s stated aim is to provide clarity, reduce unnecessary regulatory friction and support public trust in the responsible use of data.

The ICO’s public perceptions research found that people generally understand the value of automated recruitment tools. For example, many are comfortable with their use for tasks such as filtering CVs. However, the public also expressed concern about the potential for new forms of bias that ADM may introduce, particularly in profiling‑based automation. 

Depending on the training data provided to a tool, an ADM system could incorporate criteria such as age or sex when making automated decisions on who to hire for a particular role. People are also generally wary of the use of online behavioural assessments. The ICO noted that, where companies use ADM, monitoring, bias testing and regular feedback are important so that the company continually reviews for any bias or discrimination risk that arises.

Key findings: human involvement and legal obligations

A central theme in the ICO’s update is the extent to which employers rely on solely automated decision‑making. Many organisations are using decision-making tools with little or no meaningful human involvement. Where such decisions have legal or similarly significant effects on candidates, such as rejecting an application outright, they fall within the scope of the UK GDPR’s provisions on solely automated decision‑making. This requires transparency as to use, as well as enhanced safeguards, including the right to request human review. The ICO’s evidence suggests that these safeguards are often insufficiently implemented.

Transparency and fairness still need work

The ICO determined that many employers are not providing candidates with clear information about how ADM is used in the recruitment process, or how decisions are made.

Where an employer claims there is human involvement, the ICO emphasised that this must be genuine and applied consistently across all candidates at the same stage. Tokenistic or inconsistent human review will not meet the legal threshold for a process to fall outside the scope of solely automated decision-making.

The update also encourages employers to strengthen their monitoring of bias and fairness in automated systems. Regular auditing, clear accountability structures and ongoing evaluation of outcomes are essential to ensure that ADM tools do not inadvertently disadvantage particular groups.

What this means for employers

The ICO’s message is clear: innovation is welcome, but compliance is non‑negotiable. Employers using automated recruitment tools should:

  • Review whether any decisions are being made solely by automated means.
  • Ensure meaningful human involvement where required.
  • Provide clear, accessible information to candidates about ADM processes.
  • Implement robust safeguards, including bias monitoring and fairness assessments.
  • Keep documentation up to date to demonstrate compliance.

As ADM becomes more embedded in recruitment, organisations that take proactive steps now will be better placed to comply with regulatory updates and maintain the trust of candidates.

This article was written by Eve Maxwell and Evelyn Quinn.
