
Automated decision-making: ICO consults on new guidance following DUAA reforms


The Information Commissioner’s Office (ICO) has published a consultation on its draft guidance on automated decision-making (“ADM”). The draft guidance follows changes to UK data protection laws introduced under the Data (Use and Access) Act 2025 (“DUAA”) to the ADM regime, which took effect earlier this year (5 February 2026).

For further background on how the ADM regime has changed, please see our previous update here.

The consultation on the proposed updates to the guidance closes on 29 May 2026, and can be read in full here.

Key takeaways from the draft guidance

While the draft guidance signals a degree of regulatory flexibility, it is clear that organisations deploying ADM will still need to take care in how systems are designed, governed and explained to individuals. 

The draft guidance offers helpful insight and clarity on key concepts under the ADM provisions of the UK GDPR. We consider the key points from the new draft guidance below.

Meaningful human involvement

A decision will only be considered to be ADM if “there is no meaningful human involvement in the taking of the decision”. In practice, many decisions that are regarded as automated involve some degree of human intervention at some stage.

The draft guidance emphasises that human involvement must be active and not a “token gesture” to take a decision outside the ADM rules.

In practical terms, meaningful human involvement requires the decision-maker to:

  • review the decision at a point where it can still be changed (i.e. before the decision is applied to a person);
  • have the ability to alter the outcome;
  • be sufficiently trained to understand the system’s logic, outputs, limitations and risks; and
  • take into account the relevant data and factors on which the decision was based.

Importantly, this assessment must happen every time a decision is made about a person. Ad hoc spot checks are not sufficient, as they leave some decisions entirely unchecked.

The draft guidance emphasises the importance of keeping clear records to document how and when human involvement took place.

Profiling

As under existing guidance, profiling involves the automated processing of personal data to analyse or predict aspects of an individual’s behaviour, characteristics or preferences. While profiling is distinct from ADM, the two often overlap: ADM frequently relies on profiling, and profiling may form part of - or, in some cases, all of - the automated processing on which a decision is based. As a result, where organisations carry out ADM, they should carefully assess whether and to what extent profiling is involved, as this will be relevant to the application of the ADM rules.

Significant decisions

The draft guidance adopts a broad view of what constitutes a “decision” for the purposes of the ADM regime, defining it as a conclusion or outcome reached after consideration or analysis that may influence actions taken about an individual or engage their rights. There is an acknowledgement that context is important when assessing whether a decision is ‘significant’. The ICO notes that the same decision may be significant for some individuals, particularly children and other vulnerable groups, but not others. Where organisations cannot clearly distinguish significant from non‑significant outcomes, ADM safeguards should be applied to all decisions made.

The draft guidance sets out a non-exhaustive list of impacts that may indicate significance, including effects on finance, health and access to essential services. It also adds new examples, such as impacts on behaviour and consumer choice (e.g. algorithmic recommendations and dynamic pricing), which suggest that a wider range of decisions may fall within scope than organisations may previously have assumed.

Information about decisions

The draft guidance makes clear that the “information about decisions” required as part of the ADM safeguards is distinct from the right to be informed. While organisations should explain their use of ADM at the point personal data is collected (e.g. through their privacy notices), the ICO is clear that this is not sufficient on its own. Where ADM is engaged, individuals must also be given decision‑specific information about the actual outcome, rather than simply repeating the information provided in a privacy notice about how the system arrives at its decisions and its potential consequences.

The ICO is clear that the aim is to enable individuals to meaningfully understand the decision and the specific factors that influenced it, so they can make an informed choice about whether to exercise their other rights, such as requesting human intervention or contesting the decision. What this information looks like will depend on context, including the nature of the decision, the systems used and the individual affected.

Lawful basis

Following the changes introduced by the DUAA, organisations can now rely on any of the lawful bases under Article 6 UK GDPR when carrying out ADM, reflecting a more permissive ADM framework. The draft guidance provides that no one basis is better or takes priority, and that the appropriate lawful basis will depend on the purpose of the processing and the organisation’s relationship with the persons involved.

There is, however, one exception: the lawful basis of ‘recognised legitimate interest’ cannot be used for ADM. The UK GDPR does not permit an ADM decision to be based entirely or partly on this lawful basis.

A further restriction applies where special category data is involved. As set out in the DUAA, and covered in the draft guidance, ADM using special category data is only permitted where one of the following conditions applies:

  • the decision is based entirely on the person’s explicit consent;
  • the decision is necessary for a contract between the individual and the organisation, and a substantial public interest condition applies; or
  • the decision is required or authorised by law, and a substantial public interest condition applies.

Comment

The draft guidance includes some helpful insight into the ICO’s policy position on ADM as the regulatory framework (including around AI) continues to take shape. While the guidance remains subject to consultation and may evolve, it provides a useful steer on the ICO’s approach ahead of the statutory code of practice on AI and ADM, which is expected to be published later this year. The ICO is required to prepare an appropriate code of practice on good practice in the processing of personal data in relation to ADM and AI under the Data Protection Act 2018 (Code of Practice on Artificial Intelligence and Automated Decision-Making) Regulations 2026 (SI 2026/425) (the “Regulations”). The Regulations will come into force on 12 May 2026.

For organisations using or considering the deployment of ADM, the draft guidance provides a timely prompt to review ADM use, assess how decisions function in practice, and consider whether existing safeguards and governance remain fit for purpose.

For queries or advice on the content of this article, please contact Hamish Corner, Lucy Pegler, Amanda Leiu or a member of Burges Salmon's Commercial & Technology team.

This article was written by Jenora Vaswani and Amanda Leiu.
