
ICO and Ofcom joint statement on age assurance – key considerations for organisations


On 25 March 2026, the Information Commissioner’s Office (ICO) published a joint statement with Ofcom on the interaction between online safety and data protection requirements in relation to age assurance. The statement is aimed at online services likely to be accessed by children and which fall within scope of both the Online Safety Act 2023 (“OSA”) and UK data protection legislation. 

While the statement does not introduce new legal obligations, it provides clarity on how organisations should navigate the overlap between online safety duties and data protection principles when designing and deploying age assurance measures. Importantly, it also signals a more coordinated regulatory approach from the regulators responsible for online safety and data protection in the UK going forward.

For organisations already grappling with implementation of the OSA, the statement serves as a practical reminder that child safety and data protection compliance are not competing objectives but must be considered together from the outset.

The full statement can be accessed here.

Regulatory background (ICO / Ofcom)

Age assurance sits at the intersection of two rapidly evolving regulatory regimes. The OSA places increasing pressure on services to prevent children from accessing harmful or age‑restricted content. At the same time, the use of age assurance technologies inevitably involves the processing of personal data, often including biometric or other sensitive data, and therefore engages data protection considerations.

The joint statement makes clear that organisations must comply with both regimes. Ofcom and the ICO emphasise their intention to work together through information sharing, coordinated engagement with industry and a collaborative approach to guidance and regulatory priorities. While each regulator retains separate statutory responsibilities, the aim is to align expectations and reduce uncertainty for organisations operating under both frameworks.

This coordinated approach is helpful for organisations in reducing the risk of conflicting regulatory messages, as well as clarifying regulatory expectations around governance, accountability and demonstrable compliance. We have recently considered the ICO’s age assurance expectations in relation to Reddit’s failures in processing children’s data (read here).

Key takeaways for organisations

  1. Compliance with online safety and data protection must be addressed together

A central message of the statement is that meeting online safety obligations does not displace data protection responsibilities. Any age assurance solution must be compliant with UK data protection laws, as well as the OSA.

In practice, this means organisations should avoid approaching age assurance as a stand-alone “online safety” issue. Instead, data protection considerations should be embedded into age assurance decision‑making from the design stage, including lawful basis, transparency, accountability and user rights.

For many organisations, this will require closer collaboration between legal, compliance, product and technical teams, particularly where age assurance tools are being procured or developed at pace.

  2. A proportionate, risk‑based approach is essential

The regulators emphasise a risk‑based and proportionate approach to age assurance, reflecting both the nature of the service and the level of risk posed to children. Organisations are expected to assess:

  • The likelihood of children accessing the service;

  • The types of harm children may be exposed to; and 

  • The extent and sensitivity of personal data processed as part of age assurance.

Higher‑risk services (including high-reach user-to-user platforms and those with a greater risk of users encountering illegal content or content harmful to children) will be expected to deploy more robust age assurance measures, while lower‑risk services (platforms, sites or apps with a "negligible" or "low" risk of users encountering such content) may be able to justify lighter‑touch approaches.

However, organisations should be cautious about relying on self‑declaration alone as a method of reducing risks and potential harms to children online, particularly where minimum age requirements apply. Self-declaration can be a useful starting point, but the method is limited: it simply asks users their age without requiring any evidence to confirm it. The regulators state that self-declaration alone is not treated as a highly effective means of determining the age of users and preventing underage access to harmful content. Organisations should instead consider a combination of methods to establish the age of users, such as photo-ID matching, credit card checks, digital identity services and facial age estimation.

Crucially, the statement reinforces that risk assessments and governance decisions should be documented. Organisations should expect to be able to explain (to either regulator) why a particular age assurance method was chosen and how it balances child safety with privacy risks.

  3. Privacy‑by‑design and data minimisation remain core expectations

The joint statement reinforces the importance of data protection by design and by default when implementing age assurance. Regulators expect organisations to:

  • Collect only the minimum data necessary to establish a user’s age or age range

  • Avoid repurposing age assurance data for unrelated uses

  • Limit retention periods and access to age data

  • Ensure transparency with users about how and why age data is processed

In other words, age assurance should confirm age, not identity, unless there is a clear and proportionate justification for more intrusive measures.

For organisations exploring newer or AI‑driven age assurance technologies, the regulators are supportive of innovation, provided that privacy, fairness and data protection risks are carefully managed.

Meeting regulatory requirements in practice

The ICO and Ofcom share a flexible, tech-neutral approach. Where a service falls within the scope of the OSA, organisations must have an age assurance process that is highly effective at determining whether or not a user is a child. If an organisation sets a minimum age for using a service, that organisation should use an effective age gate to prevent underage access and avoid unlawful processing under data protection laws.

In order to comply with OSA and data protection obligations, organisations should consider putting in place the following practical steps.

Steps to comply with OSA obligations

  1. Put in place highly effective age assurance processes and be able to demonstrate how the chosen method is highly effective at determining whether a user is a child. 

  2. Ensure users identified as children are protected from harmful content. 

  3. Take steps to reduce the risk of children accessing harmful content. 

  4. Effectively apply and enforce the minimum age standards set out in the organisation’s terms of service. 

  5. Measure and monitor performance against key indicators on an ongoing basis. 

Steps to comply with data protection obligations

  1. Conduct a Data Protection Impact Assessment (DPIA) to evaluate the risks associated with the data processing activities. 

  2. Apply robust age assurance practices and enforce a minimum age standard that is appropriate and proportionate to the level of data processing risk. 

  3. Apply all data protection principles to age assurance technologies, including data minimisation, storage limitation and purpose limitation, and ensure that data is collected only as strictly necessary to confirm a user’s age or age range. 

  4. Continue to assess the risks and effectiveness of age assurance processes to ensure they remain fit for purpose and compliant with data protection laws.

 

What this means for organisations

The joint statement is a clear signal to assess existing or planned age assurance arrangements. In particular, organisations should:

  • Review whether age assurance measures are sufficiently robust for the level of child safety risk;

  • Re‑assess data protection documentation, including DPIAs and privacy notices;

  • Ensure governance frameworks clearly address both regulatory regimes;

  • Prepare for greater regulatory scrutiny as Ofcom and the ICO increasingly coordinate their enforcement priorities.

If you have any questions or would otherwise like to discuss how the recommendations may impact you or any other issue raised in this article, please get in touch with Hamish Corner, Richard Hugo or Amanda Leiu in our Commercial & Technology team. 

This article was written by Alex Bones, Amanda Leiu and Victoria McCarron.


See more from Burges Salmon
