Auditing algorithms effectively – an increasing role for regulators?
The Digital Regulation Cooperation Forum (DRCF) – a working group of the FCA, ICO, CMA and Ofcom – has published a working paper setting out its plans and priorities for the next year. The DRCF has also published a paper on the current and future landscape of algorithm auditing and the role of regulators.
Here we provide an overview of the increasing use of algorithm audits and of regulators' growing engagement in promoting effective algorithm audits.
Algorithm auditing is clearly in the sights of the DRCF.
The DRCF was set up in July 2020 and consists of the Competition and Markets Authority ('CMA'), the Information Commissioner's Office ('ICO'), the Office of Communications ('Ofcom') and the Financial Conduct Authority ('FCA'). Together, they aim to create and maintain a consistent approach to regulating issues such as privacy, competition and data protection that arise within a rapidly developing digital landscape.
The Forum has three goals:
DRCF's collaboration goal includes:
To assess algorithmic systems effectively, and to support their appropriate development and deployment by businesses, the DRCF will:
The DRCF's plans show that there is more to come for algorithm auditing. Whether and to what extent regulators in the UK converge, or diverge, in their approaches remains to be seen. But the DRCF plans suggest a great deal of convergence in how regulators understand and try to improve algorithm audits.
The DRCF has in parallel published a discussion paper on the current and future landscape of algorithm auditing and the role of regulators.
By way of background, the use of algorithms can involve a lack of transparency and a disparity of understanding between the creators and the users of a system, owing to unclear governance and the often complex inner workings of decision-making systems. It increasingly looks like some method of 'audit' or 'assurance' is needed to review algorithmic systems against their intended purpose(s), close the information gap, build trust and ensure that no harm results.
We have written about the role of AI auditing and assurance as part of the UK National AI Strategy and as part of innovation in AI. One of the three 'pillars' of the UK National AI Strategy is the development of pro-innovation governance and regulation of AI, aimed at supporting scientists, researchers and entrepreneurs in realising the benefits of AI, while also protecting consumers and citizens from potential harms. To this end, additional regulatory oversight of AI systems is expected, following a consultation on the governance of digital technologies published in March 2022.
The DRCF paper identifies three distinct types of algorithm audit, the applicability of each depending on the context:
The DRCF has also identified the creation of standards as an alternative method of audit, i.e. establishing benchmarks against which systems can be measured, which may then lead to a certification system. For example, the UK Cabinet Office's Central Digital and Data Office published a draft algorithmic transparency standard for collecting information about how government uses algorithmic tools, which we wrote about last year.
By speaking to a range of stakeholders in academia, industry, the public sector and civil society, the DRCF has identified key issues within the current landscape:
Audits can carry high financial costs, which larger organisations may be better able to absorb than smaller ones.
Changing role of auditors
The DRCF recognises the importance of regulators in the future audit landscape to ensure that the application of algorithmic processing systems is trustworthy and legally compliant. Possible roles identified for regulators include:
Regulation or self-governance?
The DRCF also recognises self-governance as one possible approach, with both pros and cons:
A middle ground?
Another approach identified by the DRCF is a ‘halfway’ system between a regulator- and industry-led approach.
One of the next steps for the DRCF is a Call for Input to help it test ideas and identify further areas of common interest for the future.
The DRCF suggests the following six roles that regulators may play in the future and invites feedback on these options:
The DRCF will accept responses to the questions until Wednesday 8 June 2022, after which a summary of responses will be published.
If you would like to discuss the potential of AI auditing and assurance, please contact Tom Whittaker or David Varney.
This article was written by Tom Whittaker.
The DRCF's workplan describes its transparency workstream as follows: "Supporting improvements in algorithmic transparency: As regulators of digital services, we want to support the use of algorithmic processing in a way that promotes its benefits and mitigates risks to people and competition. Transparency is key to this. This workstream explores ways of improving algorithmic transparency and auditing. We aim to: improve our capabilities for algorithmic auditing; research the market for third-party auditing; and promote transparency in algorithmic procurement."