
https://rtau.blog.gov.uk/2023/06/14/fairness-innovation-challenge-call-for-use-cases/

Fairness Innovation Challenge: Call for Use Cases

Building and using AI systems fairly can be challenging, but it is hugely important if the potential benefits of better AI use are to be achieved.

Recognising this, the government's recent white paper “A pro-innovation approach to AI regulation” proposes fairness as one of five cross-cutting principles for AI regulation. Fairness encompasses a wide range of issues, one of which is avoiding bias, which can lead to discrimination. 

This issue has been a core focus of CDEI since we were established in 2019. Our 2020 Review into bias in algorithmic decision making set out recommendations for government, regulators, and industry to tackle the risks of algorithmic bias. In 2021, we published the Roadmap to an Effective AI Assurance Ecosystem, which explores how assurance techniques such as bias audit can help to measure, evaluate and communicate the fairness of AI systems. 

Over this period, the issue has received increasing attention across industry, academia and government, with significant numbers of academic papers and developer toolkits emerging. However, organisations seeking to tackle bias in real-world settings continue to face a range of challenges, including:

  • Lacking access to the demographic data they need to identify and mitigate unfair bias and discrimination in their systems. We have set out the challenges in this area in more detail in a separate report also published today.
  • Understanding how to usefully apply a complex range of statistical notions of bias to understand the fairness of real-world outcomes in their particular context (see the sketch after this list).
  • Ensuring that any bias mitigation techniques used are themselves ethical and legal in the UK context.
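
To make the second of these concrete, the sketch below shows two common statistical notions of group fairness for a binary classifier, written in Python. It is purely illustrative and not part of the challenge materials; all function names and data here are hypothetical.

  import numpy as np

  def demographic_parity_difference(y_pred, group):
      # Gap in positive-prediction rates between demographic groups.
      rates = [y_pred[group == g].mean() for g in np.unique(group)]
      return max(rates) - min(rates)

  def equalized_odds_difference(y_true, y_pred, group):
      # Largest gap in true-positive or false-positive rates between groups.
      gaps = []
      for label in (1, 0):  # label 1 gives the TPR gap, label 0 the FPR gap
          mask = y_true == label
          rates = [y_pred[mask & (group == g)].mean() for g in np.unique(group)]
          gaps.append(max(rates) - min(rates))
      return max(gaps)

  # Hypothetical outcomes and predictions for eight individuals in groups "a" and "b".
  y_true = np.array([1, 0, 1, 1, 0, 1, 0, 0])
  y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0])
  group = np.array(["a", "a", "a", "a", "b", "b", "b", "b"])

  print(demographic_parity_difference(y_pred, group))      # 0.0
  print(equalized_odds_difference(y_true, y_pred, group))  # 0.33...

Even in this toy example the two notions disagree: positive-prediction rates are identical across the groups, while the error-rate gaps are not. Which notion is the right one depends on the real-world outcomes at stake, which is precisely the contextual difficulty described above.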

CDEI’s Fairness Innovation Challenge

To help address some of these challenges, CDEI plans to run a Fairness Innovation Challenge to support the development of novel solutions to bias and discrimination across the AI lifecycle. The challenge also aims to provide greater clarity about which assurance tools and techniques can be applied to address and improve fairness in AI systems, and to encourage the development of holistic approaches to bias detection and mitigation that move beyond purely technical notions of fairness.

This challenge will build on our experience running the recent Privacy Enhancing Technologies Prize Challenges (in collaboration with the US government), which brought together industry, academia, government and regulators to help drive technical innovation in a real-world context.

The challenges described above are much broader than technical ones, and we are keen to ensure that participants develop holistic solutions to them. Regulators play a key role in this area, and we’re delighted that the Equality and Human Rights Commission (EHRC) and the Information Commissioner’s Office (ICO) have agreed to support the challenge. They will help guide participants through some of the legal and regulatory issues, and will use lessons from the challenge to shape their own broader regulatory guidance in this area.

On the importance of addressing fairness in AI systems:

Stephen Almond, Executive Director, Regulatory Risk at the ICO, said: 

“The ICO is committed to realising the potential of AI for the whole of society, ensuring that organisations develop AI systems without unwanted bias. We’re looking forward to supporting the organisations involved in the Fairness Challenge with the aim of mitigating the risks of discrimination in AI development and use.”

Marcial Boo, Chief Executive at the Equality and Human Rights Commission, said: 

“Without appropriate regulation, poorly-designed AI systems have the potential to disadvantage groups with characteristics that are protected in law, such as people’s race, sex or age.

“Tech developers and suppliers have a responsibility to ensure their AI systems do not worsen discrimination. In addition, public bodies and all organisations delivering public services through AI have a legal duty to ensure that their systems promote equality of opportunity and foster good relations, preventing discrimination rather than making it worse.

“We’re working closely with the CDEI to ensure AI is used and developed with fairness in mind from the outset.”

Call for Use Cases

As we finalise the design and scope of this challenge, we are eager to hear from you on how we can shape it to be most effective.

In particular, we are keen to identify real-world use cases which could form the basis of specific challenge projects. We are today launching a call for use cases, and would welcome submissions of specific fairness-related problems faced by organisations designing, developing, and/or deploying AI systems.

To submit a use case, please use the Google form linked here. To discuss potential use cases, or to provide feedback on the challenge structure, please get in touch with the team via email at ai.assurance@cdei.gov.uk.


2 comments

  1. Comment by Ilmu Komunikasi

    Which organizations and regulatory bodies are collaborating in supporting the Fairness Innovation Challenge, and what roles do they play in addressing the legal and regulatory aspects of fairness in AI systems?


      Comment by aliciamatheson

      The CDEI are working in close partnership with the Information Commissioner’s Office (ICO) and the Equality and Human Rights Commission (EHRC) to deliver this Challenge. This partnership allows participants to tap into the expertise of regulators to ensure their solutions marry up with data protection and equality legislation. You can find out more about the Challenge and our partners on our website, including an opportunity to attend one of our briefing events in October: https://fairnessinnovationchallenge.co.uk/ Thanks.

