How AI assurance can support trustworthy AI in recruitment

Today, DSIT’s Responsible Technology Adoption Unit (RTA) is pleased to publish our guidance on Responsible AI in recruitment. This guidance aims to help organisations responsibly procure and deploy AI systems for use in recruitment processes. The guidance identifies key considerations …

Protecting Model Updates in Privacy-Preserving Federated Learning

Categories: Data, Data collection, Data-driven technology, Data-sharing, PETs Blogs

In our second post, we described attacks on models and the concepts of input privacy and output privacy. In our previous post, we described horizontal and vertical partitioning of data in privacy-preserving federated learning (PPFL) systems. In this post, we …

Algorithmic Transparency Recording Standard: Getting ready for adoption at scale

Categories: Algorithms, Transparency

Today, the Responsible Tech Adoption Unit (RTA) in DSIT and the CDDO (Central Digital and Data Office) are launching updated products to better support public sector organisations in using the Algorithmic Transparency Recording Standard (ATRS). This includes a new gov.uk …

Data Distribution in Privacy-Preserving Federated Learning

Categories: Data, Data collection, Data-driven technology, Data-sharing, PETs Blogs

This post is part of a series on privacy-preserving federated learning. The series is a collaboration between the Responsible Technology Adoption Unit (RTA) and the US National Institute of Standards and Technology (NIST). Learn more and read all the posts …

Privacy-Preserving Federated Learning: Understanding the Costs and Benefits

Categories: Data, Data collection, Data-driven technology, Data-sharing

Privacy Enhancing Technologies (PETs) could enable organisations to collaboratively use sensitive data in a privacy-preserving manner and, in doing so, create new opportunities to harness the power of data for research and development of trustworthy innovation. However, research DSIT commissioned …

How the ‘Introduction to AI assurance’ guide is supporting government’s innovative approach to AI regulation

Categories: Assurance

DSIT’s Responsible Technology Adoption (RTA) Unit is pleased to publish its Introduction to AI assurance. This guidance is an accessible introduction that aims to help organisations better understand how AI assurance techniques can be used to ensure the safe …

Privacy Attacks in Federated Learning

This post is part of a series on privacy-preserving federated learning. The series is a collaboration between CDEI and the US National Institute of Standards and Technology (NIST). Learn more and read all the posts published to date on the …

The UK-US Blog Series on Privacy-Preserving Federated Learning: Introduction

This post is the first in a series on privacy-preserving federated learning. The series is a collaboration between CDEI and the US National Institute of Standards and Technology (NIST). Advances in machine learning and AI, fuelled by large-scale data availability …

Championing responsible innovation: reflections from the CDEI Advisory Board

The Centre for Data Ethics and Innovation leads the Government’s work to enable trustworthy innovation using data and artificial intelligence. At the CDEI, we help organisations across the public and private sectors to innovate, by developing tools to give organisations …