Protecting Trained Models in Privacy-Preserving Federated Learning
Find out how output privacy can protect individuals' data after it has been used to train models.
This post is part of a series on privacy-preserving federated learning. The series is a collaboration between the Responsible Technology Adoption Unit (RTA) and the US National Institute of Standards and Technology (NIST). Learn more and read all the posts …
In our second post we described attacks on models and the concepts of input privacy and output privacy. In our previous post, we described horizontal and vertical partitioning of data in privacy-preserving federated learning (PPFL) systems. In this post, we …
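One common way to provide output privacy in a PPFL setting is to apply differential privacy when aggregating client model updates: each update is clipped to bound its influence, and calibrated Gaussian noise is added to the aggregate. The sketch below illustrates the idea only; the function name, parameter values, and noise calibration are illustrative assumptions, not a specific system's API.

```python
import numpy as np

def dp_aggregate(client_updates, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Illustrative differentially private aggregation of model updates.

    Each client update is clipped to `clip_norm` (bounding per-client
    sensitivity), the clipped updates are summed, Gaussian noise scaled
    to that sensitivity is added, and the result is averaged.
    """
    rng = rng or np.random.default_rng(0)
    clipped = []
    for u in client_updates:
        norm = np.linalg.norm(u)
        scale = min(1.0, clip_norm / (norm + 1e-12))  # shrink if too large
        clipped.append(u * scale)
    total = np.sum(clipped, axis=0)
    # Noise scale is proportional to the per-client sensitivity (clip_norm).
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(client_updates)

# Toy usage: three clients, four-dimensional updates.
updates = [np.ones(4), 2 * np.ones(4), -np.ones(4)]
print(dp_aggregate(updates))
```

The clipping step is what makes the noise calibration meaningful: without a bound on each client's contribution, no finite amount of noise yields a differential privacy guarantee.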