Protecting Trained Models in Privacy-Preserving Federated Learning
Find out how you can protect individuals' data after it has been used to train models through output privacy.
This post is part of a series on privacy-preserving federated learning. The series is a collaboration between the Responsible Technology Adoption Unit (RTA) and the US National Institute of Standards and Technology (NIST). Learn more and read all the posts …
In our second post, we described attacks on models and the concepts of input privacy and output privacy. In our previous post, we described horizontal and vertical partitioning of data in privacy-preserving federated learning (PPFL) systems. In this post, we …
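As a rough illustration of output privacy, the sketch below shows one common approach: applying differential privacy to the aggregated model update in a federated setting, so that the released model limits what can be inferred about any single participant's data. This is a minimal sketch under assumed settings, not the method described in this series; the function names and parameters (clip_norm, noise_multiplier) are illustrative.

# Minimal sketch of output privacy via differential privacy in federated
# averaging. Assumed example; names and parameters are illustrative.
import numpy as np

def clip_update(update, clip_norm):
    # Bound each client's influence by clipping the update's L2 norm.
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / (norm + 1e-12))

def dp_aggregate(client_updates, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    # Average the clipped updates, then add Gaussian noise scaled to the
    # clipping bound, so the released update reveals little about any
    # single client's data.
    rng = np.random.default_rng() if rng is None else rng
    clipped = [clip_update(u, clip_norm) for u in client_updates]
    mean_update = np.mean(clipped, axis=0)
    noise_scale = noise_multiplier * clip_norm / len(client_updates)
    noise = rng.normal(0.0, noise_scale, size=mean_update.shape)
    return mean_update + noise

# Example: aggregate three clients' toy updates into one noisy update.
updates = [np.random.randn(5) for _ in range(3)]
private_update = dp_aggregate(updates)
print(private_update)

Clipping controls the sensitivity of the aggregate to any one client, and the noise added on top of the average is what provides the formal output-privacy guarantee.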