Public attitudes on the fair use of data and algorithms in finance

Categories: Algorithms, Bias, Data collection, Decision-making

Financial companies are increasingly using complex algorithms to make decisions regarding loans or insurance - algorithms that look for patterns in data which are associated with risks of default or high insurance claims. This raises risks of bias and discrimination …

The ethics of contact tracing apps: International perspectives

Categories: Contact tracing, Covid-19, Data-sharing, Ethical innovation

The CDEI recently hosted a virtual roundtable with people from nearly 10 different countries who are working on contact tracing apps. The discussion focused on public trust and how it can be built while working at speed to develop and …

How is the CDEI supporting the response to COVID-19?

Categories: Covid-19, Data-sharing, Ethical innovation

The CDEI’s mission is to ensure the UK maximises the benefits of data-driven technologies, that those benefits are fairly distributed across society, and that the conditions exist for ethical innovation to thrive. Now, more than ever, this mission is paramount …

In a world biased against women, what role do algorithms play?

Categories: Algorithms, Bias

Recent reports suggest 9 out of 10 people are biased against women in some way. We wanted to mark International Women’s Day this year by talking about bias in a world of data-driven technology and artificial intelligence, and our forthcoming report on bias in algorithmic decision-making.

Creating an environment for ethical innovation to thrive

Categories: Ethical innovation

Advances in technology should be something that everyone can look forward to, or, at a minimum, not something that causes active worry. Whilst innovation is to be encouraged, innovation alone is not enough. It needs to be ethical innovation, or none at all.