How can you avoid replicating societal biases, prejudices and structural disparities in your AI healthcare system? To help you with this, the Knowledge Centre Data & Society developed the AI Blindspots healthcare card set.
How can you take possible prejudices and structural inequalities into account before, during and after the development of your AI application? To help you with this, the Knowledge Centre Data & Society developed the AI Blindspots card set.
The Aequitas tool audits your project for bias: it analyses whether prejudices are present in the data and in the models you use. You can run the audit via the desktop or the online tool.
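To illustrate the kind of group-level check such a bias audit performs, here is a minimal Python sketch (not the Aequitas API itself) that compares each group's selection rate against a reference group — the "disparate impact" ratio that audit tools commonly report. The data, group names and the 0.8 threshold are illustrative assumptions.

```python
from collections import defaultdict

def selection_rates(records):
    """Share of positive decisions per group.

    records: iterable of (group, decision) pairs, decision 1 = positive.
    """
    totals = defaultdict(int)
    positives = defaultdict(int)
    for group, decision in records:
        totals[group] += 1
        positives[group] += decision
    return {g: positives[g] / totals[g] for g in totals}

def disparate_impact(records, reference_group):
    """Ratio of each group's selection rate to the reference group's.

    Ratios well below 1.0 (a common rule of thumb flags values
    under 0.8) suggest the decisions may disadvantage that group.
    """
    rates = selection_rates(records)
    ref = rates[reference_group]
    return {g: rate / ref for g, rate in rates.items()}

# Toy data: group "A" is approved 3 of 4 times, group "B" only 1 of 4.
data = [("A", 1), ("A", 1), ("A", 1), ("A", 0),
        ("B", 1), ("B", 0), ("B", 0), ("B", 0)]
print(disparate_impact(data, reference_group="A"))
# → {'A': 1.0, 'B': 0.333...}: group B's ratio falls far below 0.8.
```

A real audit with Aequitas adds many more metrics (false-positive and false-negative rate disparities, for example) and reports them per protected attribute, but the basic idea is the same: compare outcomes across groups against a reference.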
This toolkit, created by and for young people, is meant to share young people's online experiences with policy makers, regulators and the ICT industry.
With this Data Collection Bias Assessment form, you make a few deliberate choices at the start of data collection so that you can detect possible biases at an early stage.