UnBias Toolkit

This toolkit, created by and for young people, aims to share young people's online experiences with policy makers, regulators and the ICT industry.

The toolkit was developed within the framework of the Engineering and Physical Sciences Research Council (EPSRC) project “UnBias: Emancipating Users Against Algorithmic Biases for a Trusted Digital Economy”.

What you should know before reading further:

  • Target audience: young adults (internet users), IT providers, policy makers, regulators, and others
  • Process phase: problem analysis and ideation; evaluation and iteration
  • System component: entire application; users and other stakeholders (policy makers, IT providers, ...); context of the AI system
  • Price indication: freely available


The toolkit consists of several elements:

1. Handbook

2. Awareness Cards

3. TrustScape

4. MetaMap

5. Value Perception Worksheets

You can use individual parts of this toolkit or organize the entire workshop with young people, depending on the purpose of your AI system. The exercises can also be done with the team working on the application, even though they are not the original target group. A dedicated manual for workshop organizers can help you adapt the workshop to your needs.


The workshop does not produce a direct result in the form of an evaluation. Rather, it raises awareness of the ethical questions that young people ask themselves. When a development team uses the toolkit, these ethical reflections can help them gradually adjust the AI application accordingly.

The following ethical principles apply:

  • bias and prejudice
  • discrimination
  • trust
  • justice

This tool was not developed by the Knowledge Centre Data & Society. We describe it on our website because it can help you deal with the ethical, legal or social aspects of AI applications. The Knowledge Centre is not responsible for the quality of the tool.