
Report: From Policy to Practice: Prototyping the EU AI Act’s Transparency Requirements

01.02.2024

In an era where artificial intelligence (AI) increasingly shapes our world, transparency and trust in AI systems matter more than ever. Transparency is also one of the foundations of the European Union's AI Act. Our recent policy prototyping project delved into the legislation, examining and testing its transparency requirements to provide valuable insights for both policymakers and professionals.


The EU AI Act contains two types of transparency requirements:

  • Instructions For Use (IFUs) for high-risk AI systems, as stipulated in article 13 AI Act.
  • Disclosure requirements for interactive AI and AI-generated content, as outlined in article 52 AI Act.

Our report is a comprehensive exploration of these requirements, providing a practical view of their implementation and implications.

Project approach and structure

The report begins with an introduction to policy prototyping, followed by a detailed look at the phases of the project. The core of our report lies in the development and evaluation of prototype IFUs and disclaimers, enriched with feedback from stakeholders. The final part of the report contains a legal analysis of articles 13 and 52 AI Act and offers a multi-dimensional view of the implications of the AI Act. (N.B.: we relied on the European Commission's proposal and the compromise text of the Council of the EU.)

Key Findings - Instructions For Use (IFUs):

Our findings emphasize the importance of:

  • Tailoring IFUs to the specific professional audience.
  • Clarity, simplicity, and logical structure in the documentation.
  • Providing detailed information about system performance, data processing, and user instructions.

Key Findings - Disclaimers:

We discovered that effective disclaimers should:

  • Aim for a high degree of transparency and accessibility.
  • Use a layered structure to engage users without overwhelming them.
  • Include provisions for feedback to build trust and transparency.

Feedback on the AI Act’s Transparency Requirements:

Our participants highlighted the challenges and opportunities in complying with articles 13 and 52:

  • Small AI providers may struggle with the complexity and the resources required, as compliance calls for a multidisciplinary team.
  • The need for concrete guidelines and examples to facilitate compliance and to clarify the many unclear concepts.
  • Finding the balance between providing technical details and avoiding information overload.
  • The importance of updating legislation to keep pace with technological advancements.

Conclusion

Our project is a tool for understanding and shaping AI transparency. By bridging the gap between regulatory expectations and practical implementation, we offer a valuable resource for both policymakers and AI professionals.

Contact

thomas.gils@kuleuven.be

frederic.heymans@vub.be