Article 4 of the AI Act and the definition of AI literacy
Article 4 of the AI Act requires providers and deployers of AI systems to ensure a sufficient level of AI literacy among the people who operate and use these systems on their behalf. AI literacy is defined in Article 3(56) of the AI Act as the skills, knowledge and understanding that allow providers, deployers and affected persons to make an informed deployment of AI systems and to gain awareness of the opportunities, risks and possible harms of AI. It is important to note that the Article 4 obligation applies not only to staff operating AI systems on behalf of the provider or deployer, but also to other persons dealing with the operation and use of those systems on their behalf, such as clients and contractors.
Compliance with Article 4
The AI Office will adopt a flexible approach to AI literacy programs, determining training needs on a case-by-case basis. Minimum requirements include:
- General Understanding of AI: Staff and others must grasp what AI is, how it functions, and its benefits and risks. For instance, employees using ChatGPT should be aware of potential issues like hallucination.
- Understanding the Organization's Role: Staff and others should understand the organization's role as a provider or deployer of AI systems and the responsibilities that come with it.
- Awareness of AI Risks: Providers, deployers, staff and others must understand the risks associated with the AI systems they work with and whether additional mitigation measures are needed. Organizations using high-risk AI systems, for example, may need to organize additional training.
- Design of AI Literacy Programs: Programs should be tailored to the staff's technical knowledge and the context of AI system use, considering industry-specific risks and purposes.
The AI Office will not impose strict training requirements or require certificates from providers, and organizations are not required to formally assess their employees' AI knowledge. It is, however, recommended that organizations maintain internal records of training to demonstrate compliance with Article 4.
Enforcement of Article 4
The obligation for AI literacy took effect on 2 February 2025, with supervision and enforcement starting on 2 August 2026. National market surveillance authorities, to be designated by 2 August 2025, will oversee compliance. Sanctions for violations will be based on national legislation and must be proportionate to the severity of the violation. Compliance will be ensured through both public and private enforcement: national market surveillance authorities can impose penalties and other measures to address violations, while individuals can claim damages under national law for harm resulting from non-compliance with Article 4.
AI Office approach to AI literacy
To support Article 4, the AI Office provides resources such as practical examples, webinars, and a living repository of AI literacy practices. The living repository is a collection in which organizations that have signed the AI Pact can choose to share best practices and expert advice on AI literacy programs, after review by the AI Office for transparency and reliability. Future plans include a dedicated web page on AI literacy as well as information and guidelines from national market surveillance authorities. Notably, no EU guidelines on AI literacy will be published.
Other useful resources
For everyone:
For SMEs:
For the education sector: