
Report: Panel on the supervision of AI in Belgium

31.01.2025

During the event for the fifth anniversary of the KCDS, we organised a panel on the supervision of AI in Belgium and the question of which authorities should be tasked with this responsibility. The discussion built on the recently published KCDS policy brief, which addresses this topic and provides a detailed blueprint of a potential Belgian enforcement structure for the AI Act.

To set the scene, research associate Thomas Gils opened the panel with an introductory presentation summarising the content of the policy brief.

Following this presentation, a panel discussion took place between representatives of three relevant public authorities and institutions: Jessica Godoy Chaparro (Legal attaché, FPS Economy - Belgium), Simon Verschaeren (Policy officer, Department of Work, Economy, Science, Innovation & Social Economy of the Flemish Government, WEWIS), and Peggy Valcke (Executive Board Member, Belgian Institute for Postal Services and Telecommunications, BIPT). The Belgian Data Protection Authority was also invited but unfortunately could not attend.

The panel discussion was structured around provocative statements to which the participants could reply. The statements are outlined below, together with the replies of the panelists.

First statement: a centralised enforcement structure should be preferred over a decentralised structure due to its substantive and practical benefits (incl. technical & human resources).

The FPS Economy mentioned that a balanced approach to AI oversight should leverage both existing sectoral expertise and centralised AI-specific knowledge. Existing Market Surveillance Authorities (MSAs), such as the FPS Economy, have built up sectoral regulatory expertise but may lack AI-specific expertise (e.g. on data governance, compute or cybersecurity). This was illustrated by a reference to the Belgian Federal Agency for Medicines and Health Products (FAMHP), which can provide valuable insights into AI applications in healthcare due to its background in medical devices. Centralisation should facilitate the creation of a hub of expertise (incl. personnel) to support sectoral regulators, who do have critical domain knowledge, while ensuring consistency in governance. The most effective solution likely lies in a hybrid model – one that enhances existing authorities with centralised AI expertise while preserving their deep sectoral understanding.

WEWIS agreed on the importance of sector-specific expertise, particularly when considering the integration of AI into different industries. However, AI is not only regulated by the AI Act but is also affected by a broader legal framework. According to WEWIS, the complexity is further heightened by Belgium’s state structure, which requires coordination between federal and regional authorities. To ensure legal clarity and effective enforcement, there should be a single authority with ultimate responsibility for AI oversight. However, regional governments should have the opportunity to shape AI oversight by setting specific priorities or adapting it to local needs through regional initiatives. This layered approach could strengthen regulatory coherence while allowing for tailored interventions where necessary. In summary, coordination between federal and regional authorities is crucial.

Finally, BIPT remarked that it has been analysing the same questions as those discussed in the KCDS’s policy brief, including the question of (de)centralisation, but using different criteria than those mentioned in the brief. It underscored the need for clarity, certainty, coherence, efficiency, effectiveness and innovation, while ensuring the pragmatic use of resources. Where possible, authorities should collaborate and streamline efforts to avoid duplication. The AI Act primarily governs product safety, focusing on technical standardisation, process documentation and compliance supervision. At its core, AI regulation should ensure that products meet established standards before entering the market. However, existing legal frameworks such as the General Data Protection Regulation (GDPR) and consumer protection laws remain fully relevant and should not be sidelined. BIPT did, however, take issue with the European Data Protection Board (EDPB)’s statement on the topic (i.e. that data protection authorities should also be made competent for AI Act supervision), arguing that such an approach would erode sectoral regulators’ competences. Instead, AI oversight should rely on existing MSAs under Annex I AI Act and on financial regulators, with strong central coordination, rather than defaulting to data protection authorities. In conclusion, a nuanced, multi-layered model – where sectoral regulators retain their role but operate under central coordination – should ensure both regulatory clarity and practical enforcement. This was further corroborated by a reference to the duty of sincere cooperation enshrined in article 4(3) of the Treaty on European Union, as confirmed by the Court of Justice in the Bundeskartellamt case.

Second statement: if support for AI innovation is a regional responsibility, then the supervision of AI should also be a regional matter.

The FPS Economy is exploring how the supervision landscape should be structured, starting from a federal perspective and subsequently aiming to integrate input from regional entities and authorities. It highlighted a key distinction in the statement: supervision versus innovation – two distinct yet interconnected pillars of AI regulation. While the AI Act includes innovation aspects, its primary focus remains on compliance, health, safety and fundamental rights. Innovation should be approached in a more horizontal, cross-sectoral manner. One example mentioned was the development of regulatory sandboxes, which the AI Act mandates at the national level by August 2026. These sandboxes will extend beyond the AI Act’s requirements and are already in development, with regional initiatives also playing a role. Additionally, the voices of SMEs, start-ups and other stakeholders should (and will) be heard. However, given the complexity of the issue and limited internal resources, the FPS Economy has outsourced part of this analysis to an external partner. The results of this study are expected by the summer of 2025 and will help integrate all governance elements into a cohesive framework for regulatory sandboxes.

According to BIPT, the regions must be actively involved in shaping digital regulation and should have the flexibility to emphasise specific regional priorities. This can be illustrated by cybersecurity policy, where regional initiatives – such as cybersecurity centres supporting public administrations – already collaborate closely with the federal level.

WEWIS underlined the political sensitivity of the statement, with differing viewpoints within Flanders on whether supervision should take place at the regional or the federal level. A warning about the implications of decentralisation was issued as well: if every sectoral or regional authority were to oversee AI independently, the result would be regulatory fragmentation – potentially overwhelming companies with conflicting advice and inconsistent enforcement. Businesses need legal clarity, simplicity and a single point of accountability. For the Belgian implementation, a careful balance will have to be struck in line with the subsidiarity principle. It was emphasised that this requires earlier involvement of the regions in the federal AI governance discussions. Such a proactive approach increases the likelihood of achieving a workable, well-balanced framework, reducing friction between governance levels and ensuring effective implementation across Belgium.

Concluding remarks

The debate revealed that no consensus has yet been reached on which competences would ultimately be assigned to which authority. The focus appears to lie at the federal level, with the regions having provided little to no input thus far, despite conducting their own analyses in parallel. This is a striking observation given the Flemish government’s intention to transform the Flemish Supervisory Commission (Vlaamse Toezichtscommissie, VTC) into a Flemish data authority, consolidating supervisory functions related to high-quality and privacy-secure data management (as outlined in the 2024-2029 Coalition Agreement). While collaboration with the federal level remains the goal, this move indicates the Flemish government’s intention to strengthen the role of regional authorities. In the meantime, it also became clear that BIPT would act as the central authority under the AI Act (as foreseen by the Federal Coalition Agreement). Although all speakers underlined the importance of cooperation, in practice there appears to be little communication between the different levels.

At the same time, there appeared to be agreement on avoiding regional fragmentation. All participants seemed to agree that it would not be in the interest of stakeholders to make them navigate an even more complex regulatory landscape. Another commonly recognised challenge is the potential scarcity of the required expertise within (regional) authorities, as the same specialised profiles would be sought by an increasing number of authorities, exacerbating recruitment difficulties. In that regard, an apparent disparity in resource availability was observed: while BIPT signalled that it possesses the necessary capacity to integrate the AI Act into its work, the FPS Economy outsourced part of its analysis due to limited resources. At the same time, each entity emphasised its own area of expertise, with BIPT asserting that it has sufficient in-house knowledge to address AI-related supervisory challenges, while WEWIS highlighted the continued importance of regional input in shaping effective oversight.

Finally, there is no apparent political agreement yet on the list of relevant actors under article 77 AI Act (i.e. the role of fundamental rights authorities), although preparatory work has already been undertaken. It remains clear that significant uncertainty persists regarding the final allocation of responsibilities under the AI Act.

Download the policy brief on AI Act supervision