policy monitor

Belgium – unlawful use of facial recognition by federal police

In February 2022 the Belgian Supervisory Body for Police Information (Supervisory Body) issued an audit report on the use of Clearview AI's facial recognition software by the federal police. The audit was prompted by the discovery that, during a Europol Victim Identification Taskforce in 2019, two police officers had used the software and had also continued to conduct searches after the Taskforce ended. In its report the Supervisory Body was very critical, finding that "the use of the Clearview AI facial recognition technology is not legal and was therefore neither authorised nor necessary".

What: Administrative decision

Impactscore: 1

For whom: law enforcement agencies, public sector organisations

URL: https://www.controleorgaan.be/...


According to its own statements, Clearview AI is accessible only to law enforcement agencies. It offers a database of facial images sourced from "public-only web sources, including news media, mugshot websites, public social media and other open sources". Users can upload images to compare them against the images in Clearview AI's database.

Firstly, the Supervisory Body stressed that the nationality of those involved, and whether or not they are part of a 'Belgian' investigation file, is not relevant. It confirmed that this processing of personal data falls under the responsibility of a Belgian police service. So even if the federal judicial police applied Clearview AI's facial recognition to non-Belgian citizens, a legal basis for the use of facial recognition technology is required.

Secondly, the use of facial recognition technology is not (specifically) regulated in the Act on the Police Service (PSA). Article 44/1 § 2, 1° describes the legal basis for the processing of 'biometric data' in very general terms, with the aim of unambiguous identification of, among others, suspects of a criminal offence and missing persons. However, it does not provide a sufficient legal basis for the form of facial recognition technology offered by Clearview AI.

Moreover, the Supervisory Body refers to its report on the facial recognition case of Brussels Airport and clarifies that the Clearview AI case differs in certain (essential) aspects:

  • Facial recognition was not applied to camera footage of public places. Instead, already available photographs or image material were used, making it a targeted use of facial recognition.
  • The federal police relied on a third party, Clearview AI, a private and commercial entity based in the United States, for the provision of facial recognition technology. Personal police data was transferred to a party in a third country without it being clear whether the recipient ensured an adequate level of protection or appropriate safeguards.
  • Belgian police forces transferred, or at least made available, information and personal data to a third private recipient, a transfer not regulated in the PSA. Such a transfer was not permitted (irrespective of whether the company was a Belgian or EU institution or company, let alone a US company).
  • A user of the Clearview AI application has no control whatsoever over the processing of biometric data. The photographs and images are uploaded via a URL, thereby transferring the photographs and images to the US company Clearview AI. The photographs and images, including the biometric data (the template containing the unique personal data), are sent and processed outside the police environment (and outside the EEA). The police entity that provides the photos and images therefore has no control whatsoever (any more) over the processing of the biometric data, nor over the further processing by the recipient.

What both cases do have in common is that both took place in a so-called 'test environment'. However, there are no derogations from the legal framework for a test phase or pilot project, meaning that both the PSA and the Belgian Data Protection Act had to be complied with.

Furthermore, the Supervisory Body stressed that it must be taken into account that police photos are being transferred to a commercial company and that biometric data is being stored by this company. The processing of these pictures and images forms the business model of Clearview AI. Additionally, it is a misconception to think that only photos of perpetrators or victims are being stored, since the uploads concern photographs that are part of police files. Not only photos of potential offenders, but also of victims (and even witnesses or bystanders), thus end up in the hands of a company that builds and optimises its existence and profits on the basis of highly sensitive information.

The Supervisory Body concluded by recommending that awareness be raised among staff (and managers) about the use of open source intelligence in the context of the applicable general legal framework, and data protection law in particular. It also ordered the police to ensure that Clearview AI deletes the submitted pictures from its database, as well as the related biometric processing results. Finally, it issued a warning about future use: "every future (potential) use of the Clearview AI facial recognition technology, a similar application or the use of a similar database is unlawful."