On 17 June 2020, the CNIL published its position on the new so-called “smart” video systems used, in particular, to facilitate the fight against the COVID-19 pandemic (e.g. automatic temperature measurement via thermal cameras, detection of mask-wearing, or monitoring of compliance with social distancing measures).
Given the risks to privacy associated with the implementation of these video systems, the CNIL calls for vigilance and specifies the applicable rules.
While not prohibiting the implementation of these systems, the CNIL considers that most of them do not comply with the GDPR, given the absence of a specific legal framework providing the necessary guarantees, such as the one that exists for video protection on the public highway in France.
Context and purposes of the implementation of smart video systems
Smart video systems are, in principle, intended to combat the spread of the COVID-19 virus by making it possible to assess/detect the risk of contagion and take appropriate action.
Controllers often plan to deploy these systems on the public highway or in (or near) shops, public transport or workplaces.
The risks for individual freedoms
According to the CNIL, the objectives pursued by these systems are often legitimate (fighting the COVID-19 outbreak) but “their uncontrolled spread presents the risk of generalizing a feeling of surveillance among citizens, of creating a phenomenon of habituation and trivialization of intrusive technologies, and of generating increased surveillance, likely to undermine the proper functioning of our democratic society.”
It goes on to say that the massive spread of these systems could lead to a change in the behaviour of the data subjects, whether intentional or accidental.
Therefore, any systems lawfully implemented during this period must remain exceptional and proportionate to the specific objectives pursued.
The systems must comply with the data protection principles
Purposes and legal bases of the processing operations
Organisations implementing such systems, even on an experimental basis, must identify the purposes and the appropriate legal basis of the processing operations.
The legal basis may, for example, be a public interest task carried out by public authorities or a legitimate interest pursued by private bodies.
Furthermore, processing operations carried out by a competent authority, within the meaning of Article 3 of the “Police-Justice” Directive, for the prevention and detection of criminal offences require a data protection impact assessment (DPIA) subject to prior consultation of the CNIL, as well as the adoption of a regulatory text.
Processing sensitive data
When “smart” video systems involve the processing of special categories of data (e.g. biometric data or body temperature) or criminal data, the processing must be based, respectively, on an exception provided for in Article 9(2) GDPR or, unless carried out under the control of official authority, on a specific text authorising the processing of criminal data (Article 10 GDPR).
In the case of thermal images, which are considered health data, the legal basis for their processing may be either:
- an important public interest (Article 9(2)(g) of the GDPR); or
- public interest in the field of public health (Article 9(2)(i) of the GDPR).
In both cases, a specific law (of the European Union or of a Member State) must specify these public interests and authorise such a system.
Otherwise, the only remaining legal basis for processing health data is the explicit consent of the data subject. In practice, however, such consent is difficult to obtain and its validity is not always guaranteed, in particular where the processing results in denying individuals access to premises.
Necessity and proportionality of the processing operations
The use of the “smart” video system must not disproportionately infringe on privacy.
To demonstrate that the use of such a system is necessary and proportionate, the following criteria will have to be taken into account:
- there are no less intrusive means of achieving the intended purposes;
- the amount of data processed (ensuring data minimisation, pseudonymisation or anonymisation, and the absence of individual tracking);
- the duration of use and scope of the systems (e.g. number of cameras concerned, their field of view, duration of their use, length of data retention, etc.);
- reporting to data controllers.
According to the health authorities interviewed by the CNIL, some of these systems also risk failing to identify infected people: some may be asymptomatic, while others may deliberately circumvent the system by consuming products that lower body temperature without treating the causes of the fever.
Thus, the need for such intrusive technologies may be questioned if the number of people who could voluntarily or involuntarily circumvent the system is too high.
Enforcement of the individuals’ rights
The CNIL recalls that individuals’ rights over their personal data continue to apply.
However, in practice, people will not be able to exercise their right to object or will have great difficulty in doing so and will only be able to obtain the deletion of their data a posteriori.
In this regard, the CNIL also specifies that shaking one’s head to say “no” is not a satisfactory way to object to data processing, especially as smart video systems become more and more common.
Thus, if individuals cannot object to the processing of their data, a specific legal framework provided either by the European Union or by French law must frame the use of such systems.
A specific legal framework must be adopted to legitimise “smart” video systems
The CNIL points out that, unlike video protection on the public highway, the use of “smart” cameras is not currently provided for in any particular statute.
Thus, the CNIL considers that most smart video systems are unlikely to comply with data protection laws, especially if they allow the collection of sensitive data or prevent individuals from exercising their right to object. As stated above, a specific law must be adopted.