Signed contribution by Yves Moreau (KU Leuven)
Digital monitoring technologies have been touted as cheap tools that can be rolled out immediately at population scale to help control the COVID-19 pandemic. However, it is clear that this pandemic is the perfect storm for promoting the further development of the surveillance state and that some surveillance tools raise major fundamental rights issues.
Controlling an epidemic is a population-level “game” that involves a myriad of unknown and random interactions between individuals. Control is achieved when, “on average,” each future patient infects fewer than one other person. This population-wide perspective is important because many forms of digital monitoring have effects at the personal level, as does repressive enforcement by authorities. Whatever combination of measures achieves epidemic control is appropriate from an epidemiological perspective, and choosing among such combinations should be based on social, ethical, or economic factors. The need for people to “follow the rules” is acute only when epidemic control is not achieved; less stringent enforcement might be acceptable, or even desirable, when the epidemic is adequately controlled. Epidemic control is a cooperative game that involves the whole population. Building trust is essential because the goal is to let the population “win” against the disease.
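The “fewer than one” threshold can be made concrete with a toy branching-process projection (an illustrative sketch only, not an epidemiological model; the parameter values are arbitrary):

```python
# Toy branching process: each case infects r_eff others on average.
# Illustrative only; real epidemics involve randomness, delays, and
# heterogeneous contact patterns that this sketch ignores.
def project_cases(initial_cases: int, r_eff: float, generations: int) -> list[int]:
    """Project expected case counts over successive infection generations."""
    cases = [initial_cases]
    for _ in range(generations):
        cases.append(round(cases[-1] * r_eff))
    return cases

# Below the threshold (r_eff < 1) the outbreak fades; above it, it grows.
declining = project_cases(1000, 0.8, 5)  # shrinks each generation
growing = project_cases(1000, 1.2, 5)    # grows each generation
```

The point of the sketch is that control is a property of the population-wide average, not of any individual's behavior: pushing the average below one is what matters, by whatever mix of measures achieves it.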
From this perspective, repression is an admission of defeat. As some of the proposed technologies are similar to those used against convicts or terrorists, it is worth reiterating that citizens and patients are not terrorists and must not be treated as suspects by default, including non-compliant individuals and those at the bottom of the social ladder (homeless people, prisoners, undocumented refugees, etc.). Moreover, lack of trust and fear of social stigma can lead individuals to be uncooperative and to avoid the healthcare system for as long as possible. Because of the automation underlying digital technologies, it is essential to guarantee that they are not turned into automated repression tools. The large number of fines already levied against the population (e.g., 500,000 fines in France during the first month of the COVID-19 lockdown) shows that such fears are not baseless.
Regarding digital technology, it is important to note that in a crisis situation the desire for a “technological fix” (i.e., an easy technological solution to a complex social problem) is hard to resist, and that negative side effects of such technological fixes are then easily overlooked. For example, the evidence of the effectiveness of digital contact tracing technology (vs. tracing by healthcare workers) is not yet fully established (1), while cases of stigmatization have already been reported (2). Self-quarantine apps could be turned into a large-scale house arrest system by illiberal regimes.
What does the GDPR say?
The key legal instrument with respect to such technologies in Europe is the General Data Protection Regulation (GDPR). Personal data related to infectious disease is medical data and therefore sensitive personal data. As a result, any processing of such data requires a legitimate basis (Art. 6) and an exemption from the prohibition on processing sensitive personal data (Art. 9). Multiple options are available for the legitimate basis (consent (6.1(a)), compliance (6.1(c)), vital interest (6.1(d)), public interest (6.1(e)), or legitimate interest (6.1(f))). Multiple options are also available for the Art. 9 exemption, but 9.2(i) stands out in this context: “processing is necessary for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health”. This grants a broad exception for the processing of sensitive personal data in the context of epidemic control. However, the data minimization and privacy-by-default principles of the GDPR still stand firm and central (Art. 5.1(c), Art. 25.1, Art. 89.1), so that no more data can be collected and processed than necessary for the legitimate purpose pursued. Moreover, the rights of the data subject (Art. 12-23) remain applicable at all times, including the right to object to the processing of sensitive data (Art. 21), leaving open the question of under what circumstances public or legitimate interests could override the data subject’s right to object.
Digital monitoring technologies
We briefly review the digital technologies that have so far been proposed to support epidemic control.
Digital epidemiological monitoring
Such systems collect data from our digital footprint, such as telecom operator data or smartphone GPS signals, and deliver only aggregate statistical results to epidemiologists and decision-makers. One such example is Google Community Mobility Reports (3), which track community activity over time across different countries and regions. While the processing of such location data is sensitive, and Google, for example, could misuse the underlying location data in any number of ways, such processing falls squarely within the purview of the GDPR. Such processing can be legitimate on public or legitimate interest grounds. Standard GDPR best practices would apply here, such as 1) collecting only relevant data, 2) anonymizing, pseudonymizing, or aggregating data as early as practical, 3) deleting personal data shortly after processing, 4) abiding by the rights of the data subject, and 5) dismantling the infrastructure after the epidemic.
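Practices 1-3 above can be sketched in a few lines. This is a minimal illustration only; the field names, the salted-hash pseudonymization scheme, and the small-cell suppression threshold are assumptions for the example, not any operator's actual pipeline:

```python
import hashlib
from collections import Counter

def pseudonymize(user_id: str, salt: str) -> str:
    """Replace a direct identifier with a salted hash. The salt is held
    separately and can be deleted to make re-identification impractical."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()

def aggregate_mobility(records: list[dict], salt: str, min_count: int = 10) -> dict:
    """Reduce raw (user, region) records to regional counts.
    Only aggregates leave the pipeline; small cells that could single
    out individuals are suppressed."""
    # Deduplicate per pseudonymized user and region.
    seen = {(pseudonymize(r["user_id"], salt), r["region"]) for r in records}
    counts = Counter(region for _, region in seen)
    # Suppress counts below the threshold before releasing aggregates.
    return {region: n for region, n in counts.items() if n >= min_count}
```

The design choice reflected here is that raw records never need to reach epidemiologists at all: pseudonymization and aggregation happen as early as practical, and only suppressed aggregate counts are released.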
Quarantine enforcement

Poland has deployed an app (4) to monitor compliance with quarantine orders. Quarantined individuals must, upon random request, submit a geotagged selfie checked via facial recognition to verify that they are staying at home. Bahrain has issued electronic bracelets linked to a smartphone app to similarly monitor house quarantines (5).
The UN Siracusa Principles (6) offer guidance as to when fundamental rights can be limited to prevent the spread of infectious diseases; in particular, such measures should use the least restrictive means available and be well supported by scientific evidence. Given that the procedures outlined in Poland and Bahrain closely resemble electronic monitoring of house arrest, it is unclear whether such a massive infringement of fundamental rights could survive constitutional scrutiny in an independent legal system. It is moreover unclear whether such procedures offer benefits over less intrusive quarantine monitoring, while they risk treating every citizen as a criminal.
Drones, thermal cameras, and facial recognition
The use of drones to monitor public spaces has been noted in China, Italy, France, and Belgium. Cameras with facial recognition software have been used to spot people not wearing masks in China. Thermal cameras have been used to spot people with a fever. In Italy, drones equipped with thermal cameras have been used to spot and fine people with a fever (7). Such solutions with a high creep factor embody the longing for the technological fix and the unadulterated control fetish of the state over its own citizens.
Digital contact tracing

Multiple initiatives have emerged to enable digital contact tracing (as opposed to contact tracing by healthcare workers): Pan-European Privacy-Preserving Proximity Tracing (8), Decentralized Privacy-Preserving Proximity Tracing (9), TraceTogether (10) (11) (Singapore), MIT SafePaths (12), COVID-Watch (13), and the initiative recently announced by Apple and Google (14). The idea is that proximity between smartphones is detected via Bluetooth and that people who have been diagnosed with COVID-19 can automatically inform all the people with whom they have been in contact. While a naïve design would centralize the GPS coordinates of all the smartphone owners in the population, it is actually possible to design the system so that contact tracing is performed fully anonymously (i.e., only the smartphone owners are aware that they have been in close contact with an infectious individual, and it is up to them to self-quarantine or to reach out to a healthcare provider). This is illustrated in the cartoon in endnote reference (15). While a privacy-preserving design might not provide all the bells and whistles desirable for epidemiological monitoring (because it does not collect location data), such monitoring can be covered by the digital epidemiological monitoring approaches discussed above. Such a privacy-preserving design should not be called a “contact tracing app”, which suggests an invasion of privacy, but an “exposure alert app” or “exposure warning app” (alerte/avertissement d’exposition, contactalarm/waarschuwing).
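The anonymous matching idea can be sketched as a toy protocol. This is an illustration in the spirit of the decentralized proposals above; real designs derive ephemeral IDs from rotating keys and add timing, authorization, and Bluetooth-layer safeguards that are omitted here:

```python
import secrets

class Phone:
    """Toy model of a phone in a decentralized exposure-alert scheme."""

    def __init__(self):
        self.own_ids: list[bytes] = []      # ephemeral IDs we have broadcast
        self.heard_ids: set[bytes] = set()  # IDs received over Bluetooth

    def broadcast(self) -> bytes:
        """Emit a fresh random ephemeral ID, unlinkable to identity or location."""
        eph = secrets.token_bytes(16)
        self.own_ids.append(eph)
        return eph

    def hear(self, eph: bytes) -> None:
        """Record an ephemeral ID overheard from a nearby phone."""
        self.heard_ids.add(eph)

    def check_exposure(self, published_ids: list[bytes]) -> bool:
        """Match published IDs of diagnosed users against local contacts.
        The match happens on the phone: the server never learns who met whom."""
        return any(eph in self.heard_ids for eph in published_ids)

# Alice and Bob meet; Carol meets no one. Alice is later diagnosed and
# (with medical authorization, in real systems) publishes her own IDs.
alice, bob, carol = Phone(), Phone(), Phone()
bob.hear(alice.broadcast())
server_list = alice.own_ids           # the only data that leaves a phone
bob_alerted = bob.check_exposure(server_list)      # True: Bob decides what to do
carol_alerted = carol.check_exposure(server_list)  # False: Carol learns nothing
```

Note what the server holds: only opaque random identifiers volunteered by diagnosed users, with no names, no locations, and no contact graph, which is what makes the "exposure alert" framing more accurate than "contact tracing".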
Also, one should be wary of the technological fix illusion. Although it is plausible that such a technology could be a useful replacement for, or complement to, interviews by healthcare workers, and although it is perceived as useful in combating COVID-19 outbreaks in South Korea and Singapore, scientific evidence of its effectiveness is still limited at this point. In addition, lapses in privacy protection by authorities have been reported (16). Therefore, the effectiveness of such solutions should be carefully monitored, and the solutions must be dismantled if ineffective.
Moreover, one should be conscious of the limitations of the technology. For example, not everyone uses a smartphone, contacts can be registered for interactions that cannot lead to infection (for example, through a window), and Bluetooth tracing cannot account for delayed contamination via contact surfaces.
Furthermore, the description of such systems mostly stops at the notification of the person who has been in contact with an infectious patient. What happens next? How long should people self-quarantine? Will they be incentivized to comply with self-quarantine by being protected from financial hardship? Will they have access to healthcare and psychological support? Will they have access to diagnostic testing to exit quarantine early? Only if such a solution is embedded in a full process can it be effective.
Finally, as in all things cryptographic, a claim of privacy features is no guarantee of actual privacy protection, and the devil is in the details of the implementation. Any design should be submitted to public scrutiny by the best cryptosecurity experts. Proposed solutions should not rely on the benevolence of government authorities, telecom operators, or tech companies. They must be fully transparent and open source, and aim at establishing an open international standard (17).
Recommendations

Comply with GDPR best practices for digital epidemiological monitoring. The obligations under the GDPR are clear, and public health exemptions do not eliminate other key provisions, such as privacy-by-default, data minimization, and the right to object.
Carefully evaluate designs of contact tracing and exposure alert apps. The devil is in the details, so involve the best cryptosecurity experts, as well as epidemiology, virology, and public health experts, in creating or selecting a design. If an app is deployed, the privacy-by-default and data minimization principles of the GDPR call for a privacy-preserving design.
Dismantle monitoring infrastructures after the epidemic. The GDPR requires purpose limitation of personal data processing. Digital epidemiological monitoring infrastructure serves no purpose once the disease stops recurring. Infrastructure that does not exist cannot be abused. Epidemiological detection infrastructure (i.e., the capacity to detect the emergence of a new disease or the reemergence of an old one) can be useful, but it is a distinct functionality with different design requirements. It should not be used as an excuse to maintain monitoring infrastructures indefinitely. Do document and archive software, so that it can be used to bootstrap future efforts.
Reject intrusive technologies, such as electronic monitoring of quarantines, drones, thermal cameras, and facial recognition. These technologies carry major social risks and damage trust and cooperation between the population and authorities.