Dr Maria Grazia Porcedda, Trinity College Dublin
On 9 March 2020, the day when Italy announced a full lockdown, the Guardian featured an article by Lily Kuo titled ‘“The New Normal”: China’s Excessive Coronavirus Public Monitoring Could Be Here To Stay’. The article questions whether the pandemic has given the Chinese government a pretext for accelerating mass surveillance, and for retaining such an approach beyond the end of the crisis. In the darkest hour of the lockdown many were wondering whether that was an example to follow. The answer to the question of whether the European Union (EU), and Ireland within it, could follow a path similar to China’s is contained in data protection laws. Spoiler alert: the answer is no.
Data protection was born, as it were, in the 1960s, and its profile was raised, twenty years on, by a number of legislative interventions at EU level. The now-repealed Directive 95/46 made data protection an object of compliance, the Charter of Fundamental Rights of the European Union elevated it to a right, and the General Data Protection Regulation turned it into a topic of broad public debate. The journey of data protection was anything but linear; there were setbacks matched by fierce legal battles, aimed at obtaining recognition that the abuse of personal data impinges on people’s dignity. Those battles addressed challenges as diverse as data-driven efforts to fight terrorism, the dragnet collection of personal data by the NSA, and Cambridge Analytica. A few battles started in Ireland, notably challenges to the legality of the Data Retention Directive and to the transfer of data to the United States of America, which sparked references for preliminary rulings leading to landmark pronouncements by the Court of Justice of the European Union.
Both the comprehensive legislation and the experience of resisting data-thirsty emergencies provide the legal and policy framework setting the background for data processing for the purpose of containing the pandemic. Covid-19 related operations cover all possible forms of processing, i.e. digital and manual, targeted and large-scale; types of data, e.g. location (though more often proximity) and medical; and purposes, e.g. health and security. The devil is, as is known, in the detail, and the analysis of different combinations of types of data, forms and purposes of processing will be covered as part of the long-term work of the Observatory.
The public debate has tended to focus on “the processing of personal data wholly or partly by automated means” (Art. 2(1) GDPR), namely tech-based solutions, such as Bluetooth-enabled contact tracing apps. This is justified by the greater risks to the rights and freedoms of individuals inherent in digital technology, as learnt through the many emergencies and scandals of the past 20 years. Thus, there has been an abundance of interventions to guide the development of apps, such as the Principles for legislators on the implementation of new technologies signed by some of the most prominent Irish data protection experts.
Conversely, there has been relatively little attention to the second limb of Art. 2 (1) GDPR, which defines its material scope, namely “processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system”. This is also not surprising, because manual processing easily slips under the radar. Yet, diffuse unlawful processing operations can represent as big a challenge to the rights and freedoms of individuals whose data are being collected as the use of apps, not least because these operations arguably pose real challenges for enforcement.
During a recent ICEL webinar, Deputy Commissioner Colum P. Walsh commented on the challenges of addressing the many concerns of organisations and businesses that had to transition quickly to remote working. DBEI’s Return to Work Safely Protocol features dedicated data protection guidance, buttressed by guidelines issued by the Data Protection Commission.
However, guidance issued to retail in preparation for phase 3 of easing the lockdown seems to have taken the opposite direction. The document features a questionnaire that was incorporated into sectoral body guidance (at p. 17), to the effect that those wanting a much-needed haircut face medical screening through a written questionnaire. The questionnaire raises fundamental data protection issues. As they currently stand, these questionnaires fail to meet the principles laid down in Art. 5 GDPR, including lawfulness, minimisation, storage limitation, and integrity and confidentiality. They lack information notices, in defiance of the principle of transparency (Arts 5, 12 and 13 GDPR); as a result, data subjects are not informed of their rights, which form part and parcel of the definition of the right to the protection of personal data. This is all the more worrying as the questionnaire collects data concerning health, which are special categories of personal data deserving reinforced protection (Art. 9 GDPR).
By recommending that their members collect such information, sectoral bodies are potentially sending businesses down a dangerous path, exposing them to the hefty fines attached to the violation of their duties as data controllers (Art. 83 GDPR). Even if the manifest faults of these questionnaires were set straight, however, they raise deeper questions owing to the type of data they aim to collect. The Data Protection Commission has not ruled out the use of questionnaires, but clarified that they must be used only when strictly necessary and proportionate, taking into account the specific circumstances of the organisation using them. As Prof. Orla Lynskey noted during the abovementioned ICEL webinar, the notion of necessity is not settled. Arguably, however, having a haircut would fail to meet even the lowest threshold of necessity (and proportionality) for the sake of collecting health data. Furthermore, the questionnaire is unlikely to be useful as a screening mechanism.
These DIY contact-tracing and screening initiatives, no doubt well-meaning and intended to reassure the public, may run counter to painstaking efforts to get apps, and other tech-based solutions, right. Moreover, forcing customers either to fill in a questionnaire or to renounce a haircut reminds us of the take-it-or-leave-it tracking policies that we already encounter daily online. There is surely no need to further entrench such toxic practices, and the power imbalances they rest on, in the real world. One possible lesson is that the journey of personal data protection is not over. While legal and judicial interventions have created a burgeoning infrastructure to protect individuals from the adverse consequences stemming from the misuse of personal data, data protection still needs to be ‘entrenched’ in society. The next step, after the pandemic has been addressed and fears have healed, may be to work towards a ‘data etiquette’, where the protection of data happens not only in court but also in daily life, much like we no longer smoke indoors because we know it is both unlawful and toxic.
Dr Maria Grazia Porcedda is Assistant Professor of Information Technology Law at Trinity College Dublin.
Suggested citation: Maria Grazia Porcedda, ‘Under the radar: lessons from ordinary data processing in easing pandemic lockdown’ COVID-19 Law and Human Rights Observatory Blog (3 July 2020) http://tcdlaw.blogspot.com/2020/07/under-radar-lessons-from-ordinary-data.html