The Amsterdam Law & Technology Institute’s team is inviting external faculty members to publish guest articles in the ALTI Forum. Here is a contribution authored by Wes W.P. Damen, Adam Harkens, Wenlong Li, Emma Ahmed-Rengers, and Karen Yeung.
***
Data protection in post-Brexit Britain
A response to the Government of the United Kingdom’s proposed reforms to UK data protection law
1. Introduction
After roughly five years of tense negotiations, the United Kingdom finally exited the European Union. Following this exit, the UK government has made clear that it views Brexit as an opportunity to enact new legislation that diverges from EU rules. One of the areas where the UK is keen to lay down different rules than the EU is the (non-?)regulation of how new technologies are to be used, and of how citizens’ personal data is handled where it is a necessary component of such technologies. On 10 September 2021, the UK Department for Digital, Culture, Media & Sport (DCMS) launched a public consultation on proposed reforms to the UK’s data protection regime. This blog post, written in response to that open consultation, presents a recap of the most interesting, and frankly harmful, propositions in the consultation document, which we examine in greater depth in our full analysis.
2. Why analyse the UK Government’s proposal
Careful eyes need to be kept on the UK’s data protection regime, in order to ensure that existing legal frameworks maintain safeguards for citizens and are appropriately future-proofed. Especially if UK data protection law starts to diverge from the EU’s standards, there is a very real threat that public and private sector uses of technologies requiring the processing of high volumes of personal data (in many cases making use of machine learning techniques) will expose data subjects to considerable harms to their fundamental rights and personal interests.
The specific proposals set out in the ‘Data: A new direction’ consultation are deeply troubling for a number of reasons, which we set out in more detail below. In short, we do not believe that an adequate and proportionate balance has been struck between the government’s desire to boost innovation and the ever-increasing need to assure citizens of appropriate protection against data harms.
We have written this response with the above in mind, and seek to express precisely where, why, and how the delicate balance between innovation and protection against data harms (a balance the existing data protection regime broadly achieves, even though certain aspects could be improved) would be negatively affected by the implementation of the proposed reforms. We believe that, whatever the possible critiques of the existing regime, the appropriate response is not to lower protection, but to ensure clarity and legal certainty regarding existing provisions, in order to strengthen protection for data subjects. This is not achieved in the proposed reforms, in which necessary aspects of data protection are in effect framed as burdens on, rather than enhancements of, democratic society. Instead, data protection is sacrificed for the sake of innovation. We therefore reject the claim that the proposed new regime successfully manages to “maintain high data protection standards without creating unnecessary barriers to responsible data use” (p. 7 of Data: a new direction).
Our response proceeds by first setting out a general overview of our main concerns, which we have organised into two categories: discursive concerns (section 3), relating to the overall framing of the proposed reforms and the inferences we can draw from it regarding the consequences for future data protection law and policy in the United Kingdom; and substantive concerns (section 4), relating to specific reform proposals. Having set out our concerns, we then respond directly to the questions posed within the consultation.
3. Discursive concerns: framing innovation as an unmitigated good while ignoring threats to human rights
The UK government describes the proposed data protection reform as a “New Direction.” This raises the question: what was the “old direction,” and why does the UK choose to deviate from it? The introductory section of the consultation document does not systematically set out the government’s reasons for wanting to deviate from EU data protection law, yet it provides clues: it mentions that “some existing rules and guidance are either too vague or overly prescriptive,” and that the “New Direction” will “deliver better outcomes for people.”
These outcomes, however, are described solely in economic terms. The new direction must unlock new economic opportunities, support vibrant competition, and establish a pro-growth regime that eases costs for businesses. This framing of the justification for a new direction is then backed up by the claim that it will bring a net direct monetised benefit of more than a billion pounds over ten years.
When we compare this to the framing of the “old” European justification for data protection law, the difference is clear. The EU GDPR has two main purposes:
1) the facilitation of the free flow of data in the internal market, based on the recognition that data processing can bring social and economic benefits,
and
2) the protection and promotion of fundamental rights, recognising that data processing is only beneficial to all if it is done legitimately and subjected to appropriate safeguards preventing harms and wrongs.
While the “New Direction” exalts the virtues of the first purpose, it fails to recognise the crucial importance of the second. The proposed UK data protection reform does not strengthen any data rights, and its justification does not refer to fundamental rights at all. The framing of the document, and the proposals derived from it, fail entirely to acknowledge or recognise that the value of data protection lies in its status as a fundamental right. Even though the condition of having one’s rights protected and promoted might be hard to capture as an “outcome,” and even harder to capture in pounds sterling, it should be at the core of any legislative reform in a society that wishes to call itself a democracy.
The consultation document emphasises the need for “responsible innovation.” We posit that responsible innovation does not primarily require the maximisation of economic growth, but rather is closely tied to lawfulness and the rule of law. This means the setting of legal standards which appropriately distribute responsibility for harms and wrongs caused by data processing.
Secondly, the document fails to recognise that lowering data protection standards in the UK may well erect serious barriers to innovation rather than remove them. The transfer and sharing of personal data from the EU to third countries depends crucially on those countries ensuring “an adequate level of protection” (Article 45 GDPR). This means that any national economy wishing to reap the benefits of these troves of data must uphold levels of data protection that the EU considers adequate. If the UK adopts rules that significantly lower data protection safeguards for its citizens, the EU is likely to consider UK protection levels inadequate. This would lead to far more uncertainty and many times the current amount of “red tape” that the UK Government is so keen on cutting. In short, the notion that data protection law stifles responsible innovation is unsubstantiated and contentious at best. This contentious narrative deflects attention from the real concerns that the UK government should be addressing if it wishes to safeguard the cross-border data sharing that is so vital to technological innovation.
A final discursive concern about the consultation document arises from poor practices in the collection and use of survey evidence and its misleading use of statistics. In particular, it references the Centre for Data Ethics and Innovation’s (CDEI) survey of public attitudes towards data sharing. Not only does this survey repeatedly phrase questions in a one-sided and value-laden manner (“How comfortable are you with data sharing by researchers ‘to improve knowledge and to help keep the public safe’?”), but the conclusions that the UK Government draws from the data are baffling, if not manipulative. Our preprint discusses this in more depth, but rest assured that if any of our undergraduate students handled data in this manner, they would fail our classes. This falls well below the level that can be expected of both civil servants and legislators, and should not be used as a foundation for legislative decision-making affecting millions.
4. Substantive concerns: objections to specific proposals
Having argued that the general framing of this consultation relies too heavily on promises of economic growth and pays too little attention to lawfulness and fundamental rights, we now turn to the specific reforms proposed within the consultation document.
In this blog post, we summarise only some key concerns. In essence, the UK Government proposes to scrap wholesale some of the most important safeguards that data subjects currently have against being subjected to data harms.
4.1 The proposed relaxing of the principle of purpose limitation
The government’s proposal (point 54 and beyond) is to significantly diminish the force of the purpose limitation principle. The principle of purpose limitation dictates that a data controller cannot use personal data in whatever way it likes, but is instead largely bound by the purposes for which it originally acquired the data. It is one of the most important principles of privacy and data protection law, both under the European Convention on Human Rights and in the EU GDPR. It is this principle that lays down the rules about which entity gets to use which data, and what the limits are to sharing it with other organisations. As such, it functions as one of the absolute pillars of the fundamental rights to privacy and data protection, and of citizens’ legal protections against harms caused by unfettered access to personal data, illegal profiling, and biased algorithmic decision-making.
The changes proposed by the UK Government risk undermining these fundamental rights and eroding much-needed and hard-fought legal protection. The UK government proposes “to clarify that further processing for an incompatible purpose may be permitted when it safeguards an important public interest”, and wishes to “confirm that further processing may be permitted, whether it is compatible or incompatible, when it is based on a law that safeguards an important public interest.”
Permitting ‘further processing’ means removing the barrier that is the purpose limitation principle, the cornerstone of privacy and data protection law. The notion that this should be permitted when it safeguards an important public interest rests on a false dichotomy: the legal protection offered by purpose limitation is itself an important public interest. The suggestion that citizens’ legal rights may be diminished in favour of (“other”?) public interests, and the idea that an important public interest should categorically override a citizen’s rights to privacy and to data protection, amount to a blatant attack on the core of the fundamental right to data protection. The strength of human rights is that they cannot be easily overridden, especially not by governments. The propositions quoted above undermine this concept of human rights at a fundamental level.
4.2 The proposed scrapping of Article 22 GDPR
Article 22 GDPR concerns the right not to be subject to automated decision-making. It is one of the most contentious provisions in the GDPR and has been the subject of extensive debate, both in its legislative history and in academia. The UK Government proposes simply to scrap the article. Its justification, however, offers no substantive or convincing reasons for the removal of this right, except that the right is difficult for the data subject to exercise and difficult for the controller to respond to. The substance of the right not to be subject to automated decision-making is rooted in the fundamental rights to due process and fair procedure, which have long been regarded as essential requirements of the rule of law and as central tenets of British administrative law. Any effort to reform this right must therefore be geared towards strengthening it, rather than eradicating it entirely.
4.3 The proposed scrapping of DPOs, DPIAs, and record-keeping requirements
The UK Government proposes to scrap the requirement to designate a data protection officer (DPO), the requirement to perform data protection impact assessments (DPIAs), and various record-keeping requirements.
We believe that the requirement to designate a DPO should not be scrapped in a blanket manner. While we acknowledge that the designation of a DPO entails costs for every organisation (and hence is considered particularly unwelcome by small businesses), there are contexts in which having a dedicated individual is highly beneficial to ensure (a) accountability for compliance with legal standards, and therefore (b) the prevention of harm in high-risk contexts (including law enforcement and other public sector bodies with coercive powers, such as immigration authorities). Further, businesses without sufficient resources for an in-house DPO should be able to avail themselves of an alternative option, such as the use of an external provider.
The legal obligation to perform and publish data protection impact assessments should also not be removed in a blanket manner. Under current data protection law, a DPIA is required only where processing is likely to result in a high risk to the rights and freedoms of individuals. In those scenarios, the time it takes to perform this procedural check is a small sacrifice in light of the possible harms of the projected processing operations. Removing this safeguard in a blanket fashion places data subjects even further on the back foot against possible data harms, in exchange for limited efficiency gains for data controllers.
Similarly, the legal obligation to maintain records of personal data processing should not be removed, particularly in high-risk contexts. To do so would risk significant data harms to relevant individuals. For example, in the context of an automated decision that may have significant adverse effects on an individual, it would be very difficult for a public authority to offer reasons for said decision – and therefore meet their legal obligations under administrative law – if adequate records of the entire decision-making process are not maintained.
4.4 The proposed changes to the supervisory authority (the ICO)
In light of the current UK Government’s previous interactions with “reforms” of supervisory bodies, such as the Parliamentary Commissioner for Standards or the chairpersonship of the media regulator Ofcom, any proposed reform to a supervisory authority must be viewed with suspicion. A supervisory authority is independent when it decides for itself what its objectives and priorities are. The UK Government’s proposal to subject the ICO to “strategic objectives and duties that the ICO must fulfil when exercising its functions” threatens the ICO’s independence. These proposals raise serious concerns about sufficient checks on government power.
5. Conclusion
None of this is to say that European data protection law cannot be improved upon. There is plenty of scope for criticising certain GDPR provisions as being too vague or barely operationalised. However, such criticism should not be used as a pretext for removing protections without due consideration of their critical role in safeguarding both individual citizens from intrusive and dangerous data-driven technologies and the democratic political culture that makes individual freedom, dignity and autonomy possible.
Wes W.P. Damen, Adam Harkens, Wenlong Li,
Emma Ahmed-Rengers & Karen Yeung
***
Citation: Wes W.P. Damen, Adam Harkens, Wenlong Li, Emma Ahmed-Rengers & Karen Yeung, Data protection in post-Brexit Britain, ALTI Forum, March 15, 2022.