And so this is Geekmas…
This is my obligatory “GDPR at 5 years” post. Everyone will be doing one today, as it is the 5th anniversary of the General Data Protection Regulation taking effect. Some of the posts will be positive, extolling the virtues of the impact the legislation has had globally. Others will be negative and critical, highlighting the failures of regulators (with lots of focus on one particular regulator, but my experience of others is also crappy, so I’d suggest glass houses/stones should be a mantra here). Some will discuss the impact of litigation over the past few years.
This one will be written along the structure of John Lennon’s famously uplifting Christmas song, “Happy Xmas (War Is Over)”, because that tune always lifts my spirits at Christmas. In the same way as an anvil soars like an eagle in the summer sky.
And so this is Geekmas (and what have you done)?
Five years in, what have we achieved? While there is an understandable focus on large technology companies and transfers to the United States, there seems to be a degree of forgetting that data protection law is about a whole lot more than that.
It’s about the trainee lawyer whose bank has screwed up his credit record so badly he can’t actually meet the fitness and probity tests to get a practising certificate. It’s about the IT service provider in Zimbabwe or Ghana who wants to understand how to trade with European businesses to capitalise on the freshly minted data protection laws in various sub-Saharan countries. It’s about the survivor of a Mother and Baby Home who has managed to get access to the data about their family origins that had previously been locked away as a dark secret.
The true meaning of Geekmas
But, just as Christmas has become more about the Christmas Jumpers, Late Late Toy Show, tinsel, and the risk of sexual harassment litigation after the Christmas Party, Data Protection Geekmas has become (for some) more about the performative dance of documentation and checklists and the false idol of fines as a way of keeping score. The (hopefully) unintended consequence has been the building of a lovely shiny transparent wall around the data of Europeans: a wall that, however shiny and transparent, is still somewhat wall-like in its nature. This has led, inevitably, to a race between the ladder manufacturers in Big Tech/Big Law, who sought out loopholes and mechanisms to get things done, and the legislators, litigators, regulators, and the CJEU, who have interpreted and evolved the law (which is their job).
This has diluted the focus and attention given to the less headline-friendly small data stories and to the importance of day-to-day data protection for the individual. It has also effectively shut off Europe as a potential market for emerging industries in developing nations: they don’t have an adequacy decision, they have different constitutional and cultural traditions to Europe, and many of them have a recent history of less-than-spotless human rights records.
If I have to bring ethics into this, we’re going full-on Kant, OK?
Deontological ethics tells us that we should take care not to act on any maxim that we would not want to become a universal proposition. So, if we are going to shut down transfers to the United States because of the existence of a risk of access/interception by intelligence agencies or law enforcement without appropriate safeguards or right of redress, then we will need to accept that this will slam the door in the face of any other country, be it a developed or a developing economy, that wants to provide data processing services to EU companies or organisations.
While this might be one way we stop the abuse of knowledge workers in Africa who are paid a pittance to do the human feedback and labelling work that reinforcement learning in AI systems depends on, it does have the unintended consequence of potentially harming the chances those knowledge workers may have of actually developing an indigenous IT/Information Services sector in their countries, potentially leading to a brain drain to Europe or elsewhere.
Oh… and if the shiny transparent wall doesn’t take account of how the internet and the World Wide Web work down in the bowels of the technology, a lot of our nice first-world things will eventually start to break.
(Maybe the brain drain will have to go the other way.)
Another Regulation Over (and a new one just begun)
And, at this Geekmas time of reflection, we need to think of all those ghosts of Regulations Yet to Come. The AI Act has many flaws as it lurches through the legislative process, but it dangles the same promise of a ‘risk-based’ approach to regulating emerging AI tools as was apparently part of the Ghost of Data Regulation Past. If the interpretation and application of this legislation follows the same arc as that which has gone before, we may find ourselves with a strong case of digital agoraphobia: terrified to venture outside our shiny transparent walls because of the risk of unintended consequences.
And don’t get me started on the sheer brain-numbing hypocrisy that is the emerging war on encryption and the CSAM proposals. Remember: the CJEU and the EDPB have just pulled the shutters down on the use of SCCs for transfers (by Facebook) to the United States because of issues with safeguards and oversight, and the use of SCCs to other jurisdictions will inevitably hinge on the deployment of technical controls (because organisational controls can’t eliminate risk) such as encryption, which the Commission and various MEPs want to backdoor/turn off/bypass. The CSAM proposals, which suggest that ALL content must be scanned IN CASE some of it MIGHT be an undesirable form of content, represent mass surveillance on a grand scale, ‘just in case’. Mass surveillance not dissimilar to the kind that has already been struck down repeatedly by the CJEU for being beyond what is necessary and proportionate in a democratic state.
Again, deontological ethics says that we should act only on the maxim that we can accept as a universal proposition. There is either encryption or there is not. There is either a safeguard, or there is not. And if there is a backdoor into encryption, there is a non-zero chance that that backdoor will be abused by authorities or found by malicious actors. If there is a list of undesirable content, there is a non-zero chance that it will grow over time to include “sharing satirical memes about the government”.
A non-zero chance of a certainty.
And for anyone who thinks that that wouldn’t happen, here’s a vignette from Ireland’s very recent past involving a Prime Minister, the National Gallery, and some satirical paintings. And for anyone who thinks the non-zero chance is still low, remember that the decision on SCCs and Meta hinged on the existence of a non-zero chance of an interference with the essence of a right.
So, let’s not celebrate GDPR when the Commission, Parliament, and Member States are saying the quiet bits out loud about ripping the guts out of the fundamental principles that underpin GDPR and the rights to Privacy and Data Protection in the Charter of Fundamental Rights.
Of course, if we want the measure of ‘meaningful equivalent protections’ to include ‘does the government monitor all social media communications, including encrypted messaging, for content that the government deems undesirable’, that might actually solve the whole cross-border transfer dilemma to various countries.
And so this is Geekmas (I hope you had fun)
Five years ago, lots of people went on three-day courses and became EXPERTS. And many of them have grown into the title. Others have fallen by the wayside. Back then, I was quietly cautioning people that:
- Fines would be unlikely to appear for a few years and wouldn’t be a massive motivator for behaviour change.
- Turning the ship of an organisation and fixing the data culture so we had data protection by design/default was a 2-3 year journey, not ‘instant pudding’.
- The devil would be in the detail of decisions and the development of the half-baked consistency mechanism.
I do hope the people who thought this was about checklists and technology tools to magically do things have had fun with that. But the lesson of Geekmas needs to be that it’s about PEOPLE.
A very merry Geekmas (and a Happy New Year)
Don’t let my boyish good looks and George Clooney charm fool you… I’ve been doing this data thing a long time. So I’ve earned the right to be a little cynical. I worry that we are at a liminal point where the pursuit of perfection may result in the sacrifice of the good, undermining faith in legislation, rights, and the fundamental core objectives of data protection law going back to its origins in the 1970s (and indeed before that).
That pursuit of perfection as opposed to protection, coupled with the mind-bending hypocrisy of emerging proposals, may ultimately serve to kill the spirit of Geekmas.
What will your New Year’s Revolution be?