Yesterday afternoon, Facebook’s most recent whistleblower, Frances Haugen, spoke to the Oireachtas Joint Committee on Tourism, Culture, Arts, Sport and Media on online disinformation and media literacy (Disinformation and misinformation on online platforms).

I listened in with interest, not just because I’ve been publicly critical of Facebook’s ethics washing and exploitative practices since 2016 (that whitepaper almost never got published, because just as I was about to finish, Facebook did another awful thing!), but also because I’d been asked to join Matt Cooper on The Last Word to comment on Haugen’s critical statements about the Data Protection Commission. Perhaps unsurprisingly, given world events, the discussion of Ukraine and sanctions against Russia ran over, and Matt Cooper ran out of time for much discussion of regulating online harms. So, I’m expanding here on what I didn’t get to say.

There was a lot in the discussion, but I suspect a lot of the sound bites for commentary are going to be around the substanceless comments Haugen made about “needing to learn lessons from the criticism of the DPC”, rather than some of the more interesting substantive issues within her realm of expertise.

This Oireachtas committee wasn’t necessarily the best forum for her area of expertise, and I would very much like to hear more about some of the things Haugen tried to emphasise as important, rather than hearing her answer wrong-headed leading questions from legislators about ID verification and banning anonymity on social media.

Profits vs Ethical Action

Haugen rightfully puts a lot of emphasis on Facebook (now Meta)’s pursuit of profits over public safety. The harms she saw in this pursuit were why she left Facebook and became a whistleblower. This is a wicked problem, and its root cause is the profit-focused shareholder value model for business: a model which treats monetary profits (particularly quarterly profits) as the key metric of organisational success, rather than considering other benefits or value to the organisation’s stakeholders (stakeholder theory), or a focus on the common good.

This is something that my colleague Daragh O Brien and I discuss at length in our book on Ethical Data and Information Management. (A short eLearning course based on the book is also available here.) It’s a challenge globally. (Interestingly, GDPR actually requires a stakeholder theory approach, at least where processing of personal data is concerned. Organisations processing the data of people in the EU must consider the rights and interests of data subjects as stakeholders in a risk-based approach to data processing.)

A lot of the issues raised are what we in Castlebridge call “wicked problems”: thorny, difficult issues which do not have simple solutions. Not only that, but often what appears to be a simple, obvious solution is potentially dangerously wrong. Haugen’s suggestion that the Irish government provide a hotline for tech workers to promote transparency might be one of these. Who answers the phone? The Gardaí? What powers would they have? Who would have access to information from the hotline? The potential issues spiral out, and while there may be an interesting idea here for future legislation, expecting the regulator to have already acted this way within the existing legislative context is unrealistic.

Haugen repeated a call for “mandatory transparency” multiple times, saying “We got where we are today because Facebook is not transparent . . . because Facebook has only had to report profit and loss, that’s what its priorities are.” However, she gave only a few examples of reporting data and transparency. So what exactly is the proposal on transparency? What would be disclosed or measured? And what would happen then?

She stated that the systems of platforms, not content, should be the focus of regulation for online safety. This is something I’d very much like to see expanded, because it is a very important question. Right now, I’m simply asking, “how?”

Certainly, transparent key performance metrics are essential. What gets measured gets done; what we choose to measure influences behaviour. There are important issues here, but we need clearly defined metrics and definitions.

What metrics do we need for meaningful decisions?

One thing that the fields of data and regulation have in common is the need for clearly defined, enforceable definitions and metrics. For data analysis, if you don’t have a definition for what an “apple” is, you can’t tell if you are comparing apples to apples, or apples to pineapples (or potatoes, because of your machine translation of pommes de terre). For regulation, you need clearly defined actionable metrics, and clear definitions for what is being regulated to begin with. What is an online harm?

There need to be clear definitions so that we can compare apples with apples, and have a meaningful policy debate instead of relying on soundbites.

And we also need clearly defined metrics for what counts as regulatory “success”. This is an area where Haugen was clearly out of her depth. She made broad statements about the Irish supervisory authority, with soundbites such as “we need to learn lessons from criticism of the DPC” and claims that Ireland has “stepped back from regulating” the GDPR, without substance.

Soundbites vs. Substance — Haugen on the DPC

Most interestingly, once you leave the soundbites behind, her substantive comments often support things the DPC does, or agree with things the DPC has focused on strategically. (Perhaps inadvertently, as she has openly admitted she doesn’t understand how European regulatory agencies work. She also called the Data Protection Commissioner a “government official”, which the DPC, as an independent regulatory agency, most emphatically is not, nor is the Commissioner herself. This lack of understanding of the role and independence of the Commission, which as an independent regulator has taken enforcement action against the State, may explain her broad, unspecific comments about “conflict of interest”.)

I think we can all agree that the DPC has been facing massive challenges and could improve some of the ways it handles these challenges. The DPC has been open about some of these challenges itself. But what standards or metrics are we actually using to objectively discuss the DPC’s successes or failures to regulate effectively? Haugen’s quoted statistics on closure of cases by the DPC (2%) were incorrect and have been corrected previously.

But then, we get to more substantive comments…

Haugen’s focus on class action rather than individual complaints isn’t particularly helpful for our jurisdiction . . . the only place in Irish law where we have that concept is actually data protection, thanks to GDPR. Haugen stated that allowing individual complaints to be taken to a regulator for online safety or harms, instead of restricting complaints to class actions, would bog a regulator down in so many individual complaints that it could not address systemic issues. This implicitly supports the Data Protection Commission’s strategic focus on own-volition investigations of systemic issues.

However, she does not appear to have recognised that EU data protection legislation is focused on the fundamental human rights of individuals, and therefore requires the regulator to handle individual complaints. Attention to these is not a failure of the regulator but a necessary part of its function. It does suggest, however, that for the supervisory authority to regulate effectively, it needs sufficient resources to address both systemic issues and individual complaints. The judicial review of the DPC’s investigation of Facebook’s cross-border transfers has clarified procedurally that the DPC can take whatever approach to an investigation the Commissioner determines appropriate, so own-volition inquiries into areas of concern are now easier to initiate. Due process has been slow, but this clarification should be welcomed by those who want the DPC to focus on systemic issues.

Haugen was also very positive about a focus on ongoing risk assessments, and pointed out that impact assessments should include stakeholder representation, not just be internal assessments. This is another area where the DPC has focused strategic action. The Data Protection Commission has made it clear that Data Protection Impact Assessments should include stakeholder/data subject representation, and it has issued enforcement notices requiring public bodies to engage with NGOs in certain sectors.

In fact, one of the DPC’s notable enforcement actions against Facebook was arriving at its offices in person and requiring documentation (such as… a Data Protection Impact Assessment) when Facebook announced, on short notice, that it was launching a dating app. This forced significant changes before Facebook eventually launched several months later than originally announced.

Haugen touched upon the fact that funding the regulator is really important. Funding is key, and I expect the DPC would enthusiastically agree with her here. If the regulator isn’t resourced properly, then even if it handles everything perfectly, it will be hamstrung in its ability to regulate effectively.

Looking forward to new regulation as a solution?

So, we’ve established that the functioning of EU regulatory agencies is not Haugen’s area of expertise. That makes it interesting to me how optimistic she is about the proposed EU Digital Services Act as a solution, calling for the Irish Government to legislate in harmony with it.

The European Digital Services Act is at proposal stage; it had its first reading just last month. When it eventually passes it will be a Regulation, which means it will apply directly here, just as GDPR does. It won’t need enabling legislation, and it will supersede any conflicting national legislation. So her emphasis on Ireland needing to legislate in concert with it is unneeded and misplaced. The subsidiarity principle applies. Of course, there is nothing stopping the government from passing ADDITIONAL safeguards.

My concern is the timeline here. One of the things that has hampered effective regulation of the big tech companies since GDPR came in is that many of the issues relating to tracking and AdTech are more specific to ePrivacy. And unfortunately, the draft ePrivacy Regulation that was supposed to replace the old directive and give regulators more teeth in 2018 has been delayed and delayed, and is still a draft six years later thanks to the delaying tactics of lobbyists.

We are all frustrated by that, but I suspect it is most frustrating for the regulator trying to address complaints regarding cookies, trackers, and AdTech. The DPC is still working under SI 336 of 2011 (which transposes the ePrivacy Directive), but its powers there are limited. It can’t simply investigate a complaint and deliver a fine; instead there’s a roundabout process of demonstrating non-compliance, issuing an enforcement notice, and then prosecuting the controller if it doesn’t comply with the notice. In the meantime, commentators are looking to the volume and level of fines as a metric for success!

It’s very important, and I think both Haugen and the DPC would agree here, that the Irish government implement proper supports for effective independent regulation, both when it comes to data protection and when it comes to online harms and safety. There should also be clarity and transparency regarding the scope of regulation and the powers of the regulator, and meaningful metrics for what constitutes success or effective regulation.

Dr Katherine O'Keefe

Dr Katherine O'Keefe is the Director of Education in Castlebridge. Katherine oversees the development of our various data literacy and data education training products and services. Katherine also works with clients as a consultant and as a Data Protection Officer. She has a PhD in Anglo-Irish Literature, and also holds CIPP/E qualifications from the IAPP and a CDMP from DAMA International.