Instagram and the DPC Fine

By Daragh O Brien
September 6, 2022
14 min read
children's data · data protection · data protection by design

News has broken of the Data Protection Commission’s latest enforcement action against a Big Tech firm. Today (5th September 2022) the DPC announced a fine of €405 million for failures on the part of Instagram to properly protect the personal data of children.

While the decision has not been published (I suspect Meta are being given some time to identify any commercially sensitive information that might be in the written decision), a review of the timeline for the investigation might reveal hints as to the key issues at stake and the implications for other Data Controllers.

What has happened?

Back in 2020 the DPC received reports, based on an analysis of Instagram profiles worldwide, which identified that approximately 60 million children using the platform had had the option to change their profiles to business accounts.

The problem here is that business accounts required the publication of email addresses and phone numbers of the users.

So… the email addresses and phone numbers of approximately 60 million children were being published by Instagram. In addition, that data was also encoded into the HTML of pages presented by Instagram, which meant it could be scraped by malicious processes (bots) seeking to gather personal data.

(And let me remind everyone… this is data of children).

The DPC commenced an own volition formal inquiry in September 2020.

Yesterday it was revealed that the DPC has issued a decision to Meta/Instagram that includes a fine of €405 million. To put this in context, in 2020 Facebook had a paltry corporation tax liability in Ireland of €266.3 million, and Meta had booked a provision of €1.02 billion to resolve “regulatory matters” across all its platforms (Facebook, Instagram, WhatsApp).

The Process of the Decision

As this decision affects data subjects in other EU member states, it needed to go through the EDPB’s cooperation and consistency mechanism (Article 60). The DPC sent a draft decision to the EDPB in December 2021. The Article 65 dispute resolution process was invoked in April/May 2022 after six regulators declined to back the proposed penalties, and a binding decision was adopted by the EDPB in July 2022.

It is clear from the EDPB note on the binding decision that there were a number of areas in scope for the decision. These included:

  • Article 5(1)(a) Transparency and Article 5(1)(c) Adequacy & Necessity
  • Article 12(1) Transparency and Communication. It is notable in the context of the complaints that this Article makes specific reference to the appropriateness of language when communicating information to a child.
  • Article 13 Transparency – Provision of information about processing
  • Article 24 Responsibilities of the Data Controller regarding Organisational and Technical Controls
  • Article 25 Data Protection by Design and by Default
  • Article 35 Data Protection Impact Assessments

The final decision will be published on the DPC and EDPB websites in due course.

It is worth noting that during the course of the investigation process Meta/Instagram took steps to address the issues that were raised in the investigation. So… remedial actions were taken. But that appears to have been insufficient to avert a fine. We do not know yet what other remedial actions the DPC decision might require Meta/Instagram to take. It’s also worth noting that Meta/Instagram have said they will appeal this fine, just as they are appealing the previous WhatsApp decision and the associated fine of €225 million.

Key Takeaways

Without the final decision to look at, it is difficult to describe key takeaways for data controllers at this time. However, looking at the trigger for the investigation and the topic areas which were in scope for the decision, we can surmise the following:

  1. Children’s data needs to be protected with extreme care, and infringements that put children’s data or rights at risk will carry a premium in the calculation of penalties.
  2. Data Protection by Design/Default is the key to effective transparency and assessment of risks. As children are, by definition, vulnerable data subjects that means DPIAs are required. And for services that might be used by children (or indeed other users) even if they are not the primary target user group, consideration needs to be given to how child users/vulnerable users are protected.
  3. Internal controls, and evidence of their operation, are not just for your day-to-day activities. They are an essential evidence base demonstrating both the operation of controls and the mitigating factors that can be considered when calculating fines. So… PRIVACY ENGINEERING IS AN ESSENTIAL DISCIPLINE!
  4. Doing Data Protection by Design means never having to say you’re sorry. Or at least, it means you can demonstrate how you did everything you could to avoid having to say sorry.

What’s Next?

With fines from the DPC now running almost to a bus timetable (the service is still slower than many would like, but it is becoming more predictable), and with a clear message now being sent that the DPC will levy large fines where warranted, Controllers need to focus on fundamentals: ensure good data governance structures for personal data, pay attention to what constitutes personal data, and be clear about their transparency obligations.

They also need to ensure that, where they offer services that might be accessed or used by children, they assess the risk of harm arising from children inadvertently having their personal data disclosed, and that appropriate preventative safeguards are in place.
