Clifford Chance | Talking Tech

Instagram hit with historic GDPR fine: EU privacy watchdog urges companies to "leave them kids alone"

Big Tech | Social Media | Data Privacy | 21 September 2022

On 2 September 2022, the Irish Data Protection Commission (the DPC) issued a 405 million euro fine against Instagram for violating children's privacy under the EU General Data Protection Regulation (GDPR). In addition to being the highest fine ever handed down by the Irish regulator and the second-largest fine issued under the GDPR, this decision is undoubtedly part of a regulatory trend in the EU over the last few years regarding the processing of children's data, which can no longer be ignored.

This fine follows a two-year-long investigation by the DPC and is the third fine levied against a Meta-owned company, after a 225 million euro sanction against WhatsApp announced in September last year and a 17 million euro sanction against Facebook in March.

Protection of Children's Data: An Increasing Scrutiny

The decision, which was finally adopted on 2 September and made public on 15 September 2022, comes amid increasing scrutiny of the collection and use of children's data by major tech stakeholders:

  • Back in 2019, TikTok had already been sanctioned by the U.S. Federal Trade Commission (FTC) for illegally collecting the names, email addresses and locations of children under the age of 13. That same year, Google, which was also on the FTC's radar, was fined 170 million dollars by the regulator for alleged violations of children's privacy on its YouTube platform.
  • More recently, in 2021, the Dutch supervisory authority imposed a fine of 750,000 euros on TikTok for non-compliance with the transparency and information obligations under Article 12 GDPR.
  • In mid-September 2022, the Irish DPC submitted a draft decision in a large-scale inquiry into TikTok – commenced in September 2021 – to the other EU supervisory authorities. According to the DPC's press release, the inquiry focuses on the processing of children users' personal data in the context of the settings of the TikTok platform – in particular, it covers the "public-by-default processing of such platform settings in relation to users under age 18 accounts and age verification measures for persons under 13", as well as the platform's compliance with applicable transparency requirements under the GDPR.

Instagram Decision: Behind the Scenes

The DPC's decision comes amid at least six other investigations into Meta-owned companies and follows the triggering of the GDPR's pan-European dispute resolution process to take into account other supervisory authorities' views on the penalty. For the record, the DPC supervises several 'Big Tech' companies, including Apple, Google and Meta, whose EU headquarters are based in Ireland.

Why was an investigation opened in the first place? Back in 2019, a data scientist published allegations about Instagram after finding that more than 60 million users under the age of 18 had been able to switch to business accounts, leading to their contact information – such as their email addresses and phone numbers – being displayed on their profiles (a feature that Instagram had removed from personal accounts).

In this context, the DPC decided to open two inquiries into Meta's processing of children's data on Instagram:

  • A first investigation assessing Meta's reliance on certain legal bases to process children's personal data and the measures taken to ensure adequate protections and restrictions on the Instagram platform;
  • A second investigation focusing, this time, on Instagram accounts run by children and on the settings applicable to such accounts, to determine the appropriateness of such settings in light of applicable data protection requirements. Under this second investigation, the Irish regulator particularly focused on the public disclosure of email addresses and phone numbers of children using the Instagram business account feature and a public-by-default setting for personal accounts of children on Instagram.

The DPC shared its draft decision with the other EU supervisory authorities for their views back in December 2021. As the supervisory authorities were unable to reach consensus on some of the objections raised (notably on the legal basis for the processing and the determination of the fine), a dispute resolution procedure was initiated, leading to the adoption by the European Data Protection Board (EDPB) of a binding decision on 28 July 2022.

The Irish regulator then had one month from notification of the EDPB decision to adopt its final decision.

Instagram Decision: Infringements and Findings of the Irish Regulator

In its Decision, the DPC considered that Meta had failed to comply with several of its obligations under the GDPR, in particular those resulting from articles 5 (principles), 6 (legal basis of the processing), 12 and 13 (transparency and information obligations), 24 (responsibility of the controller), 25 (data protection by design and by default) and 35 (obligation to carry out a data protection impact assessment):

  • No valid legal basis: According to the Decision, Meta could not rely on either the performance of a contract or its legitimate interests to process children's contact information.
  • Failure to comply with transparency and information principles: The DPC found that the information provided by Meta in connection with (i) the public-by-default processing and (ii) the processing of children's contact information did not comply with Meta's obligations under articles 5, 12 and 13 GDPR and that Meta should, in particular, have:

1)  Provided child users with sufficient information on the purposes of the public-by-default processing (i.e. operating a social media network which, by default, allows the social media posts of child users to be seen by anyone), in a clear and transparent form;

2)  Taken measures to provide child users with sufficient information on the purposes of the contact information processing and on the categories of recipients, using clear and plain language;

3)  Adequately notified existing business account users of the removal of the requirement to publish a phone number and/or an email address in the context of a business account as of 4 September 2019, in compliance with fairness and transparency principles under article 5.1.a) GDPR.

  • Failure to comply with the data minimisation and data protection by design and by default principles: The DPC notably found that:

1) Meta had (i) published children's contact information in the HTML source code of certain Instagram profile webpages and (ii) implemented a default Instagram account setting for child users which allowed anyone (on or off Instagram) to view social media content posted by these users. Thus, the DPC found that the processing activities carried out by Meta had infringed the data minimisation principle under article 5.1.c) GDPR.

2)  Meta had failed to implement several measures to ensure that, by default, the contact information and social media content of child users were not made accessible to an indefinite number of natural persons without the individual's intervention. Thus, the DPC found that the processing activities carried out by Meta had infringed the principle of data protection by default under article 25.2 GDPR.

3) Meta had failed to implement appropriate technical and organisational measures to (i) implement data protection principles effectively and (ii) integrate the necessary safeguards to meet GDPR requirements and protect child users' rights. Thus, the DPC found that the processing activities carried out by Meta had infringed the principle of data protection by design under article 25.1 GDPR.

  • Failure to carry out a data protection impact assessment (DPIA): The DPC considered that, given the nature, scope, context and purposes of the processing activities at stake, such processing activities were likely to result in a high risk to the rights and freedoms of child users, requiring Meta to conduct a DPIA prior to such activities taking place. Since no DPIA had been conducted by Meta in this respect, the DPC found that it had infringed its obligations under article 35 GDPR.

As a consequence of these infringements, the Irish DPC (i) imposed a record fine of 405 million euros and (ii) ordered Meta to bring its processing activities into compliance within three (3) months of the date of notification of the Decision. Within that period, Meta will also be required to submit a report to the DPC detailing the actions taken to achieve such compliance.

Protection of Children's Data: More to Come?

This increased scrutiny of tech stakeholders has undeniably fuelled a global willingness to tighten regulation in the area of children's privacy. In the EU, the Commission announced the adoption of a European code of conduct on age-appropriate design by 2024 as part of the "European strategy for a better internet for kids" adopted on 11 May 2022. In France, the CNIL published a set of recommendations in August 2021 to enhance children's protection online. In the UK, changes to protect children's privacy had already been introduced in September 2020, with the adoption by the ICO of its own age-appropriate design code of practice.

Beyond privacy, there is a strong need for EU harmonisation of the rules applicable to children's activities on the Internet. For instance, the rules governing children's capacity to act online and the enforceability of agreements they conclude via the Internet (e.g., subscription to online services, access to e-commerce, gaming) differ from one EU Member State to another.

An EU-wide harmonisation of such rules would benefit all stakeholders: children, who would be granted enhanced rights, and businesses, which would be better placed to comply with a harmonised – rather than a fragmented – legal framework.