Irish Regulators Fine Instagram €405 Million For Violating Children’s Data Privacy

The Irish watchdog has fined Instagram owner Meta a record €405 million ($403.8 million) for allowing children to set up accounts that publicized their emails and phone numbers.

Ireland’s Data Protection Commission (DPC) undertook a two-year investigation into Instagram’s practices, examining claims of potential violations of the European Union’s General Data Protection Regulation (GDPR). The DPC regulates Meta on behalf of the EU.

Personal Data of Minors Publicized

The investigation covered complaints about Instagram’s default settings, which made accounts, including those of users under 18, public and visible to everyone, as well as the specifics of business accounts, which publicized the personal information of minors.

The platform allowed users between 13 and 17 to create business profiles, which exposed their phone numbers and email addresses. “We adopted our final decision last Friday and it does contain a fine of €405m,” the DPC commented.

Instagram responded to the decision by arguing that it had already updated its settings and that the fine related to practices it had since abandoned. The company said it had set teenagers’ accounts to private by default since July of the previous year, and it plans to appeal.

Meta Disputes the Fine and Plans to Appeal

In a press release, the firm voiced its displeasure with the fine, maintaining that it had fully cooperated with the DPC throughout the inquiry and that the penalty was unjust.

Instagram told BBC News that it would continue to carefully review the rest of the decision. This is the third fine the regulator has issued to Meta, which also owns Facebook and WhatsApp, and the largest the DPC has ever imposed for a breach of the GDPR.

Last year, the DPC fined WhatsApp €225 million ($224 million) for violating privacy laws, and issued another fine of €17 million ($16.9 million) to Facebook.

Meta Receives More Than a Million Appeals Over Removed Posts

Meta’s system for appealing its decisions to remove content from Facebook and Instagram received roughly 1.1 million cases in its first year. The disputed posts, most of which originated in the US, Canada, or Europe, had largely been removed for violence, hate speech, or bullying.

Of the 20 cases about which the Oversight Board published decisions, it ruled against Meta 14 times.