Facebook and Instagram are under formal investigation in the European Union over child protection concerns, the Commission announced Thursday. The proceedings follow a raft of requests for information to parent entity Meta since the bloc’s online governance regime, the Digital Services Act (DSA), started applying last August.
The development could be significant as the formal proceedings unlock additional investigatory powers for EU enforcers, such as the ability to conduct office inspections or apply interim measures. Penalties for any confirmed breaches of the DSA could reach up to 6% of Meta’s global annual turnover.
Meta’s two social networks are designated as very large online platforms (VLOPs) under the DSA. This means the company faces an extra set of rules — overseen by the EU directly — requiring it to assess and mitigate systemic risks on Facebook and Instagram, including in areas like minors’ mental health.
In a briefing with journalists, senior Commission officials said they suspect Meta of failing to properly assess and mitigate risks affecting children.
They particularly highlighted concerns about addictive design on its social networks, and what they referred to as a “rabbit hole effect,” where a minor watching one video may be pushed to view more similar content as a result of the platforms’ algorithmic content recommendation engines.
Commission officials cited depression-related content and content promoting an unhealthy body image as examples of material that could harm minors’ mental health.
They are also concerned that the age assurance methods Meta uses may be too easy for kids to circumvent.
“One of the underlying questions of all of these grievances is how can we be sure who accesses the service and how effective are the age gates — particularly for avoiding that underage users access the service,” said a senior Commission official briefing press today on background. “This is part of our investigation now to check the effectiveness of the measures that Meta has put in place in this regard as well.”
In all, the EU suspects Meta of infringing DSA Articles 28, 34, and 35. The Commission will now carry out an in-depth investigation of the two platforms’ approach to child protection.
Meta has been contacted for a response. Update: A company spokesperson emailed us this statement: “We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them. This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”
The EU opened a similar probe into addictive design concerns on the video-sharing social network TikTok last month.
The Commission has also already opened two DSA investigations into Meta’s social networks: last month it said it would investigate separate concerns related to Facebook’s and Instagram’s approach to election integrity.