EU Finds Meta Falling Short on Child Safety as DSA Probe Intensifies

Key Takeaways
  • Preliminary DSA Breach Findings: The European Commission found that Meta may be in breach of the Digital Services Act for failing to effectively prevent children under 13 from accessing Instagram and Facebook.
  • Weak Age Verification Controls: Regulators identified that minors can easily bypass safeguards by entering false birth dates, with no effective verification mechanisms in place to confirm user age.
  • Ineffective Reporting and Enforcement: Reporting tools for underage users are difficult to use and often fail to trigger meaningful follow-up, allowing flagged accounts to remain active.
  • Flawed Risk Assessment: The Commission said Meta’s risk assessment is incomplete and contradicts broader EU evidence showing 10–12% of children under 13 are using the platforms, while also failing to account for known harms to younger users.
  • Potential Financial Penalties: If confirmed, the findings could lead to fines of up to 6% of Meta’s global annual turnover and additional penalty payments to enforce compliance.
Deep Dive

The European Commission has preliminarily concluded that Meta is failing to adequately prevent children under 13 from accessing Instagram and Facebook, raising fresh concerns about how one of the world’s largest platforms is enforcing its own age restrictions under the Digital Services Act.

The findings, published this week following an in-depth investigation, suggest that Meta’s existing safeguards are not only ineffective but also built on what regulators described as an incomplete and inconsistent assessment of risks to minors.

At the center of the Commission’s concerns is a familiar but unresolved issue: self-declared age verification. According to the preliminary findings, children under 13 can still bypass safeguards simply by entering a false birth date at sign-up, with no meaningful checks in place to verify that information.

Even when underage users are identified, enforcement appears uneven. The Commission pointed to reporting tools that require multiple steps to access and lack basic functionality, such as auto-filled user data. In many cases, reported accounts are not followed up on effectively, allowing underage users to continue using the platforms without further checks.

A Risk Assessment That Doesn’t Add Up

Regulators also took aim at Meta’s underlying risk assessment process, describing it as insufficient and at odds with broader evidence across the European Union. While Meta’s own analysis appears to downplay the scale of the issue, external data suggest that roughly 10–12 percent of children under 13 are accessing Instagram or Facebook.

The Commission further noted that readily available scientific evidence on the heightened vulnerability of younger children to online harms was not adequately reflected in Meta’s approach.

That gap, regulators argue, has real consequences. Without a more accurate understanding of how underage users interact with these platforms, mitigation efforts risk being both misdirected and ineffective.

These findings stop just short of a final ruling, but they lay out a clear expectation for what comes next. Meta is being urged to overhaul its risk assessment methodology and significantly strengthen its ability to prevent, detect, and remove underage users.

That includes implementing more robust age assurance mechanisms, such as age estimation and verification technologies, which the Commission has identified as appropriate tools under its 2025 guidelines on protecting minors online.

The Commission has also developed a blueprint for an EU-wide age verification app, intended to provide a privacy-preserving and user-friendly framework for compliance.

A Wider Investigation Still Unfolding

The case is part of broader formal proceedings launched in May 2024 examining Meta’s compliance with the Digital Services Act, particularly its obligations to safeguard minors and mitigate systemic risks.

That investigation extends beyond age verification. Regulators are also examining whether platform design features, including interface mechanics that may encourage prolonged engagement, could exploit the vulnerabilities of younger users and contribute to so-called “rabbit hole” effects.

Meta now has the opportunity to respond to the findings and propose remedial measures. If the Commission ultimately confirms its conclusions, the consequences could be significant. Under the DSA, non-compliance can trigger fines of up to 6 percent of a company’s global annual turnover, alongside potential periodic penalty payments to enforce corrective action.

As Executive Vice-President Henna Virkkunen put it, terms and conditions “should not be mere written statements, but rather the basis for concrete action to protect users, including children.”

The outcome of this case will likely shape how platforms approach age assurance, risk assessment, and child safety controls across the EU and could set a precedent for how regulators evaluate whether digital safeguards are truly working in practice.

The GRC Report is your premier destination for the latest in governance, risk, and compliance news. As your reliable source for comprehensive coverage, we ensure you stay informed and ready to navigate the dynamic landscape of GRC. Beyond being a news source, the GRC Report represents a thriving community of professionals who, like you, are dedicated to GRC excellence. Explore our insightful articles and breaking news, and actively participate in the conversation to enhance your GRC journey.