Introduction
In a significant move underscoring its commitment to online safety, the European Union has accused Meta Platforms, the parent company of Facebook and Instagram, of failing to adequately prevent users under the age of 13 from accessing its platforms. The allegation comes amid heightened scrutiny of social media companies' responsibility for protecting children online, particularly under the Digital Services Act (DSA), which aims to establish a safer digital environment.
The Digital Services Act: A Framework for Accountability
The Digital Services Act entered into force in the European Union in 2022 and became fully applicable in early 2024, setting stringent obligations for technology companies in order to foster a safer online space. One of its critical provisions, Article 28, requires platforms accessible to minors to put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security for them. In practice, regulators read this as demanding effective age assurance, so that children below a platform's own minimum age (13 for Facebook and Instagram) cannot create accounts or use the platforms' features.
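For context, the baseline most platforms rely on today is a self-declared birth date checked once at signup. The Python sketch below is purely illustrative (none of the names correspond to Meta's actual code) and shows both how such a gate works and why regulators consider it weak: it trusts whatever date the user types in.

```python
from datetime import date

MINIMUM_AGE = 13  # the under-13 threshold discussed above (Meta's own stated minimum)

def is_old_enough(date_of_birth: date, today: date | None = None) -> bool:
    """Return True if the declared birth date meets the minimum age.

    This is the weakest form of age gating: it trusts the birth date the
    user enters, the kind of self-declaration the EU argues is
    insufficient on its own.
    """
    today = today or date.today()
    # Whole-year age, subtracting one if this year's birthday hasn't happened yet.
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    return age >= MINIMUM_AGE

# A declared birth date of 1 June 2015 is correctly rejected as of April 2026,
# but a child who simply types an earlier year passes unchallenged.
print(is_old_enough(date(2015, 6, 1), today=date(2026, 4, 29)))  # False
```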
EU's Findings on Meta's Compliance
On April 29, 2026, EU officials, led by Executive Vice President Henna Virkkunen, announced preliminary findings from an investigation opened in 2024. The investigation set out to evaluate Meta's compliance with the DSA, particularly regarding the protection of underage users. According to Virkkunen, the evidence suggests that Meta has not implemented sufficient measures to enforce its own rules barring children under 13.
Claims of Inadequate Measures
Virkkunen emphasized that the platforms are doing "very little" to uphold their own policies, raising serious concerns about the safety of younger users. Although Meta says it has systems in place to detect and remove underage users, the EU's preliminary findings suggest these measures fall well short. Weak age verification at signup and the continued presence of underage accounts on Facebook and Instagram point to a failure to meet the DSA's requirements.
Meta's Response to the Allegations
In response to the EU's preliminary findings, Meta has disputed the claims, asserting that it has established various detection and removal mechanisms designed to identify underage users. The company argues that it is committed to maintaining a safe environment for all its users and has invested significantly in technologies aimed at age verification and content moderation.
Meta's Defense Strategies
- Age Verification Technology: Meta has developed systems that use machine learning to analyze account signals and flag likely underage users (a simplified sketch of this kind of classifier follows this list).
- User Reporting Tools: The platforms encourage users to report suspected underage accounts, which are then reviewed by Meta’s moderation team.
- Content Moderation Teams: Meta employs teams that monitor content and user behavior to ensure compliance with community guidelines.
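To make the first bullet concrete, here is a minimal sketch of how a machine-learning classifier for underage-account detection could be built. Everything in it is an assumption: the features, the synthetic data, and the model choice are invented for illustration, since Meta has not disclosed how its actual detection systems work.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-account signals a platform might log (all invented here):
#   declared_age           -- the age the user claimed at signup
#   young_graph_fraction   -- share of connections that look like minors' accounts
#   daytime_activity_ratio -- activity concentrated during school hours
n = 2000
underage = rng.integers(0, 2, size=n)  # 1 = account actually belongs to a child

declared_age = rng.normal(loc=np.where(underage == 1, 15.0, 27.0), scale=4.0, size=n)
young_graph_fraction = rng.beta(np.where(underage == 1, 5.0, 1.0), 2.0, size=n)
daytime_activity_ratio = rng.beta(np.where(underage == 1, 4.0, 1.5), 3.0, size=n)
X = np.column_stack([declared_age, young_graph_fraction, daytime_activity_ratio])

# Train a simple classifier and report precision/recall on held-out accounts.
X_train, X_test, y_train, y_test = train_test_split(X, underage, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```

In a real system the hard part is not the model but the labels: obtaining ground truth about a user's actual age is exactly the problem regulators say remains unsolved.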
However, critics argue that these measures are not robust enough to effectively prevent children from accessing the platforms, highlighting a significant gap between Meta's stated policies and actual practices.
The Implications of Non-Compliance
The implications of the EU's findings could be far-reaching for Meta. If the final decision corroborates the preliminary findings, the company could face hefty fines of up to 6% of its global annual revenue. This potential penalty underscores the seriousness with which the EU is treating the issue of child safety online.
Financial Stakes for Meta
As one of the largest social media companies in the world, Meta reported global annual revenue of approximately $117 billion in 2022, and its revenue has grown substantially since. Even at the 2022 figure, a 6% fine would amount to roughly $7 billion, a financial blow that could affect the company's operations and its future investments in safety technologies.
The Broader Context of Child Safety Online
The concerns raised by the EU regarding Meta are part of a larger conversation about child safety on social media platforms. With increasing evidence linking social media use to negative mental health outcomes among children and adolescents, there is a growing demand for stricter regulations and accountability from technology companies.
Global Trends in Regulating Social Media
Other countries are also grappling with how to regulate social media use among minors. In the United States, the Children's Online Privacy Protection Act (COPPA) restricts the collection of personal data from children under 13 without verifiable parental consent, though enforcement remains a challenge; in the United Kingdom, the Online Safety Act 2023 imposes child-safety duties on platforms.
The Future of Online Safety Legislation
The EU's actions against Meta may serve as a catalyst for more stringent regulations globally. As lawmakers and regulators observe the outcomes of the EU's investigation, there may be increased pressure on other tech giants to enhance their user safety measures, particularly regarding children.
Potential Legislative Developments
- Stricter Age Verification Requirements: Future legislation may mandate more reliable age verification processes across all platforms.
- Enhanced Parental Controls: Governments may require social media companies to provide more robust parental control features to monitor and limit children's usage.
- Mandatory Reporting of Underage Users: Companies could be compelled to report incidents of underage users accessing their platforms to regulatory authorities.
Such measures could help create a safer online environment for children, but they also pose challenges for tech companies that must balance user privacy with safety compliance.
Conclusion
The European Union's investigation into Meta Platforms highlights a critical juncture in the ongoing conversation about child safety on social media. As Meta prepares to respond to the EU's allegations, the outcome of this case may set important precedents for how technology companies regulate access for underage users. The stakes are high, not only for Meta but for the entire tech industry as it navigates the complexities of online safety and youth protection.
As parents and guardians become increasingly concerned about their children's online experiences, the pressure will mount for social media platforms to prioritize the safety of younger users. The actions of the EU against Meta could serve as a wake-up call for the industry, prompting a reevaluation of policies and practices aimed at safeguarding children in the digital age.