Meta has not implemented necessary safeguards to keep children under 13 off Instagram and Facebook, in violation of a European Union online safety law, officials said Wednesday.
Meta does not have an adequate system to identify and remove the accounts of children who violate the social media giant’s age limits, the European Commission, the European Union’s executive branch, said in preliminary findings. Without changes, Meta could face fines and other sanctions.
European regulators are cracking down on social media companies over child safety concerns. Snap and TikTok have also been targeted by regulators in Brussels, while the governments of Spain, France and Denmark are among those considering new rules to prevent young people from using social media.
Regulators said Meta appears to be violating the Digital Services Act, a law passed in 2022 to force social media companies to police their platforms more aggressively. The company was found to lack effective controls to verify the accuracy of a person’s self-declared date of birth when setting up an account, making it easier to circumvent rules intended to keep children under 13 off social media sites.
Regulators said Meta’s tool for reporting minors is “difficult to use and ineffective,” with up to seven steps required just to access the necessary form. After a minor is reported for being under 13, the company often does not follow up and the user can continue using the service without any review, regulators said.
Across the European Union, about 10 to 12 percent of children under 13 access Instagram and Facebook, according to evidence cited by regulators.
“Instagram and Facebook are doing very little to prevent children under this age from accessing their services,” Henna Virkkunen, the commission’s executive vice president for technological sovereignty, security and democracy, said in a statement. “Terms and conditions should not be mere written statements, but rather the basis for concrete actions to protect users, including children.”
The European Union, as well as several individual countries in the 27-nation bloc, is exploring new online age verification tools to prevent young people from accessing certain content.
Meta said it disagreed with the commission’s conclusions and called age verification an “industry-wide challenge.”
“We are clear that Instagram and Facebook are intended for people 13 years of age and older, and we have measures to detect and delete the accounts of anyone under that age,” the company said in a statement. “We continue to invest in technologies to find and remove underage users and will have more to share next week on additional measures that will be implemented soon.”
For more than a decade, Europe has been the world’s toughest regulator of the tech industry on issues of privacy, anti-competitive business practices and illicit online content. Authorities have pressed ahead with investigations of American companies even as the Trump administration has threatened retaliation.
The European Union is also investigating Meta over other issues, including whether Facebook and Instagram are addictive by design, as well as a separate case examining their recommendation systems.
In the United States, Meta and other social media companies also face increasing scrutiny over child safety. In March, a California jury found Meta and YouTube liable for harming a young user’s mental health through addictive design and other features.
The European investigation into Meta’s age verification tools began in 2024. With the preliminary findings issued on Wednesday, the company now has an opportunity to respond to regulators. A final decision on possible sanctions could take more than a year.
The commission can impose a fine of up to 6 percent of Meta’s worldwide revenue, although a fine that large would be extremely rare. The two parties can also reach an agreement to resolve the case.
Jeanna Smialek contributed reporting from The Hague, Netherlands.