Meta CEO Mark Zuckerberg testifies before the Senate Judiciary Committee at the Dirksen Senate Office Building on January 31, 2024 in Washington, DC.
Alex Wong | Getty Images
Facebook parent company Meta on Thursday became the subject of a major European Union investigation into alleged violations of the bloc's strict online content law over risks to children's safety.
The European Commission, the European Union's executive body, said in a statement that it was investigating whether the social media giant's Facebook and Instagram platforms “may stimulate behavioral addiction in children, as well as create so-called rabbit hole effects.”
The Commission added that it was concerned about age verification processes on Meta's platforms, as well as privacy risks linked to the company's recommendation algorithms.
“We want young people to have safe, age-appropriate experiences online, and have spent a decade developing more than 50 tools and policies designed to protect them,” a Meta spokesperson told CNBC via email.
“This is a challenge for the entire industry, and we look forward to sharing details of our work with the European Commission.”
The Commission said its decision to launch the investigation follows a preliminary analysis of the risk assessment report that Meta submitted in September 2023.
Thierry Breton, the EU Commissioner for the Internal Market, said in a statement that the regulator is “not convinced [that Meta] has done enough to comply with the DSA obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms.”
The European Union said it would conduct an in-depth investigation into Meta's child protection measures “as a matter of priority.” The bloc can continue to collect evidence through information requests, interviews or inspections.
The Commission said that opening formal proceedings under the DSA empowers it to take further enforcement steps, including interim measures and non-compliance decisions. It added that it could also take into account commitments made by Meta to address its concerns.
Meta and fellow U.S. tech giants have increasingly found themselves in the spotlight of EU scrutiny since the introduction of the bloc's landmark Digital Services Act, or DSA, a groundbreaking law that seeks to tackle harmful content.
Under the DSA, companies can be fined up to 6% of their global annual revenue for violations. The EU has not yet issued fines to any of the tech giants under the new law.
In December 2023, the European Union opened infringement proceedings against X, the company formerly known as Twitter, over suspected failures to combat disinformation and content manipulation.
The Commission is separately investigating Meta over alleged DSA violations related to its handling of election disinformation. It launched that probe in April, saying it was concerned the company had not done enough to combat disinformation ahead of the upcoming European Parliament elections.
The European Union is not the only authority taking action against Meta over child safety concerns.
In the United States, the New Mexico attorney general sued the company, alleging that Facebook and Instagram enabled child sexual abuse, solicitation and trafficking.
A Meta spokesperson said at the time that the company was deploying “cutting-edge technology” and taking other preventive steps to root out predators.