The effects of Instagram and Facebook, both owned by Meta, on children’s health have recently been a major topic of debate. The European Union (EU) has now launched a formal investigation into Meta over concerns that these platforms harm children’s health, contribute to addiction, and that the company has not made sufficient efforts to protect the mental and physical health of children using its social media platforms.
Investigation by the EU
The investigation announced by the European Commission will assess whether Meta complies with its obligations under the Digital Services Act (DSA). The Commission has also raised concerns that the user interfaces and algorithms of Facebook and Instagram could foster addictive behavior in children. Thierry Breton, the European Commissioner for Internal Market, wrote on his X account: “We are not convinced that Meta has made sufficient efforts to comply with DSA obligations to reduce the risks of negative impacts on the physical and mental health of young Europeans on the Facebook and Instagram platforms.”
The investigation will also examine whether Meta has taken adequate measures to prevent minors from accessing inappropriate content, and whether its age verification tools are effective. Meta’s recommendation systems and default privacy settings for children fall within the scope of the investigation as well.
As a next step, the Commission will gather additional evidence, and no date has been set for the conclusion of the investigation. In the meantime, the Commission may impose interim measures on Meta’s platforms while the proceedings are ongoing. If Meta is ultimately found to have violated DSA rules, the company could face fines of up to 6% of its global annual revenue.