The European Commission is investigating Facebook and Instagram parent company Meta as part of a disinformation probe after the US firm was suspected of breaching the EU’s content rules.
The move comes as the trading bloc ramps up efforts to fight disinformation ahead of EU parliament elections in June, with the commission accusing Meta’s content moderation efforts of being “insufficient”.
Implemented last year, the Digital Services Act (DSA) requires so-called ‘Big Tech’ firms to go further to counter potentially harmful and illegal content on their social media networks.
The news comes amidst reports of a foiled Russian attempt to influence the upcoming vote by paying EU politicians to promote Moscow-backed narratives, notably concerning the Ukraine invasion.
“We suspect that Meta’s moderation is insufficient, that it lacks transparency of advertisements and content moderation procedures,” EU digital chief, Margrethe Vestager said.
“So today, we have opened proceedings against Meta to assess their compliance with the Digital Services Act,” she said.
Meta has vigorously defended its risk mitigation process, despite the EU expressing concerns about the deprecation of its disinformation tracking tool CrowdTangle. A new ‘Content Library’ currently in development is set to replace it.
Meta now has five days to provide the commission with information about the remedial actions it is taking to address the concerns. Other platforms such as Amazon, Snapchat, TikTok and YouTube are also being monitored.
A Meta spokesperson said: “We have a well-established process for identifying and mitigating risks on our platforms. We look forward to continuing our cooperation with the European Commission and providing them with further details of this work.”