The European Union said on Tuesday that it suspects Meta’s social networking platforms, Facebook and Instagram, of breaking the bloc’s rules for larger platforms in relation to election integrity.
The European Commission (EC) has opened formal infringement proceedings to investigate Meta under the Digital Services Act (DSA), an online governance and content moderation framework. Penalties for confirmed breaches of the rules can include fines of up to 6% of a company’s global annual turnover.
The EU’s concerns here span several areas. First is Meta’s moderation of political ads, which the EC suspects is inadequate. Second, the EU is worried about Meta’s policies for moderating non-paid political content, which the bloc suspects are opaque and overly restrictive, in contrast to the DSA’s requirement that platforms’ policies deliver transparency and accountability. Finally, the Commission is looking at Meta’s policies that relate to enabling outsiders to monitor elections.
The proceeding also targets Meta’s processes for letting users flag illegal content, which the EC is concerned aren’t user friendly enough. Also under the lens is the company’s internal complaints-handling system for content moderation decisions, which the Commission suspects is ineffective.
“When Meta gets paid for displaying advertising, it doesn’t appear that they have put in place effective mechanism of content moderation,” said a Commission official briefing journalists on background on the factors that led the EC to open the bundle of investigations. “Including for advertisements that could be generated by a generative AI — for example, deep fakes — and these have been exploited or appear to have been exploited by malicious actors for foreign interference.”
The EU is drawing on some independent research (by AI Forensics), enabled by another DSA requirement that large platforms publish a searchable ad archive, which, the EC suggests, has shown Meta’s ad platform being exploited by Russian influence campaigns targeting elections via paid ads. The Commission also said it has found evidence of a lack of effective ad moderation, noting that Meta’s platforms are broadly exploited by scammers, and pointed to a surge in financial scam ads.
On organic (non-paid) political content, the EU said Meta seems to limit the visibility of political content for users by default but does not appear to provide sufficient explanation of how it identifies content as political, nor how moderation is done. The Commission also said it had found evidence to suggest Meta is shadowbanning (i.e., limiting the visibility or reach of) certain accounts with high volumes of political posting.
If confirmed, such actions would be a breach of the DSA, as the regulation puts a legal obligation on platforms to transparently communicate the policies they apply to their users.
On election monitoring, the EU is particularly concerned about Meta’s recent decision to shutter access to CrowdTangle, a tool researchers have previously been able to use for real-time election monitoring.
The EC has not opened an investigation on this yet, but has sent Meta an urgent formal request for information (RFI) about its decision to deprecate the research tool, giving the company five days to respond. Briefing journalists about the development, Commission officials suggested they could take more action in this area, such as opening a formal investigation, depending on Meta’s response.
The short deadline for the response clearly conveys a sense of urgency. Last year, soon after the EU took over oversight of larger platforms’ compliance with a subset of DSA transparency and risk mitigation rules, the Commission named election integrity as one of the priority areas for enforcement of the regulation.
In today’s briefing, Commission officials pointed to the upcoming European elections in June, questioning the timing of Meta’s decision to deprecate CrowdTangle. “Our concern — and this is also why we consider this to be a particular urgent issue — is that just a few weeks ahead of the European election, Meta has decided to deprecate this tool, which has allowed journalists … civil society actors and researchers in, for example, the 2020 U.S. elections, to monitor election-related risks.”
The Commission is worried that another tool Meta has said will replace CrowdTangle does not have equivalent or superior capabilities. Notably, the EU is concerned the tool will not let outsiders monitor election risks in real time. Officials also raised concerns about slow onboarding for Meta’s new tool.
“At this point, we’re requesting information from Meta on how they intend to remedy this lack of real-time election monitoring tool,” said one senior Commission official during the briefing. “We are also requesting some additional documents from them on the decision that has led them to deprecate CrowdTangle and their assessment on the capabilities of the new tool.”
When reached for comment about the Commission’s actions, a company spokesperson said in a statement: “We have a well-established process for identifying and mitigating risks on our platforms. We look forward to continuing our cooperation with the European Commission and providing them with further details of this work.”
These are the first formal DSA investigations Meta has faced, but not the first RFIs. Last year, the EU sent Meta a flurry of requests for information, including in relation to the Israel-Hamas war, election security and child safety, among others.
Given the variety of information requests concerning Meta’s platforms, the company could face additional DSA investigations as Commission enforcers work through its multiple submissions.