The European Union enacted a slew of new rules earlier this year to fight the kind of hate speech and disinformation that officials say is rife on Elon Musk’s X platform, but the bloc isn’t able to enforce them yet.
The EU’s main tool to clamp down on objectionable content, the Digital Services Act, went into effect on Aug. 25, but the bloc won’t be able to impose fines or other penalties on social media platforms for violations until next year at the earliest.
That means the EU is still largely a spectator in Musk’s latest drama. The billionaire sparked fresh outrage this week after he affirmed an antisemitic post accusing Jewish communities of hating white people.
Read More: The Impact and Cost of Musk’s Endorsement of Antisemitism on X
Up to now, the EU has been focused on what it says is illegal and even “terrorist” content on X, particularly related to the conflict between Israel and Hamas. European Commission Vice President Vera Jourova has called it the biggest outlet of Russian disinformation.
On Friday, the commission said it’s halting all advertising on the platform, but will continue using X to post news items and other information.
“It has nothing to do with our presence on X per se,” Johannes Bahrke, a commission spokesman, said Friday. “This is really about advertisement, campaigns and ad spending.”
X noted that the commission has advertised from several accounts, including @EU_Social, @EUClimateAction and @EUHomeAffairs.
“The European Commission has only advertised about $5,000 so far this year, but is still organically posting across all its X handles,” Joe Benarroch, head of business operations for X, said in a message to Bloomberg. “Plus, the European Investment Bank told us they will still be actively advertising on X.”
The EIB denied offering any such assurances to X. A spokesperson for the bank said it’s coordinating with other EU institutions and has no further advertising campaigns scheduled on X.
Read more: Fallout From Musk’s Endorsement of Antisemitic Post Spreads
Europe introduced sweeping legislation to battle the spread of harmful online content as technology has helped disinformation proliferate. The act gives the EU’s executive arm unprecedented power to determine whether the world’s biggest social media sites are doing enough to police their platforms, but those powers to fine or ban violators haven’t been put to the test.
Officials have said that proper enforcement of new regulations takes time, in part because any actions or penalties need to stand up to legal challenges. The commission has tried to speed up the process by encouraging member states to quickly appoint digital coordinators, but it’s unclear how fast governments will move.
Last month, EU regulators demanded answers from X, formerly known as Twitter, about illegal posts. The company sent a response, which the commission is evaluating.
Read more: Musk’s X Is Biggest Outlet of Russia Disinformation, EU Says
X faces possible fines if it provides the commission with “incorrect, incomplete or misleading information” in response, according to a news release.
More broadly, social media companies are required to hire more content moderators and use risk mitigation methods to decrease the spread of harmful content. Companies that fail to comply could eventually face fines as high as 6% of annual revenue or even be banned from the bloc if they repeatedly break the rules.
But the EU can’t fine platforms for failing to comply until February at the earliest. Officials will be able to launch legal proceedings earlier, but they need time to build a case, and a new board of digital coordinators must be in place before fines can be levied. Even then, it will be up to individual member states to tell the platform to take down illegal content.
Read more: Europe’s Two-Track Approach to Policing Big Tech: QuickTake
The commission can push the company to do more to tackle disinformation and harmful content, but this also won’t happen anytime soon. Companies sent their risk assessments to the EU’s executive arm in August, and those will be vetted next year by independent auditors.
Musk has said publicly that he intends to comply with EU regulations, but he’s taken several steps that run counter to the bloc’s prescriptions. X has laid off swaths of content moderators and cut back on election integrity efforts under his tenure. The company also pulled out of an EU voluntary code of conduct, which aims to set industry standards.
--With assistance from Lyubov Pronina, Katharina Rosskopf and Ed Ludlow.