The European Commission announced that by July, all major online platforms, excluding Elon Musk's X, will see their voluntary commitments regarding disinformation formalised under the Digital Services Act (DSA). By integrating the code of practice on disinformation into the DSA, the EU aims to establish a benchmark for assessing compliance with the law.
Which companies are involved in combating online disinformation?
The code, which dates back to 2022, was initiated and signed by 42 companies, including Google, Meta, Microsoft, and TikTok. It outlines a wide array of voluntary commitments and measures designed to combat online disinformation, emphasising transparency in political advertising and collaboration during elections.
Formalising these commitments will enable the Commission to evaluate more effectively whether these companies adhere to the DSA. However, a senior EU official clarified that signing the code does not confer a “presumption of innocence.”
Originally, the Commission aimed to complete this work by January. “For platforms, it should lead to more meaningful engagement; it should not be a tick-the-box exercise,” stated Paul Gordon, the assistant director at the Irish digital services coordinator Coimisiún na Meán, which oversees DSA compliance for various online platforms.
Since the DSA came into effect in August 2023, the Commission has launched multiple investigations into online platforms, including X, TikTok, and Meta’s Facebook and Instagram. Last month, Meta revealed it would discontinue its fact-checking efforts in the US, replacing them with a community notes system similar to X’s, claiming the move would return the company to the “roots of free expression.”
What happens if a company opts out of the code?
Despite its recent policy shifts, Meta remains committed to the code, according to the EU official. However, the Commission cannot compel any company to remain part of the initiative should it choose to exit. Notably, X abandoned the code after Musk acquired the company in 2022.
Last month, the Commission had already formalised initiatives by Big Tech companies aimed at countering illegal hate speech online by incorporating their industry commitments into the DSA. The Code of Conduct on countering illegal hate speech online, first drafted in 2016, has been signed by platforms such as Facebook, Instagram, LinkedIn, Snapchat, TikTok, X, and YouTube.
What led to the Commission’s review of X?
In 2023, the European Commission began a thorough examination of X due to various controversies surrounding its content moderation, advertising transparency, and potential algorithmic manipulation for political purposes. In July 2024, the Commission identified several breaches of the DSA by X, including its failure to provide access to essential data for research, non-compliance with advertising transparency regulations, and misuse of its verification system, which enabled fraudsters to impersonate others.
Consequently, a deeper investigation into X has been launched, particularly focusing on its potential role in amplifying far-right political messaging. In January 2025, Elon Musk’s livestream with Alice Weidel, leader of Germany’s nationalist AfD party, raised concerns about possible algorithmic manipulation aimed at favouring specific political parties ahead of Germany’s parliamentary elections in February 2025.
This situation has prompted the Commission to implement additional regulatory measures, requiring X to furnish detailed information about recent algorithmic changes and to retain pertinent data regarding its content-serving practices. The findings of this investigation could result in substantial fines and stricter oversight of the platform’s operations within the EU.