EU Initiates Formal DSA Proceedings Against X Over Content Moderation and Transparency
The European Commission has initiated formal proceedings against X, formerly Twitter, under the Digital Services Act, investigating alleged failures in content moderation, disinformation countermeasures, and transparency. This marks the first such action against a major online platform under the new EU regulation.
The Decision
The European Commission (EC) has opened formal proceedings against X under the Digital Services Act (DSA), the first such proceeding launched against a Very Large Online Platform (VLOP) since the DSA's obligations became fully enforceable. The EC's investigation focuses on X's compliance with its DSA obligations, specifically concerning illegal content management, disinformation countermeasures, content moderation transparency, and data access for researchers. The EC clarified that these proceedings pertain solely to X and not to xAI.
How It Works
The Digital Services Act requires VLOPs to implement robust measures against systemic risks, including the spread of disinformation and illegal content. Platforms must operate transparent content moderation systems, maintain clear terms of service, and provide researchers with data access. If the proceedings confirm violations, X could face fines of up to 6% of its global annual turnover. The investigation will scrutinize X's resource allocation for content moderation, particularly around elections and civic discourse. It will also examine the efficacy of X's "Community Notes" feature and potentially deceptive practices in its user interface design, such as the implications of paid verification checkmarks.
Winners & Losers
Users and researchers seeking greater transparency and a safer online environment stand to benefit from stricter enforcement, and the European Union reinforces its regulatory authority in the digital space. X faces potentially substantial financial penalties and increased operational burdens to ensure compliance, and its content moderation practices are now under significant regulatory scrutiny. Other VLOPs may face heightened pressure to proactively align with DSA requirements to avoid similar investigations.
Strategic Implications
This action sets a precedent for DSA enforcement against major online platforms and underscores the EU's role as a global digital regulator. Given the scale of the potential fines, it could influence operational standards for large platforms worldwide. The proceedings also highlight the ongoing challenge platforms face in balancing free speech principles with content moderation responsibilities under evolving regulatory frameworks.
What to Watch
The outcome of the European Commission's investigation, and any resulting fines or mandated changes for X, will be closely monitored, as will X's response and implementation of any required adjustments. Future DSA enforcement actions against other Very Large Online Platforms, such as TikTok, Meta, and Google, remain a key area of observation, and the broader impact on global platform design and content governance models will continue to unfold.