Two Years After the Digital Services Act, Brussels Is Testing Its Power

Key Takeaways
  • The DSA Has Entered Its Enforcement Phase: Two years after becoming fully applicable, the regulation has moved decisively from policy design to active investigations and supervisory scrutiny.
  • System Design Is Now a Regulatory Target: Recommender systems, addictive interfaces, and algorithmic amplification are being treated as systemic risk factors, not just product features.
  • Architecture, Not Just Content, Is Under Review: Investigations into X, Shein, and TikTok demonstrate that compliance extends beyond content moderation into governance, transparency, and platform engineering.
  • Corrective Measures May Reshape Business Models: Beyond fines of up to 6% of global turnover, the Commission can require structural changes to how platforms operate.
  • Governance Maturity Is Becoming a Competitive Variable: The DSA embeds risk assessment, documentation, and board-level accountability into the digital operating model of large platforms.
Deep Dive

When the Digital Services Act became fully applicable two years ago, it was framed as a reset for the online economy. The rhetoric focused on safer digital spaces, stronger protections for fundamental rights, and curbing manipulative or harmful platform behavior.

Two years later, the story is no longer about aspiration. It is about execution.

Since February 2024, the DSA has applied to most platforms operating in the EU, excluding micro and small enterprises. It covers social media networks, online marketplaces, app stores, and travel and accommodation platforms. But the real shift is not in who is covered. It is in how accountability is being enforced.

Brussels has moved from drafting obligations to testing them.

The Compliance Era Is Over. The Supervision Era Has Begun.

The DSA created a shared enforcement model. National Digital Services Coordinators oversee most platforms. The European Commission directly supervises the largest players, those designated as very large online platforms or search engines because they reach more than 45 million users in the EU, a scale at which systemic risks to society become possible.

That division of authority is now visible in practice.

The Commission has opened several investigations into major platforms. Some have already led to concrete changes. Others are ongoing, with procedural steps unfolding in public view.

The scrutiny of X is emblematic. The Commission recently launched an investigation into the platform’s deployment of Grok and extended its existing investigation into X’s recommender systems. This is not merely a content moderation issue. It is a governance question about how algorithmic systems shape exposure, amplify narratives, and potentially influence public discourse.

Similarly, the Commission has launched an investigation into Shein, citing concerns over addictive design, transparency obligations, and the sale of illegal products. In another high-profile case, the Commission has preliminarily found that TikTok’s addictive design may breach the DSA. TikTok now has the opportunity to examine the file and respond before any final decision is taken.

These cases illustrate something larger. The DSA is not simply targeting illegal content. It is targeting the architecture of influence.

Addictive Design as a Regulatory Risk

One of the most consequential features of the DSA is its treatment of so-called “addictive design.” For years, concerns about infinite scroll, push notifications, and algorithmic personalization were debated in ethical and academic circles. Under the DSA, they are regulatory risk vectors.

The Commission’s preliminary finding on TikTok’s design choices signals that user interface and engagement strategies are not outside the regulatory perimeter. Nor are recommender systems treated as neutral infrastructure. They are viewed as systems that can generate systemic risk if left unchecked.

This marks a structural shift in regulatory thinking. Compliance is no longer confined to removing illegal posts or publishing transparency reports. It requires platforms to assess, document, and mitigate the broader societal risks created by their design and algorithmic systems.

Financial Exposure With Structural Consequences

The DSA gives the Commission the power to impose fines of up to 6 percent of a platform’s global annual turnover. For the largest players, that figure is not symbolic.
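To make the ceiling concrete: a hypothetical platform with €100 billion in annual global turnover would face a maximum fine of €6 billion for a single breach.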

But the more enduring impact may lie in corrective measures. Platforms found in breach can be required to change how their systems operate. That could mean redesigning features, altering algorithmic parameters, increasing transparency around recommender systems, or strengthening internal governance controls.

In other words, the DSA reaches into the operating model of digital businesses.

For boards and executive teams, this changes the compliance calculus. The regulation demands not only policy statements, but demonstrable oversight of systemic risk assessments, mitigation strategies, and internal accountability structures.

Europe’s Broader Digital Ambition

The DSA does not stand alone. It sits within a wider European strategy aimed at shaping the digital economy in line with democratic values, competitiveness, and technological independence.

Through funding initiatives and regulation, the EU has made clear that it intends to influence how digital platforms are designed and governed, not merely how they respond to crises after the fact.

Two years on, the DSA has become a test case for that ambition.

The investigations into X, Shein, and TikTok show that Brussels is willing to scrutinize recommender systems, product design, and commercial practices in granular detail. The shared enforcement model between national authorities and the Commission is operational. The potential fines are significant. The procedural mechanisms are active.

The conversation has moved from whether the DSA would have teeth to how hard it will bite.

For large platforms operating in Europe, the era of high-level commitments is over. What matters now is evidence, risk assessments, mitigation frameworks, documentation, and the ability to defend system design choices under regulatory examination.

Two years after becoming fully applicable, the Digital Services Act is no longer a promise of safer online spaces. It is a supervisory regime that is actively reshaping how those spaces are built and governed.
