
The Digital Services Act: A New Era of Online Accountability in the EU

Introduction: The Challenge of Governing the Digital Space

For years, the European Union has sought to regulate the fast-evolving digital landscape. Social media, online marketplaces, and digital intermediaries have created transformative opportunities—but also significant societal risks. From disinformation and online hate to illegal product sales and algorithmic opacity, the problems of the internet demanded a new generation of legal tools. That tool is the Digital Services Act (DSA)—a bold legislative framework designed to create a safer, more transparent, and more accountable digital space across the EU.

Fully enforceable as of February 17, 2024, the DSA is not just a reform; it's a regulatory revolution. It brings clear obligations for digital service providers, new rights for users, and powerful enforcement mechanisms for regulators.


Scope and Applicability: Who Must Comply?

The DSA applies to a wide range of digital services. These include:

  • Intermediary services: ISPs and DNS services;

  • Hosting services: cloud storage and web hosting;

  • Online platforms: marketplaces, app stores, and social media;

  • Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs): platforms and search engines with more than 45 million average monthly active users in the EU (roughly 10% of the EU's population).

The DSA uses a tiered approach, meaning obligations vary depending on the size and function of the service. While basic transparency and due diligence rules apply across the board, VLOPs and VLOSEs face the strictest requirements—reflecting their disproportionate influence on online discourse and commerce.
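To make the tiered approach concrete, here is a minimal sketch of how a compliance team might classify a service. This is an illustrative simplification, not a legal test: the category names, the function, and the two-factor logic (service type plus EU user count) are assumptions for the example; only the 45 million threshold comes from the article.

```python
# Hypothetical sketch of the DSA's tiered structure: obligations scale
# with service type and size. Categories and names are illustrative.

VLOP_THRESHOLD = 45_000_000  # average monthly active users in the EU

def dsa_tier(category: str, eu_monthly_active_users: int) -> str:
    """Return a simplified DSA obligation tier for a digital service."""
    if category in ("online_platform", "search_engine") and \
            eu_monthly_active_users >= VLOP_THRESHOLD:
        return "VLOP/VLOSE"           # strictest obligations
    if category == "online_platform":
        return "online_platform"      # platform-level duties
    if category == "hosting":
        return "hosting_service"      # hosting-level duties
    return "intermediary_service"     # baseline transparency duties

print(dsa_tier("online_platform", 50_000_000))
print(dsa_tier("hosting", 1_000_000))
```

In practice the assessment is far more nuanced (the official user counts are self-reported and designated by the Commission), but the sketch captures the core idea: the same company can face very different obligations depending on which tier a given service falls into.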


Key Obligations Introduced by the DSA

  1. Illegal Content Removal: Platforms must act “expeditiously” to remove illegal content once notified. They are not required to actively monitor everything, but they must have robust mechanisms for notice-and-action.

  2. Transparency in Moderation and Algorithms: Platforms are now obliged to publish clear information on content moderation policies and how algorithms influence what users see. VLOPs must conduct yearly risk assessments to identify dangers related to disinformation, electoral manipulation, gender-based violence, and more.

  3. Advertising Transparency: Users must be able to see who placed an ad and why it was targeted to them. Platforms must also disclose whether content was promoted through payment or algorithmic relevance.

  4. Internal Complaint Systems and Out-of-Court Dispute Settlement: Platforms must offer users an internal system for contesting content removals or account suspensions and cooperate with certified dispute resolution bodies.

  5. Risk Mitigation and Crisis Response: VLOPs and VLOSEs must take concrete steps to mitigate systemic risks, such as by adjusting algorithms or moderating content more actively during public crises (e.g., elections or pandemics).


The Role of National and EU Regulators

The DSA establishes a multi-level enforcement structure:

  • Each EU Member State must appoint a Digital Services Coordinator (DSC) to oversee national implementation.

  • The European Commission has exclusive enforcement powers over VLOPs and VLOSEs, including the ability to launch investigations, demand compliance plans, and impose fines up to 6% of global annual turnover.

The Commission has already initiated formal proceedings against certain major tech companies for potential DSA breaches—signaling its readiness to act decisively.
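To put the 6% ceiling in perspective, a minimal arithmetic sketch follows. The turnover figure is purely illustrative; only the 6% cap comes from the article.

```python
def max_dsa_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound of a DSA fine: 6% of global annual turnover."""
    return 0.06 * global_annual_turnover_eur

# Illustrative: a company with EUR 10 billion in global annual turnover
# faces a maximum fine of EUR 600 million.
print(max_dsa_fine(10_000_000_000))  # 600000000.0
```

Because the cap is tied to worldwide revenue rather than EU revenue, even a modest percentage translates into a very large exposure for the biggest platforms.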


Impact on Tech Companies: Compliance or Consequences

The DSA forces platforms to rethink their operations. Companies must now invest in:

  • Legal teams to monitor content compliance,

  • Transparent algorithmic design,

  • Accessible redress mechanisms,

  • Public reporting and risk analysis.

Non-compliance is costly—not only due to financial penalties, but also reputational damage. Smaller platforms are also affected but may benefit from support mechanisms and less burdensome rules.


User Empowerment: A Stronger Digital Citizenship

From a user’s perspective, the DSA creates:

  • Clear rights to appeal content decisions,

  • More insight into why they see certain content or ads,

  • Greater control over algorithmic personalization.

It also strengthens protections for minors and other vulnerable groups online, ensuring platforms adopt appropriate design and safety measures.


Conclusion: A Global Regulatory Blueprint?

The DSA is already being watched closely outside the EU. Much like the General Data Protection Regulation (GDPR), the DSA may become a de facto global standard, especially as companies seek harmonized compliance strategies across regions.

With 2025 as the first full year of enforcement, legal professionals, tech leaders, and civil society will be watching closely. Will the DSA succeed in balancing innovation with rights and accountability? One thing is clear: the rules of the digital game have changed—and the world is taking note.
