EU Digital Services Act: New Responsibilities for Online Platforms and Social Media
- Vinay Rawat

- Sep 29
- 17 min read
Abstract
The European Union's Digital Services Act (DSA), which became fully applicable on February 17, 2024, represents the most ambitious and comprehensive regulatory framework for digital services in over two decades. Designed to create a safer, more transparent, and more accountable online environment, the DSA fundamentally reshapes the responsibilities of intermediary services, including internet access providers, cloud services, online platforms, and most significantly, very large online platforms (VLOPs) and very large online search engines (VLOSEs). This article provides a detailed analysis of the DSA's new responsibilities. It begins by contextualizing the DSA within the existing legal landscape, notably the e-Commerce Directive of 2000, which it modernizes. The core of the article is a meticulous examination of the tiered obligations, which escalate based on the size, role, and impact of the service provider. Key areas of focus include the stringent due diligence requirements for all intermediaries, the specific content moderation and user redressal rules for hosting services and online platforms, and the unprecedented systemic risk management, independent auditing, and data access obligations imposed on VLOPs and VLOSEs. The article also explores the DSA's robust enforcement mechanism, centered on Digital Services Coordinators and the European Board for Digital Services, and discusses the global implications of this groundbreaking regulation, arguing that it sets a new global standard for platform accountability.
1. Introduction: The Need for a New Digital Rulebook
The dawn of the 21st century was marked by an optimistic vision of the internet as a boundless global agora—a democratizing force that would connect humanity, foster innovation, and break down traditional barriers to information. The EU's e-Commerce Directive of 2000 embodied this spirit, establishing a light-touch regulatory framework that was instrumental in facilitating the explosive growth of the digital single market. Its cornerstone principles, such as the country-of-origin principle (allowing a service provider to operate across the EU based on the laws of its home member state) and the liability exemption for intermediaries (ensuring platforms were not held liable for user-generated content as long as they acted as mere conduits), provided the legal certainty needed for companies like Google, Facebook, and Amazon to flourish.
However, over the past two decades, the digital ecosystem has transformed beyond recognition. A handful of very large online platforms have become central to economic, social, and democratic life. While these platforms offer undeniable benefits, they have also become conduits for a range of systemic risks that were unimaginable in 2000: the viral spread of illegal content and disinformation, the undermining of fundamental rights, the manipulation of consumer behavior through opaque algorithms, and the creation of market structures that can stifle competition.
The existing framework of the e-Commerce Directive proved inadequate to address these challenges. Its liability shield, while still essential, was sometimes misinterpreted as a "hands-off" rule, leading to passive or inconsistent content moderation practices. The regulatory landscape became a fragmented patchwork of national laws, creating legal uncertainty for businesses operating across borders and inconsistent protection for EU citizens.
Recognizing this regulatory gap, the European Commission embarked on a historic legislative journey to create a new rulebook for the digital age. The Digital Services Act (DSA), together with its sibling legislation the Digital Markets Act (DMA), which focuses on contestability and fairness for "gatekeeper" platforms, forms the core of the EU's digital strategy. The DSA's primary goal is to create a safer digital space in which the fundamental rights of users in the EU are protected regardless of where the service provider is established, provided it offers services to the EU market, and to establish a level playing field for innovation, growth, and competitiveness within the single market.
The DSA is not a repeal of the e-Commerce Directive but a modernization and extension of its principles. It maintains the crucial liability exemptions but builds upon them with a comprehensive set of harmonized, proportionate, and transparency-driven obligations. Its approach is "tiered," meaning that the responsibilities imposed on a service provider are calibrated to its size, nature, and societal impact. A small niche forum has minimal obligations, while a platform like TikTok or X (formerly Twitter), used by over 45 million people in the EU, is subject to the most stringent rules due to its systemic impact.
This article will provide a comprehensive dissection of these new responsibilities. We will navigate the DSA's layered structure, detailing the obligations for all intermediaries, the enhanced duties for hosting services and online platforms, and the far-reaching requirements for VLOPs and VLOSEs. We will also examine the powerful new enforcement architecture and conclude by reflecting on the DSA's profound implications for the future of the global internet.
2. Foundational Principles and Tiered Structure of the DSA
Before delving into the specific responsibilities, it is essential to understand the DSA's foundational principles and its tiered structure, which is key to its proportionality.
2.1. Preservation of the Liability Shield
The DSA explicitly reaffirms the core liability exemptions from the e-Commerce Directive. These are found in Chapter II of the DSA:
» Mere Conduit (Article 4): Services consisting of the transmission of information in a communication network (e.g., internet access providers, VPNs) are not liable for the information transmitted.
» Caching (Article 5): Services involving the automatic, intermediate, and temporary storage of information for the sole purpose of making its onward transmission more efficient are not liable for that information.
» Hosting (Article 6): Services that store information provided by, and at the request of, a user are not liable for that stored information.
The critical condition for hosting immunity is that the provider does not have actual knowledge of illegal activity or illegal content and, upon obtaining such knowledge, acts expeditiously to remove or disable access to the content. This "notice-and-action" principle remains the bedrock of intermediary liability. The DSA does not create a general monitoring obligation; platforms are not required to proactively scan all content for potential illegality.
2.2. The Tiered Approach
The DSA's obligations are not one-size-fits-all. They escalate in stringency based on the function and size of the service provider. The tiers are as follows:
» All Intermediary Services: The most basic obligations apply to all entities falling under the DSA's scope.
» Hosting Services (including online platforms): A subset of intermediaries that store user data. This includes cloud storage, app stores, and social media platforms. They have additional obligations beyond mere conduits.
» Online Platforms: A subset of hosting services that disseminate information to the public. This is a crucial distinction. It includes social media, content-sharing platforms, app stores, and online marketplaces. They face even more stringent rules, particularly concerning content moderation and user redress.
» Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs): This is the most significant tier. The DSA designates platforms and search engines with 45 million or more average monthly active recipients in the EU as "very large." This threshold is designed to capture platforms whose scale poses systemic risks to society. They are subject to the most rigorous obligations, including systemic risk management and independent auditing.
This tiered structure ensures that regulatory burdens are proportionate to the capacity of the service and the potential harm it can cause.
3. New Responsibilities for All Intermediary Services (Chapter III)
The DSA establishes a baseline of obligations for every intermediary service provider offering services in the EU single market, regardless of where they are established. These are designed to ensure basic transparency and accountability.
3.1. Point of Contact and Legal Representative (Articles 11 & 13)
» Point of Contact: All providers must designate a single point of contact for direct communication with member state authorities, the European Board for Digital Services, and the European Commission. This ensures regulators can contact the service quickly and effectively.
» Legal Representative: Providers established outside the EU must appoint a legal representative within one of the EU member states where they offer services. This representative can be held liable for non-compliance with the DSA, ensuring that non-EU companies cannot evade enforcement.
3.2. Transparency in Terms and Conditions (Article 14)
Providers must draft and apply their terms and conditions in a plain, intelligible, and unambiguous manner. They must also inform users of any restrictions imposed on the use of their service, including penalties, in a clear and detailed manner. This aims to empower users by making platform rules more understandable.
3.3. Transparency Reporting Obligations (Article 15)
All intermediary services (except micro and small enterprises) must publish annual reports on any content moderation activities they undertake. These reports must include, at a minimum:
» Information on the number of orders received from member state authorities to remove illegal content.
» The number of notices submitted through the notice-and-action mechanisms (and actions taken).
» The number of complaints received through their internal complaint-handling system and the outcomes.
» Any use made of automated means for content moderation.
These reports provide unprecedented public insight into the scale and nature of content moderation across the digital ecosystem.
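To make the reporting requirement more concrete, the minimal Python sketch below models the categories of information listed above as a simple record. It is purely illustrative: the DSA prescribes what must be reported, not any particular schema, and every field name here is an assumption rather than an official format.

```python
from dataclasses import dataclass, field

@dataclass
class AnnualTransparencyReport:
    """Illustrative record of the minimum content of a DSA transparency report.
    Field names are hypothetical; the DSA specifies categories, not a schema."""
    reporting_year: int
    authority_removal_orders: int          # orders received from member state authorities
    notices_received: int                  # notices via the notice-and-action mechanism
    notices_actioned: int                  # notices that led to removal or disabling of access
    complaints_received: int               # complaints via the internal complaint-handling system
    complaints_upheld: int                 # complaints where the original decision was reversed
    automated_moderation_tools: list[str] = field(default_factory=list)  # e.g. hash matching, classifiers

# Fictional example figures for a small hosting provider
report = AnnualTransparencyReport(
    reporting_year=2024,
    authority_removal_orders=12,
    notices_received=4_310,
    notices_actioned=2_875,
    complaints_received=190,
    complaints_upheld=41,
    automated_moderation_tools=["image hash matching"],
)
```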
4. Enhanced Responsibilities for Hosting Services, Including Online Platforms (Chapter III, Section 2)
Hosting services, which store user data, have additional responsibilities due to their greater role in facilitating online content.
4.1. The Notice-and-Action Mechanism (Article 16)
This provision formalizes and standardizes the process for reporting illegal content. While the basic principle is retained from the e-Commerce Directive, the DSA adds crucial procedural safeguards:
» Facilitation of Notices: Hosting services must provide a user-friendly, easily accessible, and electronic mechanism for individuals and entities to notify them of specific items of illegal content.
» Requirements for Notices: The mechanism must allow for notices to contain specific information, such as an explanation of why the content is illegal, a clear indication of the electronic location of the content (e.g., URL), and the contact details of the notifier.
» Confirmation of Receipt: Upon receiving a notice, the hosting service must send a confirmation of receipt to the notifier without undue delay.
» Action on Notices: The service must process the notice in a timely, diligent, non-arbitrary, and objective manner.
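To give a concrete picture of what such a notice might contain and how a confirmation of receipt could work, the sketch below models the elements described above as a simple Python record. It is a hypothetical illustration, not an API that any platform actually exposes, and all names are assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class IllegalContentNotice:
    """Hypothetical model of a notice submitted under the notice-and-action mechanism."""
    content_url: str            # precise electronic location of the item (e.g. a URL)
    explanation: str            # why the notifier considers the content illegal
    notifier_name: str          # contact details of the notifier
    notifier_email: str
    good_faith_statement: bool  # confirmation the notice is submitted in good faith

def acknowledge(notice: IllegalContentNotice) -> dict:
    """Sketch of a confirmation of receipt sent without undue delay."""
    if not notice.content_url or not notice.explanation:
        raise ValueError("A notice must identify the content and explain why it is illegal.")
    return {
        "received_at": datetime.now(timezone.utc).isoformat(),
        "status": "queued for timely, diligent and objective review",
    }
```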
4.2. Statement of Reasons for Restrictions (Article 17)
When a hosting service restricts a user's content or account—whether because it violates the law or the platform's own terms and conditions—it must provide a clear and specific statement of reasons to the affected user. This includes identifying the illegal content or the violated rule and informing the user about the possibility of appealing the decision. This is a critical due process right designed to combat arbitrary content moderation.
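A statement of reasons is, in essence, a structured explanation attached to every restriction decision. The sketch below illustrates the kind of fields such a statement could carry; the names and structure are assumptions for illustration and do not reproduce the Commission's transparency database format.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class RestrictionType(Enum):
    CONTENT_REMOVAL = "content removal"
    VISIBILITY_RESTRICTION = "visibility restriction"
    ACCOUNT_SUSPENSION = "account suspension"

@dataclass
class StatementOfReasons:
    """Hypothetical record of the information given to a user whose content or account is restricted."""
    restriction: RestrictionType
    legal_ground: Optional[str]        # the law allegedly infringed, if the content is illegal
    contractual_ground: Optional[str]  # the terms-and-conditions clause relied on, if applicable
    facts_and_circumstances: str       # what was assessed, including whether automated tools were used
    automated_decision: bool
    redress_options: tuple = (
        "internal complaint-handling system",
        "certified out-of-court dispute settlement body",
        "judicial redress",
    )

# Fictional example
sor = StatementOfReasons(
    restriction=RestrictionType.CONTENT_REMOVAL,
    legal_ground=None,
    contractual_ground="Community guideline 4.2: spam",
    facts_and_circumstances="Post flagged by an automated classifier and confirmed by a human reviewer.",
    automated_decision=False,
)
```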
4.3. Notification of Suspicions of Criminal Offences (Article 18)
Hosting services that become aware of any information giving rise to a reasonable suspicion that a serious criminal offence involving a threat to the life or safety of a person has been, is being, or will be committed, must promptly inform the relevant law enforcement authorities in the member state(s) concerned. This creates a direct duty to report severe threats like terrorist activity or imminent violence.
5. Specific Responsibilities for Online Platforms (Chapter III, Section 3)
Online platforms, which disseminate information to the public, are at the heart of the DSA's regulatory focus. Their influence on public discourse and individual rights necessitates further obligations.
5.1. Internal Complaint-Handling System (Article 20)
Online platforms must establish an easy-to-use, transparent, and non-discriminatory internal system for users to complain about content moderation decisions. For example, if a user's post is removed, they must be able to challenge that decision.
» The platform must process complaints in a timely, non-arbitrary, and objective manner.
» If a complaint is justified, the platform must reverse its decision without delay.
» The system must be free of charge to the user.
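To illustrate the lifecycle the DSA expects, the minimal sketch below models a complaint that is reviewed and, if justified, leads to the original decision being reversed. The states and names are assumptions made for illustration only, not a prescribed workflow.

```python
from dataclasses import dataclass
from enum import Enum

class ComplaintStatus(Enum):
    SUBMITTED = "submitted"
    UPHELD = "original decision upheld"
    REVERSED = "original decision reversed"

@dataclass
class ModerationComplaint:
    """Hypothetical complaint against a content moderation decision (free of charge to the user)."""
    decision_id: str
    user_arguments: str
    status: ComplaintStatus = ComplaintStatus.SUBMITTED

def resolve(complaint: ModerationComplaint, complaint_is_justified: bool) -> ModerationComplaint:
    """Process the complaint in a timely, non-arbitrary and objective manner (sketch)."""
    complaint.status = (
        ComplaintStatus.REVERSED if complaint_is_justified else ComplaintStatus.UPHELD
    )
    return complaint
```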
5.2. Out-of-Court Dispute Settlement (Article 21)
Recognizing that internal systems may not always resolve disputes satisfactorily, the DSA provides for out-of-court dispute settlement. If a user is not satisfied with the outcome of the internal complaint, they can turn to a certified, independent dispute settlement body to resolve the issue. Platforms must engage with the chosen body in good faith, although under the final text of the DSA these bodies cannot impose a binding settlement on the parties. This provides a low-cost, efficient alternative to judicial proceedings.
5.3. Trusted Flaggers (Article 22)
To improve the quality and speed of removing illegal content, the DSA creates a status of "Trusted Flagger." Entities with demonstrated expertise and competence (e.g., specialized anti-fraud units, child protection organizations) can apply to be certified as Trusted Flaggers by their national Digital Services Coordinator.
» Notices submitted by Trusted Flaggers must be processed by platforms with priority and without delay.
» Platforms cannot give Trusted Flaggers preferential treatment regarding the outcome, but they must prioritize the review process. This ensures that credible reports from experts are acted upon swiftly.
5.4. Measures for the Protection of Minors (Article 28)
Online platforms accessible to minors must implement appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors, such as defaulting to higher privacy settings for young users. The DSA also prohibits platforms from presenting advertisements based on profiling using the personal data of users they know with reasonable certainty to be minors.
5.5. Transparency of Online Advertising (Article 26)
This is a landmark provision addressing the opaque nature of online advertising. For each advertisement presented on their interface, online platforms must ensure that recipients can identify, in real-time, that it is an advertisement, on whose behalf it is presented, and who paid for it. They must also provide meaningful information about the main parameters used to determine the recipient to whom the advertisement is displayed. This aims to reduce manipulative practices and increase user awareness.
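One way to picture this obligation is as a small bundle of disclosures that must accompany every ad impression. The sketch below is illustrative only; the field names and structure are assumed for the example and are not prescribed by the regulation.

```python
from dataclasses import dataclass

@dataclass
class AdDisclosure:
    """Hypothetical per-impression disclosure shown alongside an advertisement."""
    is_advertisement: bool            # the recipient must be able to identify this in real time
    advertiser: str                   # the person on whose behalf the ad is presented
    payer: str                        # who paid for the ad, if different from the advertiser
    targeting_parameters: list[str]   # main parameters used to select the recipient

# Fictional example disclosure
disclosure = AdDisclosure(
    is_advertisement=True,
    advertiser="Example Shoes GmbH",
    payer="Example Shoes GmbH",
    targeting_parameters=["age range 25-34", "interest: running", "location: Germany"],
)
```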
5.6. Transparency of Recommender Systems (Article 27)
Recommender systems (the algorithms that determine what content users see in their feeds) are a primary driver of user engagement and of potential harm. Online platforms must clearly explain in their terms and conditions the main parameters used in their recommender systems and the options available to users to modify or influence those parameters; where several options exist, users must be able to select and change their preferred option at any time. The further requirement to offer at least one option that is not based on profiling (e.g., a chronological feed) applies to VLOPs and VLOSEs and is discussed in Section 6.4.
5.7. Exclusion for Micro and Small Enterprises
Recognizing the administrative burden, the DSA provides certain exemptions for micro and small enterprises (as defined in EU law) from some of the above obligations, such as the requirement for an internal complaint-handling system and out-of-court dispute settlement, unless they are designated as VLOPs.
6. Stringent Obligations for VLOPs and VLOSEs (Chapter III, Section 5)
The most transformative part of the DSA is the special regime for VLOPs and VLOSEs. The rationale is that their reach and impact are so vast that they pose systemic risks to society, requiring ex-ante (before-the-fact) regulation, rather than just reactive measures.
6.1. The Designation Process
The European Commission designates VLOPs/VLOSEs on the basis of user numbers that platforms themselves are required to publish. The 45 million threshold corresponds to roughly 10% of the EU's population of around 450 million. Once designated, a platform or search engine has four months to comply with the additional obligations.
6.2. Systemic Risk Assessment and Mitigation (Articles 34 & 35)
This is the cornerstone of the VLOP/VLOSE regime. These entities must diligently identify, analyze, and mitigate the systemic risks stemming from the design and functioning of their services, including their algorithmic systems. The specific systemic risks identified in the DSA are:
» Illegal Content Risks: The dissemination of illegal content through their services.
» Fundamental Rights Risks: Any actual or foreseeable negative effects on the exercise of fundamental rights, including human dignity, freedom of expression and information, data privacy, non-discrimination, and the rights of the child.
» Civic Discourse and Electoral Processes Risks: Any actual or foreseeable negative effect on civic discourse, electoral processes, and public security.
» Public Health and Well-Being Risks: Any actual or foreseeable negative effect in relation to gender-based violence, the protection of public health and of minors, and serious negative consequences for a person's physical and mental well-being.
Mitigation Measures: To mitigate these risks, VLOPs/VLOSEs must implement reasonable, proportionate, and effective measures. These could include:
» Adapting their terms and conditions.
» Enhancing their content moderation processes and algorithms.
» Redesigning their services (e.g., changing how content is recommended or how viral content spreads).
» Launching or supporting literacy programs and fact-checking initiatives.
VLOPs/VLOSEs must report the results of their risk assessments and the mitigation measures taken to the Commission annually.
6.3. Independent Auditing (Article 37)
VLOPs/VLOSEs are subject to mandatory, independent annual audits to assess their compliance with all DSA obligations. These audits must be performed by organizations that are independent of the platform, free from conflicts of interest, and possess proven expertise and technical competence in risk management. The audit reports are submitted to the Commission and the Board, and a non-confidential version is published. This external scrutiny is designed to prevent platforms from marking their own homework.
6.4. Recommender System Transparency and User Control (Article 38)
VLOPs and VLOSEs must provide at least one option for each of their recommender systems that is not based on profiling. They must also make it straightforward for users to select and modify their preferred option, for example by choosing criteria such as "show me more content from this creator" or "show me less political content".
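A concrete way to think about this requirement is a feed that can be ranked either by an engagement model built on profiling or by a profiling-free fallback such as reverse-chronological order. The Python sketch below illustrates that switch under assumed names; it is not how any particular platform implements Article 38.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    engagement_score: float  # output of a profiling-based ranking model (assumed)

def rank_feed(posts: list[Post], use_profiling: bool) -> list[Post]:
    """Return the feed ordered either by a profiling-based score or chronologically."""
    if use_profiling:
        return sorted(posts, key=lambda p: p.engagement_score, reverse=True)
    # Profiling-free option: newest first, no personal data used in the ranking.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)
```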
6.5. Data Access and Scrutiny for Researchers (Article 40)
To foster independent research into systemic risks, VLOPs/VLOSEs must provide access to data to vetted researchers affiliated with research organizations. Following a duly reasoned request channelled through the Digital Services Coordinator of establishment, and subject to strict privacy and security safeguards, platforms must provide the data needed to study systemic risks in the EU. This is a revolutionary step towards enabling evidence-based scrutiny of platform effects on society.
6.6. Crisis Response Mechanism (Article 36)
This provision gives the European Commission extraordinary powers. In the event of a crisis (e.g., a war with associated disinformation campaigns, a pandemic, a major terrorist event), the Commission, acting on a recommendation of the Board, can adopt a decision requiring VLOPs/VLOSEs to implement urgent and proportionate measures to address the specific crisis. This ensures a coordinated, EU-wide response to extraordinary threats.
6.7. Compliance Function and Supervisory Fee
VLOPs/VLOSEs must establish an independent compliance function within their organization and appoint a Compliance Officer. Furthermore, they are required to pay an annual supervisory fee to the European Commission to cover the costs of their supervision, which can be substantial.
7. Online Marketplaces: Special Rules for a Special Case (Chapter III, Section 4)
Online marketplaces, as a specific type of online platform, have additional obligations aimed at combating the sale of illegal goods and services.
7.1. Traceability of Traders (The "Know Your Business Customer" Principle) (Article 30)
This is a critical measure to fight illegal products. Online marketplaces must collect essential information about traders using their services to sell products or services (e.g., name, ID, contact details, bank account information) and make best efforts to assess whether that information is reliable before allowing the trader to offer products.
» For Consumers: Marketplaces must ensure that the trader's name, contact details, and address are visible on the product listing page.
» Compliance by Design: This obligation forces marketplaces to design their systems to collect this data upfront, making it harder for fraudulent sellers to operate anonymously.
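In practice, "compliance by design" means a marketplace cannot let a trader list products until the essential identifying information has been collected and a basic reliability check has been made. The sketch below illustrates that onboarding gate with assumed field names; the actual verification duty is a best-efforts obligation, not a fixed checklist.

```python
from dataclasses import dataclass

@dataclass
class TraderRecord:
    """Hypothetical 'know your business customer' record collected before onboarding."""
    name: str
    address: str
    email: str
    phone: str
    id_document_reference: str
    payment_account: str

def may_list_products(trader: TraderRecord) -> bool:
    """Allow listings only once all essential information has been provided (illustrative)."""
    required = [
        trader.name, trader.address, trader.email, trader.phone,
        trader.id_document_reference, trader.payment_account,
    ]
    return all(value.strip() for value in required)
```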
7.2. Compliance by Design and Mediation
Marketplaces must design and organize their online interfaces in a way that enables traders to comply with their legal obligations, such as providing pre-contractual information and product safety details. They are also required to inform consumers about any out-of-court dispute settlement bodies that are available for conflicts arising from purchases.
8. Enforcement and Penalties: Ensuring Bite Behind the Bark (Chapter IV)
A robust regulation is meaningless without strong enforcement. The DSA establishes a novel, cooperative enforcement architecture.
8.1. The Enforcement Architecture
» Digital Services Coordinators (DSCs): Each member state must designate an independent authority as its Digital Services Coordinator. This is the primary national regulator responsible for supervising intermediaries in their territory.
» The European Board for Digital Services (EBDS): Composed of the DSCs, the EBDS ensures the consistent application of the DSA across the EU. It advises the Commission and DSCs and can issue guidelines.
» The European Commission: The Commission has exclusive power to supervise and enforce the additional obligations that apply to VLOPs and VLOSEs. For all other intermediaries, the primary supervision lies with the DSC of the member state where the service's main establishment is located.
8.2. Investigative and Corrective Powers
Supervisory authorities (DSCs and the Commission) have extensive powers, including the ability to:
» Request information and conduct inspections.
» Interview persons and take statements.
» Order the immediate cessation of infringements.
» Impose interim measures to avoid serious harm.
8.3. Penalties for Non-Compliance
The penalties for violating the DSA are severe and designed to deter:
» Fines of up to 6% of the provider's global annual turnover in the preceding financial year.
» Periodic penalty payments to compel compliance.
» In the case of VLOPs/VLOSEs, the Commission can even seek temporary restrictions on access to the service in the EU in case of serious, repeated violations that threaten the safety of persons.
9. Global Implications and Conclusion
The EU Digital Services Act is far more than a regional regulation; it is a paradigm shift with global ramifications. By setting a high standard for platform accountability, transparency, and user protection, it is poised to become the de facto global standard, much like the General Data Protection Regulation (GDPR) did for data privacy. Non-EU platforms that wish to access the lucrative EU market must comply, which will likely lead them to extend many DSA-mandated features—such as enhanced transparency reports, appeal mechanisms, and advertising disclosures—to their users worldwide.
The DSA represents a fundamental rebalancing of power. It moves away from a model where a few unaccountable tech giants set the rules of online engagement towards a democratically accountable, rules-based system. It empowers users with new rights, provides regulators with powerful tools, and forces the largest platforms to seriously consider the societal impact of their design and operational choices.
However, challenges remain. The effectiveness of the DSA will depend on the resources and resolve of national DSCs and the European Commission. The complex interplay between the DSA, the GDPR, and national laws will require careful judicial interpretation. There are also concerns about potential unintended consequences, such as the risk of platforms becoming overzealous in content removal ("over-removal") to avoid penalties.
Nevertheless, the Digital Services Act marks a historic turning point. It is the most comprehensive attempt yet to tame the wilder aspects of the digital frontier and to build an online environment that is not only innovative and dynamic but also safe, fair, and respectful of fundamental rights. Its implementation will be closely watched by regulators and citizens around the world, as it charts a new course for the future of the internet—one where responsibility is integral to scale.
Here are some questions and answers on the topic:
1. What is the fundamental reason the EU created the Digital Services Act, and how does it differ from the old e-Commerce Directive?
The EU created the Digital Services Act to modernize the legal framework for digital services, which was previously governed by the e-Commerce Directive from 2000. The old directive was designed for a different internet era and established foundational principles like a liability shield for platforms regarding user-generated content. However, it proved inadequate to address the systemic societal risks posed by modern very large online platforms, such as the rapid spread of illegal content, disinformation, and fundamental rights violations. The DSA differs by building upon the e-Commerce Directive's liability shield but introducing a comprehensive, tiered set of proactive obligations focused on transparency, accountability, and risk management, moving from a passive, reactive model to an active, preventative one.
2. How does the DSA's "tiered approach" ensure that regulatory burdens are proportionate to a platform's size and impact?
The DSA's tiered approach carefully calibrates obligations based on the function, size, and societal impact of a digital service. Basic transparency and contact obligations apply to all intermediaries. Hosting services, like cloud storage, have additional duties such as a formal notice-and-action mechanism. Online platforms, which disseminate information to the public like social media sites, face even stricter rules including mandatory complaint-handling systems and transparency for advertising. The most stringent obligations are reserved for Very Large Online Platforms and Search Engines, those with 45 million or more average monthly active users in the EU. These entities must conduct systemic risk assessments, undergo independent audits, and share data with researchers, as their scale means their operations can affect society as a whole. This ensures a small startup is not overburdened while holding tech giants to a higher standard.
3. What are the specific new transparency requirements for online advertising and content moderation under the DSA?
For online advertising, the DSA mandates that platforms must clearly label each ad so users can instantly see that it is an advertisement. They must also disclose who paid for the ad and on whose behalf it is being shown. Furthermore, platforms must provide meaningful information about the main parameters, such as targeting criteria, used to decide why a particular ad was shown to a specific user. Regarding content moderation, platforms must publish detailed annual transparency reports that include data on the number of content removal orders received from authorities, the actions taken on user reports of illegal content, and the outcomes of user appeals against moderation decisions. This dual transparency aims to reduce manipulative advertising practices and demystify how platforms manage content on their services.
4. What unprecedented obligations does the DSA impose specifically on Very Large Online Platforms (VLOPs) due to their systemic risk?
The most unprecedented obligations for VLOPs revolve around managing their systemic risks to society. They are required to proactively identify, analyze, and mitigate foreseeable systemic risks annually. These risks include the dissemination of illegal content, negative effects on fundamental rights like freedom of expression and children's rights, and manipulation of electoral processes and civic discourse. To mitigate these risks, they must implement proportionate measures, which could include changing their algorithmic recommender systems or interface designs. They are also subject to mandatory independent audits to verify their compliance, and they must provide vetted academic researchers with access to data to study these systemic risks. This represents a shift from merely reacting to illegal content to being responsible for the broader societal impact of their service's design and operation.
5. How does the DSA's enforcement mechanism work, and what are the potential penalties for non-compliance?
The DSA establishes a sophisticated enforcement architecture involving national and EU-level authorities. Each EU member state designates an independent Digital Services Coordinator (DSC) as the primary regulator for most intermediaries within its territory. For Very Large Online Platforms and Search Engines, the European Commission takes the lead, with exclusive powers over their additional obligations, ensuring consistent oversight of these global entities. The European Board for Digital Services, composed of all DSCs, advises and coordinates to ensure the DSA is applied consistently across the single market. Supervisory authorities have strong investigative powers, including the ability to request data and conduct on-site inspections. Penalties for non-compliance are severe, with fines reaching up to 6% of a company's global annual turnover. In the most serious cases of repeated violation by a VLOP, the Commission can even seek a temporary restriction of access to the EU market.
Disclaimer: The content shared in this blog is intended solely for general informational and educational purposes. It provides only a basic understanding of the subject and should not be considered as professional legal advice. For specific guidance or in-depth legal assistance, readers are strongly advised to consult a qualified legal professional.


