As of February 17, 2024, the Digital Services Act (DSA) replaces significant portions of the old E-Commerce Directive, restructuring the regulation of digital services across the EU. Together with the Digital Markets Act (DMA), the DSA forms a comprehensive regulatory framework designed to address the challenges of the modern digital economy. While the DMA targets the monopolistic practices of dominant platforms, the DSA imposes strict obligations on digital service providers, particularly large platforms, to combat illegal content and misinformation, aiming to create a fairer, safer, and more transparent digital environment in the EU.
In Germany, enforcement of the DSA involves the Federal Network Agency, the Federal Center for the Protection of Children and Young People in the Media, and the Federal Commissioner for Data Protection and Freedom of Information. This collaboration is intended to protect digital rights while balancing security and freedom of expression.
The DSA's implementation in Germany supersedes much of the Netzwerkdurchsetzungsgesetz (NetzDG), signaling the country’s commitment to a safer online environment in alignment with EU standards.
The Act's rapid adoption reflects the EU’s determination to address evolving digital business models and technologies. By extending liability regulations and imposing new obligations on very large online platforms, the DSA ensures the effective implementation of the EU’s digital agenda.
Context of Origin
While digital services enhance daily life by providing access to information and expanding markets, they also present regulatory challenges. A significant concern identified during the DSA’s impact assessment was the misuse of online services for illegal activities, including the sale of illegal goods and the spread of misinformation through manipulative algorithms.
The DSA was also created to address the lack of cooperation among national authorities, which hampers effective oversight due to inadequate mechanisms for information sharing across Member States. Additionally, the European Commission intended the DSA to harmonize regulations across the EU, preventing legal fragmentation that could create barriers for digital services, particularly for SMEs and innovative start-ups. The DSA aims to enhance protection standards while scaling obligations to avoid burdening micro-enterprises.
Basic Structure and Enforcement Mechanisms
The DSA establishes a tiered regulatory framework, with four distinct levels of obligations tailored to the nature and scale of the service providers.
At the foundational level, the regulations apply broadly to all intermediary service providers, setting baseline requirements. As the scope narrows, more stringent obligations are imposed on hosting service providers, extending further to encompass online platforms and, in certain instances, online search engines. The most rigorous standards are reserved for very large online platforms and very large search engines, reflecting their significant impact on the digital ecosystem.
Enforcement of these regulations is entrusted to independent Digital Services Coordinators within each Member State, ensuring localized oversight. The European Commission is endowed with special enforcement powers, particularly concerning very large online platforms and search engines, to maintain consistency across the EU. Complementing these efforts, the establishment of a European Digital Services Board will facilitate coordination and provide strategic guidance.
The DSA also introduces a range of supervisory tools, including the imposition of fines and periodic penalty payments, to ensure robust compliance across the digital services landscape.
Obligations
Although it takes the form of a regulation, the DSA does not impose uniform obligations on all service providers regardless of their individual characteristics. Instead, it introduces a framework of asymmetric obligations tailored to the size and impact of digital service providers. There are four regulatory levels, with increasingly stringent provisions applying to an ever smaller group of providers. These levels range from general intermediary services to very large online platforms and search engines. The highest level of regulation includes enhanced oversight and compliance mechanisms to ensure proper enforcement.
Specifically, providers of intermediary services are subject to continued liability privileges and prohibited from engaging in general monitoring obligations. However, voluntary investigations can be conducted, provided they are done in good faith.
Moreover, hosting service providers must implement systems for reporting illegal content. Under specific conditions, they must also report certain crimes to the authorities. These obligations are fully harmonized across the EU, necessitating amendments to national regulations such as the NetzDG.
Illegal Content Regulation
A central reference point of the DSA is the concept of 'illegal content', which is defined very broadly and encompasses any information that does not comply with Union law or the law of a Member State. This includes not only inherently illegal content but also activities that violate consumer protection rights. The term 'hate speech' occupies a special position, as it is not a uniformly defined legal term; the recitals refer to the criminal law systems of the Member States.
The Regulatory Levels of the DSA in Detail
1. Regulations for Providers of Intermediary Services:
This category encompasses all information society services, i.e. services normally provided for remuneration, at a distance, by electronic means, and at the individual request of a recipient. Providers within this scope must show a substantial connection to the European Union.
2. General Liability Privileges:
The DSA maintains the existing liability privileges for service providers. Specifically, access providers cannot generally be held liable, while caching providers are liable only after becoming aware of illegal content. Hosting providers retain their liability privileges as long as they do not take an active role that grants them knowledge of or control over the hosted data.
3. Prohibition of General Monitoring Obligations and Voluntary Investigations:
The DSA upholds the ban on imposing general monitoring obligations on service providers. Notably, providers do not forfeit their liability privileges if they undertake voluntary investigations to detect illegal content, provided these actions are conducted ‘in good faith and diligently’.
4. Court and Official Orders:
The DSA allows public authorities and courts to issue orders, as long as these orders comply with specific minimum requirements. Typically, such orders are confined to the jurisdiction of the issuing Member State.
5. Contact Points for Authorities and Users:
All providers must set up a central electronic contact point to facilitate communication with authorities and users. Providers based outside the EU must designate a legal representative within the Union.
6. General Terms and Conditions and Transparency Obligations:
The DSA mandates the establishment of minimum standards for terms of use, particularly concerning content moderation. Additionally, providers must annually publish a transparency report detailing their content moderation activities.
7. Regulations for Hosting Service Providers:
Hosting service providers are required to implement a system for reporting and processing illegal content. They must also provide justifications for any usage restrictions imposed and, under certain conditions, report criminal offenses to the relevant authorities. The reporting obligation applies specifically to offenses that pose a threat to life or personal safety. The DSA harmonizes these reporting obligations across the EU, necessitating the adaptation of national laws such as the NetzDG.
Conclusion
Companies must thoroughly assess their roles and responsibilities under the DSA, adjusting their practices to fulfill these obligations and contribute to a more secure and accountable digital environment within the EU.
While the DSA provides a cohesive regulatory structure, it also raises concerns about potential threats to freedom of expression. The broad definitions of 'illegal content' and 'disinformation' could lead to excessive control, potentially restricting democratic discourse. As such, businesses must remain vigilant in their compliance efforts, ensuring that these do not undermine the protection of fundamental rights.