Data Protection lawyers with 50+ years of experience

Free initial consultation

Updated Tuesday, February 6, 2024

Digital Services Act (DSA): a new era of digital platform rules for consumer protection

On 17 February 2024, the Digital Services Act becomes fully applicable to all affected online companies. In this Insights article, we provide an overview of its content and the most important requirements.

Boris Arendt

Salary Partner (Attorney-at-law)

Ana Combei

Scientific Research Assistant

What issues were the basis for the DSA’s development?
Requirements and Obligations
Implementation, Enforcement, and Oversight
Conclusion


The European Union has adopted the Digital Services Act (DSA), a regulatory framework addressing challenges arising from the evolving digital landscape. The DSA becomes fully applicable to all covered online businesses on 17 February 2024, 15 months after its entry into force. The Act introduces stringent rules across the EU, with a particular focus on combating misinformation and hate speech. On January 18, 2024, the Bundestag voted to advance implementation of the legislation at the national level. In Germany, the federal government's draft law takes a localized approach, outlining specific responsibilities for authorities within the country.

Aligned with the DSA, the German draft law incorporates amendments to the Directive on electronic commerce (2000/31/EC) and implements the Regulation on platform-to-business relations ((EU) 2019/1150) of 20 June 2019. The latter focuses on fostering fairness and transparency for commercial users of online intermediation services, necessitating modifications to various national laws.

The Federal Network Agency is set to oversee and enforce the DSA, requiring close collaboration with regulatory authorities in Brussels and other EU member states. Specific responsibilities are allocated to entities such as the Federal Center for the Protection of Children and Young People in the Media, in accordance with the media law provisions of the federal states, and the Federal Commissioner for Data Protection and Freedom of Information.

Beyond delineating responsibilities, the legislation establishes clear parameters governing fines and penalty payments for DSA violations. The federal government asserts that this draft law covers the spectrum of sanctions outlined by the DSA for violations, potentially subjecting platform operators to penalties amounting to a maximum of six percent of their annual turnover.

The DSA provides guidelines on penalties that are not limited to fines, with context and clarification given in the preamble (particularly recitals 114 and 144). Article 52 sets out the rules delimiting the discretion Member States have in capping the respective penalties. The Digital Services Coordinators and the Commission will have the authority to demand prompt action when necessary to address particularly substantial damage, and platforms may give commitments on how they will remedy it. The enforcement mechanism is therefore not restricted to financial penalties: if rogue platforms fail to fulfil crucial responsibilities, putting people's lives and safety in jeopardy, the last resort is to request a temporary service suspension from a court, following consultation with all pertinent parties.


What issues were the basis for the DSA’s development?

While digital services enrich and ease daily life, be it in the form of access to information or market expansion for companies, there are also issues within the regulatory aspect of those services. A key concern identified in the impact assessment conducted to support the DSA was the misuse of online services for illegal purposes. That concerns the sale and trade of illegal goods and services, as well as the increased use of popular platforms by manipulative algorithmic systems, which can spread misinformation and harmful content.

Another reason for the creation and implementation of the DSA was to address the current lack of cooperation between national authorities. The absence of detailed mechanisms for cooperation and information sharing across Member States hampers effective oversight.

Moreover, the European Commission intended this piece of legislation as a harmonization tool in lieu of the numerous separate legislative measures Member States have adopted internally, which could otherwise lead to legal fragmentation and barriers for digital services. The Commission particularly focuses on the risks such barriers and fragmentation would pose to SMEs, including innovative start-ups. Thus, while the standard of protection is increased, the burden is scaled accordingly so as not to negatively affect micro-enterprises.


Requirements and Obligations

Contrary to what is usually expected from a rigid legal act such as a regulation, the DSA does not impose uniform obligations regardless of the characteristics of the service providers. Instead, it introduces a framework with asymmetric obligations tailored to the size and impact of digital service providers such as internet service providers, cloud services, messaging services, marketplaces, or social networks. Regarding due diligence, there are more specific requirements for hosting services, and even more so for online platforms with a high usage rate, e.g. app stores, housing platforms, or content-based platforms.

Obligations for All Providers of Intermediary Services

Regarding providers of intermediary services, the DSA sets out a standard package of obligations for the purpose of transparency and fundamental rights protection. Intermediary services are defined in Article 3(g) as: “(i) a ‘mere conduit’ service, (ii) a ‘caching’ service, (iii) a ‘hosting’ service”. These obligations include review of algorithmic decision-making (Art. 12), reporting on the removal or disabling of information that is illegal or contrary to the provider’s terms and conditions (Art. 13), establishment of a single point of contact to facilitate communication (Art. 10), and appointment of an EU-based legal representative (Art. 11).

Obligations for Online Platforms and Hosting Service Providers

Here, the DSA focuses on the adequacy and accuracy of appeal and notice-and-action mechanisms, which serves both to strengthen the protection of user rights and to curb the spread of illegal content. Online platforms and hosting providers must establish notice-and-action mechanisms that allow third parties to report allegedly illegal content (Article 14). Additionally, they must provide a justification for any removals or deactivations of specific information (Article 15).

Online platforms will also need to adhere to a new set of rules to guarantee the security and credibility of the goods and services they offer. Article 17 requires them to set up a user-friendly internal complaint-handling process, and Article 18 requires them to engage with out-of-court dispute resolution bodies to settle disagreements with their users.

The regulation also introduces the notion of trusted flaggers: entities appointed by Member State authorities that have specific knowledge and skill in combating unlawful content.

Online platforms must prioritize the processing of notices from trusted flaggers (Article 19) and notify authorities if they suspect serious criminal offenses involving a threat to life or safety (Article 21). The regulation also includes a ‘know your business customer’ approach, requiring platforms to collect and verify identification information from traders before allowing them to use their services (Article 22).

Finally, the DSA’s new rules on online advertising aim to provide users of digital platforms with relevant details about the advertisements they view, including why an individual was targeted with a particular advertisement. Online platforms that display advertisements will be held accountable for transparency: users of their services must be able to recognize an ad, identify its origin, and understand the reason behind its targeting “in a clear and unambiguous manner and in real time” (Article 24). These rules complement the European Democracy Action Plan’s measures, such as tightening the code of practice on disinformation and enacting legislation to improve transparency in sponsored political content.

Obligations for Very Large Online Platforms (VLOPs)

For VLOPs (e.g. Facebook, Instagram, TikTok, or Amazon), the general rules mentioned above apply; however, the threshold of transparency and accountability is raised in proportion to the increased impact of such large online platforms and search engines. VLOPs must carry out systemic risk assessments and analyses (Article 34) so that proper mitigation measures can follow, with a particular focus on protecting minors, safeguarding fundamental rights, and limiting possible misuse for manipulative purposes.

Moreover, to verify compliance as objectively as possible, very large platforms and search engines must undergo an independent audit once a year at their own expense, pursuant to Article 37. As an extra layer of transparency, Article 39 sets out higher standards for advertising on VLOPs and very large online search engines.

The Commission has direct monitoring and enforcement authority over very large online platforms and online search engines, and in the most egregious cases it can impose fines of up to 6% of a service provider’s global annual turnover.


Implementation, Enforcement, and Oversight

Member states must appoint independent digital services coordinators with special oversight powers. These coordinators serve as vital liaisons between service providers and regulatory bodies.

Regarding complaint handling, coordinators have the capacity to receive complaints against intermediary service providers, allowing users to communicate directly with regulatory organizations.

Moreover, coordinators are urged to work with counterparts from other Member States, conduct joint investigations, and contribute to a consistent approach to enforcing the DSA.

Furthermore, the involvement of the European Board for Digital Services ensures that the DSA is effectively coordinated and consistently applied across the EU. This central monitoring body is critical to ensuring that the legislation is interpreted and enforced consistently.

There is also an increased level of Commission intervention insofar as it has strengthened supervisory powers over VLOPs and can intervene in cases of persistent infringements. The Commission has the authority to conduct investigations, including on-site inspections, and can take interim measures to address immediate problems. In cases of non-compliance, the Commission has the authority to issue non-compliance decisions, impose fines, and enforce periodic penalty payments, underscoring the significant repercussions of violating the DSA.


Conclusion

The DSA represents a comprehensive effort by the EU to address the challenges posed by digital services. Its framework aims to strike a balance between fostering a competitive digital environment and safeguarding fundamental rights. Businesses should carefully assess their roles and responsibilities under the DSA, adapting their practices to comply with its obligations and contributing to a more secure and accountable digital space in the European Union. Most importantly, businesses should aim to expand their understanding of the technologies they use, so as to have a forward-thinking policy that goes beyond the bare minimum for compliance.

Legal advice

Simpliant Legal - Wittig, Bressner, Groß Rechtsanwälte Partnerschaftsgesellschaft mbB


© 2019 - 2024 Simpliant