Both Acts share the main objectives of protecting users’ fundamental rights in the digital space and creating a level playing field in order to promote innovation, growth and competitiveness.
As a result of these Acts, affected service providers face increased compliance requirements. We will cover the DMA in more detail in a separate article. For now, the first question we would like to address is: What legal requirements are associated with the DSA?
I. What is the DSA’s purpose?
The Regulation’s purpose is to counter threats to fundamental rights arising in the course of the digital transformation. In particular, the EU intends to prevent the exchange of illegal goods, services, and content online. In addition, the misuse of online services through manipulative algorithms, and thus the spread of disinformation, is to be combated. Furthermore, certain enforcement deficits of the GDPR are to be addressed.
II. Who is affected by the DSA?
The DSA applies to providers of digital intermediary services offering their services within the EU, irrespective of their registered office. Beyond that, the Regulation is also to take effect within the European Economic Area (EEA). The Regulation distinguishes between the service provider categories of “mere conduit”, “caching” and “hosting”. In addition, “online platforms” and “online search engines” are explicitly mentioned as subcategories. The service providers specifically affected include, for example, online marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms. Smaller companies with fewer than 50 employees and an annual turnover below EUR 10 million, on the other hand, are explicitly exempted from the obligations applicable to online platforms.
III. What obligations does the DSA entail?
In addition to reporting obligations (cf. Section IV for more details), the DSA imposes comprehensive due diligence obligations on service providers. The scope of these obligations depends on the type and size of the relevant service provider, resulting in a tiered regulatory system.
First of all, Articles 11 to 15 of the Regulation set out general obligations applicable to all intermediary service providers. Hosting service providers are subject to more extensive obligations under Articles 16 to 18. Additional requirements apply to online platforms, as a subcategory of hosting services, pursuant to Articles 19 to 32. For services classified as very large online platforms and very large online search engines, the most extensive provisions of the Regulation apply, set out in Articles 33 to 43.
In detail, the obligations cover the following key areas:
1. Responding to illegal content, and liability
If an authority or court notifies them of illegal content, intermediary service providers must generally take measures to examine and, where applicable, remove such content. For (hosting) service providers this means, in particular, that they still do not have to proactively monitor their platforms for illegal content or for their users’ goods and services. The established practice of “notice and takedown” remains in place. However, providers are obliged to offer a procedure for submitting notifications as well as effective remedies once infringements have been identified.
Provided intermediary service providers meet these requirements, they benefit from a liability privilege, as they already did under the E-Commerce Directive. The DSA further clarifies that this privilege remains in effect if service providers undertake, on their own initiative, certain investigations or compliance measures. Hosting providers enabling the conclusion of contracts between traders and consumers should exercise caution: if the platform’s presentation leads consumers to believe that the subject matter of the contract, or the information relating to it, is attributable to the service provider or one of its affiliates, the liability privilege does not apply. Hosting service providers should therefore present their platforms in a manner that clearly distinguishes their own offers from those of third-party companies.
Similar to the German Network Enforcement Act (NetzDG), the DSA stipulates specific requirements for moderating content on online platforms and for establishing complaint-management procedures. Service providers will also have to include corresponding information in their general terms and conditions. They are therefore well advised to review their terms and conditions in this regard and, if necessary, add any mandatory information that is missing.
2. Design requirements
In addition, the Regulation contains new design requirements. For example, it stipulates that measures must be taken to protect minors and that online advertising must be clearly labeled. With regard to online advertising, more extensive information will also be required about the advertiser as well as about the main parameters used to determine the target group. Profiling-based online advertising is prohibited where it targets minors or is based on sensitive data, such as health data.
In the future, there will also be a ban on so-called “dark patterns” and “nudging”, i.e. manipulative design elements that prevent users from making free and informed decisions. Which specific elements are covered by this ban remains to be seen in light of the further development of the Commission’s guidelines.
3. Transparency obligations
Depending on the type and size of the service provider or its platform, the DSA imposes regular reporting obligations (cf. Art. 15, 24 and 42 DSA). Providers must, for example, document and disclose the number of official or judicial orders received and the measures taken in response, details on content moderation undertaken on their own initiative, the automated means used for content moderation, and the number of users and of complaints received.
4. B2C marketplaces
Operators of B2C marketplaces must ensure that companies trading on their marketplace are trustworthy by requesting contact and payment data as well as proof of identity. For consumer-protection purposes, the service provider is obliged to remove the trader’s offer from its marketplace if the data provided is incomplete or inaccurate.
5. Consequences of violations
In the event of breaches of obligations under the DSA, supervisory authorities have various enforcement mechanisms at their disposal. In the worst case, the services concerned may be blocked and fines of up to 6% of annual worldwide turnover imposed. In addition, the Regulation allows private parties to enforce their rights themselves or through representative associations.
IV. When do the legal obligations take effect?
16.11.2022: DSA enters into force
17.02.2023: Start of the obligation to publish user figures; identification of very large online platforms and search engines (more than 45 million active users) by EU Commission decision
Since February 17, 2023, all providers of online platforms and search engines covered by the DSA have been obliged to publish the average monthly number of active users of their service and to report it to the EU Commission. The publication must be updated every six months. On this basis, the EU Commission determines by decision which service providers qualify as “very large online platforms or search engines”, namely those with more than 45 million monthly active users.
4 months after the Decision: Affected service providers must fully comply with DSA obligations
Service providers classified as very large online platforms or search engines must comply with the DSA’s obligations no later than four months after the Commission’s decision.
17.02.2024: DSA takes effect for all digital service providers / deadline for appointment of a Digital Services Coordinator
From February 17, 2024, the provisions of the DSA also take effect for all other providers of digital services. In addition, a Digital Services Coordinator must have been appointed in each EU member state by this date. As the competent national authority, the Coordinator is responsible for monitoring implementation of the DSA and acts as an interface between the supervisory authorities. The Digital Services Coordinator also serves as a complaints body for users and ensures exchange with the Commission.
V. What is the relationship with existing regulations at national and EU level?
The Regulation supplements and updates the E-Commerce Directive previously applicable in the EU and, in Germany, the Telemedia Act (TMG). Given the DSA’s comprehensive coverage, its largely dispensing with opening clauses, and the EU’s goal of full harmonization, the German Network Enforcement Act (NetzDG) will presumably continue to apply only to a very limited extent.
Taking into account the comprehensive design of the DSA and the goal of full harmonization within the EU, national legislators will have only limited room for maneuver in the future. It remains to be seen whether and to what extent the NetzDG will be adapted and possibly continue to apply.
Service providers should promptly check the extent to which their company is affected by the new obligations and, if necessary, adapt compliance structures in good time, make any necessary changes to their offerings, and update their general terms and conditions.
Please note that this article only provides a general overview regarding the requirements for digital online services. In individual cases, the new regulation may entail far-reaching obligations in terms of transparency, moderation and design of a platform’s content.
Another article will be available here shortly, focusing on the regulations and implications of the Digital Markets Act.