Papers
The DSA Rationale and the Need for a Cross-sectoral Approach
“Allons, courage, le Printemps de l'Europe est toujours devant nous!” [“Come on, take heart, Europe's springtime is still ahead of us!”][1]
Jacques Delors
Index: 1. Navigating the DSA landscape. 1.1. An overview. 1.2. The implementation phase: off to a rocky start? 1.3. Challenges and gaps ahead. 2. The DSA’s rationale. 2.1. The purpose behind regulation. 3. The need for a cross-sectoral approach. 3.1. Two layers of systematic analysis. 3.2. The DSA within the EU digital framework. 3.3. The fundamental rights’ assessment. 4. Conclusions.
Abstract
The DSA is, as of 17 February 2024, applicable to all member States. The questions surrounding its application are, however, still difficult to answer. At its core, we identify three problematic areas of utmost importance: i) issues regarding the DSA’s scope and its intersection with the current EU digital framework; ii) challenges deriving from the DSA’s global reach, which impacts enforcement and the EU’s centralization of powers; iii) the ethical dimension that runs through the DSA, with particular focus on the systemic risk assessment. In this paper, we propose to take a step back in the analysis of the DSA so as to cover mainly its rationale and the legal challenges it raises from a cross-sectoral perspective.
Keywords: EU Digital framework; EU; Digital Services; Digital Services Act; DSA.
1. Navigating the DSA Landscape
1.1. An Overview
The DSA[2] provides for a regulation on a single market for digital services. Its focus is on intermediary services provided to recipients of the service that have their place of establishment or residence in the EU,[3] for which it establishes a new framework of obligations depending on the type of service and the nature of the providers. We can, therefore, distinguish general provisions divided between a first part, in Chapter II, regarding the liability of providers of intermediary services, and a second part, in Chapter III, setting out due diligence obligations applicable to all providers. This second layer sets out additional provisions for: providers of hosting services, providers of online platforms that allow consumers to conclude distance contracts with traders, and, finally, providers of Very Large Online Platforms (VLOPs) and of Very Large Online Search Engines (VLOSEs).
1.2. The Implementation Phase: Off to a Rocky Start?
I. While the majority of the DSA’s provisions have only been applicable since 17 February 2024, some have been in full force since 16 November 2022 (namely, those regarding transparency reporting obligations, additional obligations for providers of VLOPs and VLOSEs, rules on the performance of independent audits, technical conditions for providers of VLOPs or VLOSEs to share data, the supervisory fee and rules on supervision, investigation, monitoring, and enforcement).[4] In general, those provisions either empower the Commission to set out implementing rules or concern very large online platforms and very large online search engines, on which additional requirements are imposed under the DSA. We can, therefore, state that the DSA was designed to operate in two different stages: the implementation phase and the operational phase.
As we are now taking the first steps in the operational phase, a general assessment is to be made: first, regarding the success of its implementation and, second, focusing on the purposes the DSA seeks to achieve.
II. The implementation phase was established mainly to allow the providers of VLOPs and VLOSEs to adjust to the requirements set out in the DSA. For such purposes, the Commission was to provide resources and tools to guide market operators in their compliance efforts. That goal was not, in our view, successfully accomplished, whether due to the Commission’s lack of response or to the difficulties member States faced in setting in motion the necessary regulatory mechanisms.
III. Starting with compliance with article 24, providers were dependent on: i) information regarding the method and calculation of the average monthly number of active recipients of the service, for which the DSA set a first deadline of 17 February 2023; ii) the designation of the Digital Services Coordinator, in order to comply with the communication requirements; iii) the Commission’s confirmation of the templates for reports.
As to the first point, a guidance document was provided on 31 January 2023 (in all EU official languages), although merely expressing the views of the Commission services. The document addresses the questions received from operators on the identification and counting of active recipients.[5] Among other points, it is clarified that, even though the DSA does not require providers to notify the said information on recipients to the Commission or to the Digital Services Coordinator, “[in] the interest of transparency and in order to facilitate the monitoring of compliance with the DSA during the initial period of its application, all such providers are encouraged to communicate that information to the Commission”.[6] Providers are also therein advised to avoid double-counting. For such purposes, it is highlighted that “[the] obligation to count active recipients of the service does not require nor permit providers to profile and track users to avoid “double counting”. The DSA may not be understood as providing a ground to process personal data or track users”.[7]
As to the designation of Digital Services Coordinators by the member States, the process was also not as quick as initially envisaged by the DSA.[8] In Portugal, for example, the coordinator was only approved and announced on 8 February 2024,[9] when ANACOM (the national regulatory authority for communications) was designated as the digital services coordinator in Portugal, ERC (the national regulatory authority for the media) was designated as the competent authority in matters of media and other media content, and IGAC (the general inspectorate of cultural activities) was indicated as the competent authority in matters of copyright and related rights.[10] In assessing the timing, it should also be noted that no member State created new entities for the purposes of the DSA; instead, the option was to widen the competences of existing authorities across the EU, based upon a new governance structure. In addition, the EU Board in charge of monitoring digital platforms held its first (official) meeting on 19 February 2024.
Finally, regarding the templates for transparency reporting obligations, a final version of the implementing regulation is still pending. The Commission provided a draft act which was open for feedback between 8 December 2023 and 24 January 2024.[11] A final version is expected to be published during the first quarter of 2024.
IV. Three other instruments were expected to be released shortly after the DSA’s approval: i) the decision on the designation of VLOPs and VLOSEs; ii) the rules for the performance of independent audits, together with the corresponding templates; iii) the delegated act on the technical conditions under which VLOPs or VLOSEs are to share data and the purposes for which the data may be used.
The decision on VLOPs and VLOSEs was released on 25 April 2023,[12] when the Commission designated a first set of entities falling within those categories. Unsurprisingly, the entities identified were companies with at least 45 million monthly active users, of which Facebook, Instagram, Twitter, YouTube, and Google Search are examples. Those entities were informed that, within 4 (four) months of notification of the designation decisions, they would have to adapt their systems and processes for compliance and report their first annual risk assessment.[13] Even though the DSA’s purpose is uniformization, very different entities are being treated equally for the purposes of imposing additional requirements under the DSA. The current list of VLOPs and VLOSEs comprises a universe of entities that differ considerably not only in terms of activity but, more importantly, in the way they moderate content. This raises challenges in terms of the actual applicability of the requirements, but also in relation to the balance (equality) of obligations under the DSA.
The rules on audits, on the other hand, were established on 20 October 2023, when the Commission published the Delegated Regulation supplementing Regulation (EU) 2022/2065 of the European Parliament and of the Council by laying down rules on the performance of audits for very large online platforms and very large online search engines.[14]
Lastly, as to the Delegated Regulation on data access and usage, a call for evidence was open between 25 April 2023 and 31 May 2023. The draft act is still awaited, and adoption by the Commission is planned for the first quarter of 2024.
V. All things considered, the path for the operation of the DSA is still not clear to providers or to the market. Indeed, on 15 February 2024 the Commission was still issuing guidance on the application of the DSA, namely by way of the Commission Implementing Regulation (EU) 2024/607 of 15 February 2024 on the practical and operational arrangements for the functioning of the information sharing system pursuant to the DSA.[15] That was an important step towards the establishment and development of ‘agora’, the information sharing system prescribed under article 85 of the DSA, intended to operate as a communication tool between the Digital Services Coordinators, the Commission and the Board.
Notwithstanding the delays in the DSA’s roll-out, formal proceedings against platforms are already under way. As an example, on 18 December 2023, the Commission announced the opening of formal proceedings against X (designated as a VLOP) under the DSA, mainly regarding risk management, content moderation, dark patterns, advertising transparency and data access for researchers.[16] Moreover, on 18 January 2024, the Commission reported on the requests for information sent to 17 (seventeen) VLOPs and VLOSEs under the DSA, especially on the “measures they have taken to comply with the obligation to give access, without undue delay, to the data that is publicly accessible on their online interface to eligible researchers”.[17]
As the final regulations are yet to be adopted, a practical assessment has to be made of the platforms’ compliance with the DSA.
1.3. Challenges and Gaps Ahead
I. As stated in its article 1(1), the DSA aims to contribute to the proper functioning of the internal market for intermediary services and to set out uniform rules for a safe, predictable and trusted online environment. In this sense, the DSA is to be read together with Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (the “E-Commerce Directive”), which it complements.
The simplicity of the wording hides, perhaps, the extent of the analysis required to provide a complete and systemically consistent assessment of the DSA’s rules and objectives. Indeed, the DSA embodies a joint and multi-sectoral effort to regulate a reality that is, by nature, difficult to regulate and not yet fully understood. Furthermore, the design of the DSA is not free from doubts regarding the structure of the obligations set out therein. Among the problematic areas surrounding the DSA, we consider three to be of utmost importance: i) its scope and the role of the DSA within the current EU digital framework; ii) enforcement and centralization, mostly considering the DSA’s global reach; iii) the ethical order the DSA embodies and the systemic risk approach that underlies its creation.
II. In this paper we intend to focus on the first problematic area, by specifically covering the DSA’s rationale and purpose and the need for a cross-sectoral approach.
2. The DSA’s Rationale
2.1. The Purpose Behind Regulation
I. The DSA’s main goal can be described, in our view, as “accountability”. First, it stands out as a mission to address the proliferation of platforms within society. Indeed, the way platforms have contributed to shaping and modifying social patterns, information channels and opinion worldwide created the need, for regulators and policy makers, to structure a framework that is consistent with (and can be adapted to) new technological and societal phenomena. The impact of platforms cannot, therefore, be narrowed down to a particular sector: platforms and the new digital way of consuming and providing services directly affect all sectors, with the greatest impact on politics, media and contracting. Uniformity is therefore an expected strategy, which not only responds to the need for a general and constantly growing framework, but also addresses the concern of disparity among member States when dealing with such sensitive issues. In addition to this, the accountability rationale is also revealed by the transparency and reporting obligations laid down in the DSA, together with the risk-based approach towards content moderation, through which policy makers present a joint and united front against disinformation. Said reporting standards and obligations reflect the authorities’ effort to adjust the rules on platform responsibility, as construed over the years, to the specific case of digital services and, mainly, to the platforms that “act as ‘gatekeepers’ between business users and end users”.[18]
Accountability is, however, quite a sensitive purpose. As platform regulation is structured upon the combination of a platform’s knowledge of a specific conduct and the incentives for platforms to act on such knowledge by reporting and/or removing such conduct (content), this approach may turn out to be the system’s point of failure.
This is the issue that the “moderators’ dilemma” intends to capture, i.e., the potential risk of platforms not wanting to react precisely so as not to gain knowledge of any conduct that could be traced back to the platform’s sphere of liability. As a consequence, two different scenarios may arise from the platform’s point of view: platforms may either prefer to remove content based on mere suspicion (just to be safe), or their willingness to restrict or remove any content may be jeopardized by the possibility of, by doing so, gaining (more) knowledge that would ultimately demand further action. In this sense, accountability has the potential to backfire against the DSA’s agenda (on monitoring behaviors and content and on reporting) and its purposes, by inducing either inactivity or over-activity on the part of platforms.
II. Articles 7 and 8 of the DSA are designed to address this issue by establishing, first, that providers “shall not be deemed ineligible” for liability exemptions solely because they carry out voluntary own-initiative investigations or take other measures and, second, that no general monitoring or active fact-finding obligations are set out under the DSA. In addition to the difficulties surrounding the reach of these provisions (e.g., for not specifically addressing terms of service), their interpretation together with article 9 of the DSA raises doubts as to the scope of preventive content filtering mechanisms.[19] At its core, article 8 of the DSA is not surprising: indeed, article 15 of the E-Commerce Directive contains a similar provision regarding the absence of a general obligation to monitor. However, by replicating the same rationale, the DSA chose not to address the issues that were already being discussed under the E-Commerce Directive. On this matter, the Eva Glawischnig‑Piesczek v. Facebook Ireland Limited case[20] is to be recalled. In those proceedings, the Court of Justice of the European Union (the “CJEU”) ruled that:
“(…) Article 15(1), must be interpreted as meaning that it does not preclude a court of a Member State from: ordering a host provider to remove information which it stores, the content of which is identical to the content of information which was previously declared to be unlawful, or to block access to that information, irrespective of who requested the storage of that information; ordering a host provider to remove information which it stores, the content of which is equivalent to the content of information which was previously declared to be unlawful, or to block access to that information, provided that the monitoring of and search for the information concerned by such an injunction are limited to information conveying a message the content of which remains essentially unchanged compared with the content which gave rise to the finding of illegality and containing the elements specified in the injunction, and provided that the differences in the wording of that equivalent content, compared with the wording characterizing the information which was previously declared to be illegal, are not such as to require the host provider to carry out an independent assessment of that content, or ordering a host provider to remove information covered by the injunction or to block access to that information worldwide within the framework of the relevant international law”.[21]
This heritage is, therefore, carried over into the DSA.
3. The Need for a Cross-sectoral Approach
3.1. Two Layers of Systematic Analysis
I. In order to preserve the consistency of the legal system, regulation has to be read broadly, which entails both interpreting each regulation (and its concepts) in light of a joint understanding of the surrounding (and pre-existing) regulatory environment and considering the cross-impact that said regulation (and the concepts it newly creates) has on other instruments. This constant communication of concepts and principles is what ensures that, at its core, the application of any regulation is aligned with the system. As a recent regulation, the DSA will encounter this challenge, which is greatly amplified by the massive and continuing production of legal instruments to this day.
II. With this in mind, we identify two areas in need of cross-sectoral analysis: an external layer, which considers the role of the DSA within the EU digital framework, and an internal layer, which focuses on the legal communication that has to be made for the purposes of the fundamental rights assessment laid down in the DSA.
3.2. The DSA Within the EU Digital Framework
I. The need for a joint and comprehensive analysis of all legal instruments (as a requirement for their individual application) comes as a challenge not only in relation to the cross-impact of the compliance framework established in each legal act, but also due to the amount of regulation and information that is produced simultaneously. As an example, on 15 February 2024 alone, the Commission released: i) the Commission Implementing Regulation (EU) 2024/607 of 15 February 2024 on the practical and operational arrangements for the functioning of the information sharing system pursuant to the DSA;[22] ii) new measures on cross-border enforcement under the GDPR.[23] Naturally, this creates an evident obstacle to the assessment to be made by advisers and market operators alike.
This is, however, a required step. To capture the reach and extent of the DSA, we must therefore position it within the current EU digital framework. Indeed, the DSA’s innovative approach lies precisely in its aim of providing a comprehensive, hence complete, regulatory framework, which, by nature, depends on a joint analysis of different regulations.
II. This is particularly relevant when it comes to the DSA due to three main reasons.
First, on account of the uniformity agenda behind the DSA, which is stated right in Recital (4) of the Regulation.[24] The DSA’s main objective is to provide for full harmonization, which prevents the member States from adopting or maintaining “additional national requirements” in relation to the matters regulated by the DSA.[25]
Second, considering the complementary relationship between the DSA and other (national and/or EU) regulations. In addition to the E-Commerce Directive, the application of the DSA is to be read together with, among others: i) Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015, regarding the provision of information in the field of technical regulations and of rules on Information Society services;[26] ii) consumer protection regulation;[27] iii) data protection regulation;[28] iv) the rules on recognition and enforcement of judgments in civil and commercial matters;[29] v) copyright and related rights regulation;[30] vi) Directive (EU) 2018/1972 of the European Parliament and of the Council of 11 December 2018 establishing the European Electronic Communications Code; vii) the Code of conduct on countering illegal hate speech online of 2016;[31] viii) Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 regarding audiovisual media services;[32] and, last but not least, ix) the case-law of the CJEU. Therefore, in light of the nature of the services aggregated in the DSA, its uniform application requires a multi-sectoral and multi-level analysis. In this universe, copyright regulation is to be highlighted, as copyright infringement represents a significant share of content moderation.
Third, it is to be noted that the structure on which the DSA is drafted presupposes a balancing exercise prior to the effective application of its provisions. Indeed, the framework of obligations set out therein is not triggered solely on the basis of a specific conduct. Unlike other regulations, the DSA deals with risks that are not economic but rather intangible or non-quantifiable, which places the onus of assessing those risks on the agents. In this regard, particular reference has to be made to the Charter of Fundamental Rights of the European Union[33] and to the AI Act. For different reasons, these instruments will prove fundamental to the risk assessment under the DSA and, hence, to its application.
3.3. The Fundamental Rights’ Assessment
I. The DSA sets out the obligation for VLOPs and VLOSEs to “assess the systemic risks stemming from the design, functioning and use of their services, as well as from potential misuses by the recipients of the service”.[34] Those systemic risks are aggregated in four categories, as follows: i) risks of dissemination of illegal content;[35] ii) the impact of the services on fundamental rights;[36] iii) the actual or foreseeable negative effects of the services on democratic processes, civic discourse and electoral processes, as well as on public security;[37] iv) risks “with an actual or foreseeable negative effect on the protection of public health, minors and serious negative consequences to a person’s physical and mental well-being, or on gender-based violence”.[38] When conducting such an assessment, providers are, in particular, to consider the impact on freedom of expression.[39]
This approach will prove challenging.
First, the approach of allocating the onus of the fundamental rights assessment to the operators is to be questioned under both the EU’s autonomy principle and the EU’s role as, ultimately, the entity that must safeguard compliance with the EU’s values. It is also debatable whether these operators are in a position to conduct this assessment: while, under the DSA, the providers are considered to be acting within the EU, most of them are based outside the EU, which means there may be a lack of identity with EU principles and values. The issue of the EU’s autonomy has been construed in a way that points to the EU (together with the entities that constitute its proper bodies) as the competent authority to interpret and rule on such matters. That is one of the main lessons learnt from the Achmea case,[40] which, despite dealing with different legal subjects, has been recognized as a landmark in the construction of the autonomy principle. Indeed, it “focuses on the fact that the EU legal system functions as an autonomous legal order, since it has the capacity to operate as a self-sufficient system of norms”.[41]
Furthermore, the assessment for which the DSA wishes platforms to be responsible ultimately concerns how such risks can be used against them. In other words, the platforms and providers would set the level of risk for which they would be held liable, which creates an obvious conflict of interest.
Also, the human rights framework requires further elaboration to serve as a legal standard within content moderation. Indeed, following Niklas Eder, “[the] societal harms of content moderation are not easily quantifiable. We lack objective standards and empirical measures to assess these harms (…)”.[42] Additionally, there are areas that require more attention than others, consumer law being an example.[43] In any case, this analysis cannot be crystallized at a single point in time. Rather, it must comprise a flexibility that allows a strong rationale to be extended and updated as reality evolves. For such purposes, the AI Act can serve as a useful tool in the definition of systemic risks under the DSA, especially if we consider the developments in generative AI and how it has penetrated platforms throughout. Platforms and providers are mostly AI-driven, so not only must these two realities communicate towards a common assessment, but AI is also highly expected to amplify these risks. The DSA and the AI Act, while complementary, do not substitute for one another: the AI Act deals with the criteria for the operation, placement on the market and use of the system, whereas the DSA considers the effects deriving from that use and market implementation.
4. Conclusions
I. Throughout this analysis, we have taken the opportunity to address both the DSA’s agenda and the cross-sectoral perspectives that surround it. These do not, however, exhaust the list of issues. Other challenges could be analyzed, whether from a competition perspective (mainly focused on the need to ensure consistency of enforcement mechanisms and solutions within the EU towards the appropriate application of the DSA, and on the effectiveness of the regulation for small digital operators and start-ups) or, for instance, considering the impact of algorithms on user experiences. Further along this path, the proper scope of the DSA is not beyond doubt, especially in light of the dynamic nature of online content.
This means that, despite its entry into application, the discussions around the DSA are now more important than ever. Practice will reveal further implications that are, perhaps, not yet anticipated. Therefore, the first challenge for the Commission, together with the Digital Services Coordinators, will be to maintain constant and direct communication with the market so as to better evaluate the success of the DSA throughout.
II. At this point, the question on the regulation of platforms is no longer about “who”, “when” or “why”. Our legal system, as it does in relation to all other rights, must be summoned and strengthened to respond to the everyday issues that surround the digital era. Therefore, we must make the DSA a joint effort and help the authorities with this difficult task. After all, the DSA is a landmark intended to safeguard the rights and values that are common to all EU member States. This is a common achievement that has to be congratulated and recognized as of the utmost importance.
[1] Jacques Delors, Address to the European Parliament, 19 January 1995, p. 10. Available at: https://ec.europa.eu/dorie/fileDownload.do;jsessionid=hB6TTXbJ1GG4M4yc6gJQjrJD3Cf1yH6YQpffLk2wlY9LJQ5jJLC6!-79809760?docId=340975&cardId=340975 (last visited on 8 February 2024).
[2] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services (the “Digital Services Act” or “DSA”). ELI: http://data.europa.eu/eli/reg/2022/2065/oj.
[3] Article 2(1) of the DSA.
i.e.: i) article 24(2), which obliges providers to publish on their online interfaces information regarding the average monthly active recipients of the service in the EU for each online platform or online search engine; ii) article 24(3), requiring providers of online platforms or online search engines to communicate to the Digital Services Coordinator of establishment and to the Commission the information on active recipients (and additional information, if required); iii) article 24(6), allowing the Commission to adopt implementing acts to lay down templates of reports; iv) article 33(3) to (6), allowing the Commission to adopt delegated acts to supplement the provisions of the DSA and to decide on the designation requirements of very large online platforms or very large online search engines; v) article 37(7), also allowing the Commission to adopt delegated acts to supplement the provisions of the DSA on the rules for the performance of independent audits, together with the corresponding templates; vi) article 40(13), for the Commission to adopt delegated acts on the technical conditions under which very large online platforms or very large online search engines are to share data and the purposes for which the data may be used; vii) article 43, regarding the supervisory fees the Commission shall charge to providers of very large online platforms and of very large online search engines; viii) within Chapter IV, regarding implementation, cooperation, penalties and enforcement, sections 4 (on the supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines), 5 (providing common provisions on enforcement, covering professional secrecy, the information sharing system and representation), and 6 (on delegated and implementing acts, as the power to adopt delegated acts is attributed to the Commission).
[5] Available at: https://digital-strategy.ec.europa.eu/en/library/dsa-guidance-requirement-publish-user-numbers (last visited on 18 February 2024) (the “Guidelines on active recipients”).
[6] Question 3, para. 2, of the Guidelines on active recipients.
[7] Question 12, para. 2, of the Guidelines on active recipients.
Even though the Commission established 17 February 2024 as the deadline for the member States to designate the Digital Services Coordinator. In the same communication, it was confirmed that “[that] same date is also the deadline by which all other platforms must comply with their obligations under the DSA and provide their users with protection and safeguards laid down in the DSA”. Communication available at: https://ec.europa.eu/commission/presscorner/detail/en/ip_23_2413 (last visited on 18 February 2024).
[9] Communication from the Portuguese Council of Ministers dated 8 February 2024. Available at: https://www.portugal.gov.pt/pt/gc23/governo/comunicado-de-conselho-de-ministros?i=599 (last visited on 18 February 2024).
As of 16 February 2024, no official designation had been made, for instance, in The Netherlands or France. In France, the Audiovisual and Digital Communication Authority (ARCOM) was announced as the intended main Digital Services Coordinator (in coordination with the French data protection authority (CNIL) and the Directorate-General for Competition, Consumer Affairs and Fraud Control (DGCCRF)), but no designation has yet been published. Instead, ACM and ARCOM have been appointed with a limited scope of action under the DSA. In this regard, the Commission decided to sign administrative arrangements with France and Ireland to support its supervisory and enforcement activities. See press release dated 23 October 2023. Available at: https://ec.europa.eu/commission/presscorner/detail/en/ip_23_5196 (last visited on 18 February 2024).
[11] Available at: https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/14027-Digital-Services-Act-transparency-reports-detailed-rules-and-templates-_en (last visited on 18 February 2024).
[12] Available at: https://ec.europa.eu/commission/presscorner/detail/en/ip_23_2413 (last visited on 18 February 2024).
[13] Ibid.
[14] Commission Delegated Regulation C(2023) 6807 final, Brussels, 20 October 2023. Available at: https://digital-strategy.ec.europa.eu/en/library/delegated-regulation-independent-audits-under-digital-services-act (last visited on 18 February 2024).
[15] Commission Implementing Regulation C/2024/865. ELI: http://data.europa.eu/eli/reg_impl/2024/607/oj.
[16] ‘Commission Opens Formal Proceedings against X under the Digital Services Act’, Press release dated 18 December 2023. Available at: https://ec.europa.eu/commission/presscorner/detail/en/IP_23_6709 (last visited on 18 February 2024).
[17] ‘Commission Sends Requests for Information to 17 Very Large Online Platforms and Search Engines under the Digital Services Act’, Press release dated 18 January 2024. Available at: https://digital-strategy.ec.europa.eu/en/news/commission-sends-requests-information-17-very-large-online-platforms-and-search-engines-under (last visited on 18 February 2024).
[18] C. Busch, ‘Platform Responsibility in the European Union’, Defeating Disinformation: Advancing Inclusive Growth and Democracy through Global Digital Platforms, p. 7. Available at: https://digitalplanet.tufts.edu/wp-content/uploads/2023/02/DD-Report_2-Christoph-Busch-11.30.22.pdf (last visited on 18 February 2024).
[19] M. Rojszczak, ‘The Digital Services Act and the Problem of Preventive Blocking of (Clearly) Illegal Content’, in Institutiones Administrationis, n. 3(2), 2023, pp. 44-59.
[20] European Court of Justice, Eva Glawischnig‑Piesczek v. Facebook Ireland Limited [C‑18/18], ECLI:EU:C:2019:821 (the “Case”).
[21] Id., para. 53.
[22] Commission Implementing Regulation C/2024/865. ELI: http://data.europa.eu/eli/reg_impl/2024/607/oj.
[23] ‘New Measures to Strengthen the Cross-border Enforcement of the GDPR’, Press release dated 15 February 2024. Available at: https://www.europarl.europa.eu/news/en/press-room/20240212IPR17631/new-measures-to-strengthen-the-cross-border-enforcement-of-the-gdpr (last visited on 18 February 2024).
[24] It reads, as follows: “(…) in order to safeguard and improve the functioning of the internal market, a targeted set of uniform, effective and proportionate mandatory rules should be established at Union level. This Regulation provides the conditions for innovative digital services to emerge and to scale up in the internal market. The approximation of national regulatory measures at Union level concerning the requirements for providers of intermediary services is necessary to avoid and put an end to fragmentation of the internal market and to ensure legal certainty, thus reducing uncertainty for developers and fostering interoperability. By using requirements that are technology neutral, innovation should not be hampered but instead be stimulated”.
[25] See Recital (9).
[26] Also, “(…) Directive 2010/13/EU of the European Parliament and of the Council including the provisions thereof regarding video-sharing platforms, Regulations (EU) 2019/1148, (EU) 2019/1150, (EU) 2021/784 and (EU) 2021/1232 of the European Parliament and of the Council and Directive 2002/58/EC of the European Parliament and of the Council, and provisions of Union law set out in a Regulation on European Production and Preservation Orders for electronic evidence in criminal matters and in a Directive laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings”. See Recital (10) of the DSA.
[27] As referred to in Recital (10) of the DSA.
[28] ibid.
[29] ibid.
[30] id., Recital (11).
[31] id., Recital (62).
[32] The European Regulators Group for Audiovisual Media Services has, among its proposals to strengthen the DSA, indeed highlighted the need to “[secure] and optimize the interplay between the DSA and the Audiovisual Media Services Directive (AVMSD) thereby alleviating implementation risks”. See ERGA, ‘Proposals Aimed at Strengthening the Digital Services Act (DSA) With Respect to Online Content Regulation’, p. 3. Available at: https://erga-online.eu/wp-content/uploads/2021/06/2021.06.25-ERGA-DSA-Paper-final.pdf (last visited on 18 February 2024).
[33] Charter of Fundamental Rights of the European Union (2016/C 202/02), available at: http://data.europa.eu/eli/treaty/char_2016/oj.
[34] As referred to in Recital (79) of the DSA.
[35] Id., Recital (80).
[36] Id., Recital (81).
[37] Id., Recital (82).
[38] Id., Recital (83).
[39] Id., Recital (86).
[40] European Court of Justice, Slowakische Republik (Slovak Republic) v. Achmea BV [Case C-284/16], ECLI:EU:C:2018:158.
[41] K. Lenaerts, The autonomy of European Union Law, Cacucci Editore, AISDUE, 2019, p. 5. Available at: https://www.aisdue.eu/wp-content/uploads/2019/04/001C_Lenaerts.pdf (last visited on 18 February 2024).
[42] N. Eder, ‘Making Systemic Risk Assessments Work: How the DSA Creates a Virtuous Loop to Address the Societal Harms of Content Moderation’, forthcoming in German Law Journal, 2023, p. 9. Available at SSRN: https://ssrn.com/abstract=4491365 (last visited on 18 February 2024).
[43] On this matter, see C. Busch, V. Mak, ‘Putting the Digital Services Act into Context: Bridging the Gap between EU Consumer Law and Platform Regulation’, in 10 Journal of European Consumer and Market Law (EuCML), 2021, 109, European Legal Studies Institute Osnabrück Research Paper Series, n. 21-03, 2021. Available at SSRN: https://ssrn.com/abstract=3933675 (last visited on 18 February 2024).