Since August this year, very large online platforms have had to comply with the Digital Services Act (‘DSA’). The European Commission has the power to supervise and enforce the DSA against these platforms and is currently taking up this task very visibly, through open letters and tweets by Commissioner Thierry Breton. This blogpost gives an overview of the developments and sets out in broad lines what might come next. It concludes with a short reflection on the Commission’s enforcement of the big platforms’ content moderation obligations.
Letters and requests for information
The Commission’s enforcement show started with a tweet by Commissioner Breton on 10 October. The tweet contained a letter addressed to Elon Musk, stating that the Commission has indications that X is being used to disseminate illegal content and disinformation in the EU. The letter also reminded X of its ‘very precise obligations regarding content moderation’ under the DSA. Two days later, the European Commission sent a formal request for information to X.
More public letters and requests for information followed. On 11 October, Commissioner Breton tweeted a letter to Mark Zuckerberg, which was also followed by a request for information from the Commission. Likewise, Commissioner Breton tweeted a letter to TikTok’s CEO, and TikTok subsequently received a request for information. The requests for information to Meta and TikTok concern the platforms’ approaches to protecting the integrity of elections and minors online and to addressing illegal content and disinformation. Commissioner Breton also tweeted a letter to the CEO of YouTube. The Commission has not yet published a press release on a request for information to YouTube.
Information requests are part of the Commission’s enforcement tools. As mentioned above, the Commission is empowered to enforce the DSA against very large online platforms. The precise legal arrangement is that the Commission has the exclusive power to enforce Section 5 of Chapter III of the DSA (art. 56(2) DSA), which contains the risk management and crisis response rules for very large online platforms. The Commission also has the power to enforce the other DSA provisions against the large platforms (art. 56(3) DSA), but it shares this power with the Member States. The Member State in which the main establishment of a very large online platform is located can enforce the DSA, other than Section 5 of Chapter III, with respect to these platforms as long as the Commission has not initiated proceedings for the same infringement (art. 56(4) DSA). This governance structure was introduced by the Council during the legislative process.
To enforce the DSA, the Commission can initiate proceedings on the conduct of a platform if it suspects that the platform has infringed the DSA (art. 66(1) DSA). Even before initiating proceedings, the Commission can exercise its investigatory powers to examine the compliance of very large online platforms (art. 65(1) DSA). Among other things, the Commission can request a very large online platform to provide information related to a suspected infringement within a reasonable period (art. 67(1) DSA). The platform must supply the information requested (art. 67(4) DSA), and if it supplies incorrect, incomplete, or misleading information or fails to reply within the set period, the Commission can impose a fine (art. 74(2)(a) and (b) DSA).
The letters tweeted by Commissioner Breton are difficult to place within the enforcement system set up by the DSA. These public letters seemed intended primarily to put informal pressure on the platforms and to generate political publicity for Breton himself. The requests for information that followed were formal steps in the enforcement process with a legal basis in the DSA. It should be emphasised that the Commission has not yet initiated proceedings into the conduct of the platforms. The BBC headline ‘EU opens investigation into X over alleged disinformation’ should therefore be understood as referring to the fact that the Commission is using some of its investigative powers before initiating or ‘opening’ proceedings.
What is next?
What steps the Commission will take next depends on its assessment of the information provided by the online platforms. X must provide the requested information by 18 October 2023 for questions related to its crisis response protocol and by 31 October 2023 for the rest. Meta and TikTok must provide the requested information by 25 October 2023 for questions related to their crisis response and by 8 November 2023 for the rest. If the Commission deems it necessary, it can initiate proceedings against one of the platforms pursuant to Article 66 of the DSA.
In the context of such proceedings, the Commission can exercise several investigatory powers in addition to the requests for information discussed above. The Commission can interview people, conduct on-site inspections, address questions to key personnel, and require a platform to provide access to and explanations “on its organisation, functioning, IT system, algorithms, data-handling and business practices” (arts. 68-69 DSA). The Commission can also exercise these investigatory powers before initiating proceedings.
While the proceedings are ongoing and before a final decision is taken, the Commission can already take several steps. If a platform’s suspected non-compliance with the DSA is urgent due to the risk of serious damage to the platform’s users, the Commission can order interim measures against the platform (art. 70 DSA). Such interim measures could include orders to terminate or remedy a suspected infringement. Furthermore, if a platform offers commitments during the proceedings to ensure compliance with the DSA, the Commission can make those commitments binding on the platform and close the proceedings (art. 71 DSA).
The proceedings continue if a platform does not offer any commitments, the Commission rejects its commitments, or there are reasons to reopen the proceedings. The Commission must adopt a non-compliance decision where it finds that the platform does not comply with the DSA, with interim measures ordered, or with commitments made binding (art. 73(1) DSA). The platform must then provide the Commission with a description of the measures it has taken to ensure compliance with the decision (art. 73(4) DSA). Where the Commission finds no non-compliance, it must close the investigation (art. 73(5) DSA). If the Commission plans to adopt a non-compliance decision in relation to Section 5 of Chapter III, it must make use of the ‘enhanced supervision system’, which involves, among other things, an action plan and an audit for the platform involved (art. 75 DSA).
In a non-compliance decision, the Commission can impose on a platform a fine of up to 6% of its total worldwide annual turnover if it finds that the platform infringes the DSA, fails to comply with interim measures, or fails to comply with a commitment made binding (art. 74 DSA). These maximum fines are higher than under the GDPR, which permits data protection authorities to impose administrative fines of up to 4% of total worldwide annual turnover. The Commission can also impose periodic penalty payments of up to 5% of the average daily income or worldwide annual turnover to compel a platform to comply with a non-compliance decision (art. 76 DSA).
At this point, one may hope that the platform has remedied its infringements of the DSA. However, if the Commission has exhausted all its powers to end an infringement but the infringement persists and causes serious harm, the Commission can request the Digital Services Coordinator of the Member State where the platform has its main establishment to take further steps (art. 82 DSA). The Digital Services Coordinator can be asked to involve the management body of the platform in terminating the infringement via an action plan or, ultimately, to request a judicial authority to order the temporary restriction of access to the platform (art. 51(3) DSA).
Concluding remarks
The decision of the EU legislator to empower the Commission to supervise and enforce the DSA against very large online platforms was innovative. Other options were to entrust such enforcement entirely to the Member States or, as considered in the DSA Impact Assessment, to an independent EU body with investigatory and sanctioning powers (see also Jaursch’s report ‘The DSA draft: Ambitious rules, weak enforcement mechanisms – Why a European platform oversight agency is necessary’). Both options were discarded in the legislative process.
The chosen governance structure, with centralised enforcement power for the Commission, has been criticised because the Commission is also the main executive body of the EU (see also Buri’s chapter ‘A regulator caught between conflicting policy objectives: Reflections on the European Commission’s role as DSA enforcer’). In its role as enforcer, the Commission needs to assess, among other things, whether very large online platforms expeditiously remove illegal content upon obtaining knowledge of it (see art. 6 DSA) and whether they diligently assess and mitigate systemic risks stemming from illegal content, hate speech, disinformation, and other types of harmful material (see art. 34 DSA). As Wilman points out, the Commission’s enforcement of the platforms’ content-related obligations can lead to controversies. A fear is that the Commission’s enforcement of content moderation obligations becomes political, which could be particularly problematic in the area of ‘lawful but awful’ speech, such as disinformation.
At the same time, the Commission’s enforcement decisions are subject to judicial review by the Court of Justice of the EU, which will have ‘unlimited jurisdiction’ in this regard (art. 81 DSA). Unlimited jurisdiction means that the Court can go beyond a simple review of the legality of a decision and can also reduce or increase the amount of a fine imposed (Limburgse Vinyl Maatschappij and Others v. Commission). Nonetheless, it will be important to monitor closely how the Commission uses its investigatory and enforcement powers against very large online platforms. The letters tweeted by Commissioner Breton seemed a political move; fortunately, they were followed by formal requests for information, marking the beginning of the Commission’s first investigation into the content moderation practices of the big platforms. In the process that follows, the Commission will need to move carefully to preserve its legitimacy as an enforcer of the DSA.