Digital Services Act: the challenges of content regulation
While the role of online platforms as de facto private regulators has become more evident with their recent decisions to ban Donald Trump and other politicians, the European Union wants its future Digital Services Act to set the paradigm for regulating the digital sphere. To achieve that, it will have to overcome the challenges present in the European Commission’s current proposal.
The riots at the US Capitol in January have sparked a significant number of debates on topics ranging from the profound divisions within the American population to the stability of Western liberal democracies. Social media and digital platforms have been at the centre of these debates, as policymakers and civil society groups have discussed their liability for the spread of conspiracy theories, the danger they pose by enabling violent groups to organise online, and their rule-setting powers that allow them to ban any user – including the President of a democratic country.
These debates have crossed the Atlantic and landed forcefully at the European level. In a recent article in Politico, Commissioner Thierry Breton – responsible for EU internal market policy – compared the consequences the riots will have for digital platforms to the paradigm shift that 9/11 meant for global security, and stated that the US legislation on the topic “has collapsed”. Building on this, Members of the European Parliament (MEPs) have also spoken of the need for stronger regulation of these companies and their powers in the digital sphere. For instance, European Parliament Vice-President Marcel Kolaja argued that social networks should not be able to “arbitrary create rules” and called for independent courts to decide on what should be illegal.
In both cases, the European Commission’s proposals for a Digital Services Act (DSA) and a Digital Markets Act (DMA) have been presented as the first attempt to regulate digital platforms, with the objective of ensuring that “what is illegal offline should be illegal online” by establishing a clear set of obligations and responsibilities for companies operating online platforms.
Given the importance of these proposals, it is necessary to explore how they will tackle the issue of content regulation and what potential challenges they will face.
Who are the trusted flaggers?
The DSA will set new obligations for private companies that operate online platforms, making them liable for the content that their users post. As Commissioner Breton points out, the primary goal of the DSA is to solve the issue of “illegal viral content”. These obligations include giving regulators access to internal data and establishing measures to prevent the dissemination of illegal content. The creation of a system of “trusted flaggers” has been presented as a flagship measure.
According to the DSA proposal, platforms will have to establish “notification and action mechanisms” that prioritise notices of illegal content submitted by trusted flaggers. These trusted flaggers are thus given a key role in assisting the platforms and providing guidance on what constitutes illegal content, although it is ultimately up to the company to decide whether the notified content is illegal. The Commission’s proposal defines them as entities (not individuals) awarded the status by the national Digital Services Coordinators after a successful application for the role.
Following this logic of having external actors participate in the supervision of content regulation, the DSA establishes further obligations for “very large online platforms”, defined as those used by at least 10% of the EU population – currently equivalent to 45 million users across the Member States. These very large platforms must conduct, at least once a year, a risk assessment covering how they deal with the dissemination of illegal content, negative effects on fundamental rights and intentional manipulation of their services to harm the public, with a special mention of democratic elections. Additionally, they must appoint an independent auditor (also at least on a yearly basis) to produce a report on their compliance with the norms of the regulation. If the audit report is not positive, the company will have to adopt the operational recommendations addressed to it and take measures to implement them. These steps must appear in an audit implementation report, which the company has one month to produce. If the company does not implement the recommendations, it will have to explain its reasons in that report and set out alternative measures to resolve the issues of non-compliance with the regulation.
While these measures depend on external actors to assist, cooperate with and supervise platforms in matters of content regulation, the Commission’s proposal offers little substantive guidance on how to define these actors or how they should be appointed, which might create confusion during implementation. In the case of trusted flaggers, while the Commission does assign the Digital Services Coordinators responsibility for awarding this role and suggests establishing an application procedure, it does not go into further detail about who can apply for the position, aside from the conditions of being independent of the platform and having expertise on the topic. Moreover, the DSA does not clarify whether these trusted flaggers would be platform-specific and/or country-specific.
The same applies to the auditors that platforms are expected to appoint. Apart from demanding that they be independent and experts in the field, the proposal requires only that auditors “have proven objectivity and professional ethics”. Beyond these criteria, there are no clear indications of how platforms should choose them (for instance, how are professional ethics measured?) or how the auditors should carry out their investigations.
The importance of national regulators
Given that Member States are the first point of contact, the DSA puts special emphasis on providing them with sufficient competences to enforce the rules. As mentioned before, the national Digital Services Coordinator will have a critical role in the supervision of digital platforms. Whereas Member States can choose to have multiple authorities with competences related to digital services, they can appoint only one as their Digital Services Coordinator, which will be the point of contact for the European Commission and the other Member States. This Coordinator must be independent of other private and public bodies, although the role can be assigned to an already existing authority.
Aside from the power to approve trusted flaggers, these Coordinators will be granted sufficient powers within their jurisdiction to investigate platforms, through requests for information and on-site inspections, and to enforce the DSA through the imposition of fines and legal actions. This aspect is one of the most debated in the DSA, as Member States are allowed to impose fines of up to 6% of a platform’s annual income or turnover.
The importance of the Digital Services Coordinator does not end with its territorial jurisdiction, as it is also in charge of cooperating with its counterparts in other countries. This becomes clearer with the creation of a European Board for Digital Services. This Board will advise the Digital Services Coordinators and the Commission and will be composed of representatives of the Coordinators. The Board will assist in the supervision of very large platforms and will give advice on the correct application of the DSA. Therefore, not only the Commission but also the Member States will have a say in the functioning of the European regulatory framework for digital services.
Hence, it seems unavoidable that conflicts of interest between Member States will outlast the negotiation of the DSA at the Council and become a recurring factor in EU digital services regulation. The most significant example might come from Ireland, where a majority of digital companies have their base for the EU market. This situation might de facto make Irish decisions more important than those of other Member States, while further widening the differences between the positions of European governments.
As was the case with the General Data Protection Regulation (GDPR), the EU aims to become a trend-setter in regulating digital platforms. While the DSA and the DMA will regulate aspects of the digital market beyond online content moderation (e.g., targeted advertising, competition rules), ensuring that platforms act effectively to stop the dissemination of illegal content is a central concern for European regulators.
Following the proposal, it is now up to the European Parliament and the Council of the EU to establish their positions and engage in negotiations on the final form of the regulations. As some policymakers have argued, this is a chance to address some of the challenges in the original proposal and provide clearer rules, obligations and rights for public authorities, digital platforms and consumers.
Miguel Ángel Zhan Dai, Consultant