Re-Subjecting State-Like Actors to the State – Potential for Improvement in the Digital Services Act

Several side effects of the quasi-monopolistic power of platform companies are evolving into serious problems. To name only a handful: while Facebook and Twitter have been troubled by fake news, attempts at electoral fraud, and radicalisation for a couple of years now, Instagram and TikTok, popular especially among people under 25, have also been criticised for censorship and insufficient child protection measures. Taken together, these services have a persistent, significant influence on society, and the problems associated with them do not seem to solve themselves. What is the EU planning to change about this through regulation?

An article by Hannah Ruschemeier

Critics claim that Facebook acts like a state with respect to the moderation of content, without being held responsible for its actions. Usually, decisions over speech are made by courts, not by private actors. Likewise, governments are primarily responsible for ensuring democratic rules and the possibility to exercise fundamental rights, such as the freedom of expression, within a democratically legitimised legal framework.

Against this background, the Digital Services Act (DSA) aims to limit the power of the large platform companies and to place more responsibility on them to control the content posted on their websites. In theory, the DSA follows the right approach, but the proposal shies away from imposing concrete legal obligations on the platforms when it comes to systemic risks. The focus on broadly phrased and abstract systemic risks could dilute the focus on specific tasks to fight the problems mentioned above. Instead, the DSA should strengthen the intervention powers of public authorities rather than provide even more power to the platforms via de facto self-regulation.

Concentration of Power

The similarities between the platforms and the state end with the one thing they have in common: a certain concentration of power. This concentration of power does not amount to a monopoly of force on the side of the platforms, and it should not. Moreover, a direct binding effect of fundamental rights would most likely not solve the problems associated with the platforms; instead, it would legitimise the platforms in an undesirable way.

The situation of the platforms is not directly comparable to the ‘situational state-like binding of fundamental rights’ established in the Stadionverbot decision of the Federal Constitutional Court. Above all, the power of the platforms results from the systemic digital environment they have created with millions of users. This power is not situational but universal. Yet the relationship between platforms and their users is more similar to the citizen-state relationship than to that between two private actors, even considering consumer protection rules and other situations of disparity. Hence, the platforms are neither comparable to the state nor merely powerful private actors exploiting a structural (dis)advantage, but something in-between, due to their systemic power and influence. Therefore, solving the problems of the digital platform sphere is a task for public law.

For now, the public authorities must be strengthened to effectively regulate the digital sphere, not be replaced by private companies. This leads to the question of which concrete instruments – self-regulation, public regulation, breaking up Big Tech companies, and so on – are effective and feasible.

Regulating Big Tech Players

The current legal framework, the e-Commerce Directive, does not regulate these issues. Recently, however, the Big Tech companies’ business model has come under fire: in the European Union and the United States, political initiatives are pushing for more regulation; the (postponed) discussion about a global digital tax reform is just one example. The DSA and the Digital Markets Act (DMA) follow this strategy. At their core, the DSA and the DMA seek to establish the responsibility of platforms for the content of their users.

Ensuring responsibility in the digital sphere is one of the biggest challenges for the law in the era of digitalisation. Foremost, the platforms’ concentration of power undermines the rule of law. For one, the law loses its effectiveness if it can be circumvented or ignored, or if it is not implemented or enforced by public authorities.

Here, platforms are criticised for acting ‘state-like’: they create their own ‘law’ via their terms of service, in an attempt to avoid legal regulation. As far as possible, large, globally positioned IT companies are interested in operating with uniform structures that have a global or transnational reach. Regulations that are set down in various legal systems and hence differ from one another constitute an impediment to such business models. For this reason, platforms seek out and exploit opportunities to avoid such regulations. Additionally, their influence on democratic procedures such as voting seems to be substantial. But Facebook is not a state within the state, standing above the rule of law. For another, requiring only the platforms to monitor user posts not only burdens the companies, but also endangers the rights to data protection, privacy, and the freedom of expression, raising concerns about effective remedies and even censorship. Nevertheless, the main state-like Big Tech actors must take a degree of responsibility, even if this has to be mandated by law.

Legal Implications for Very Large Online Platforms

The DSA proposal aims to target ‘very large online platforms’ with specific obligations because they are ‘where the most serious risks’ for fundamental rights occur and because they have the capacity to absorb this additional burden (p. 11; section 4). Additional obligations are laid down in Section 4 of Chapter III of the DSA proposal for those platforms which provide their services to a number of average monthly active users in the European Union equal to or higher than 45 million (Art. 25 (1) DSA). This category is the most interesting one with regard to systemic risks for democracy and fundamental rights.

Risk assessment for fundamental rights

Therefore, these platforms shall perform a risk assessment at least once a year, identifying any significant systemic risks stemming from the functioning and use of their services in the Union (Art. 26 DSA), especially with regard to illegal content, negative effects on the fundamental rights to privacy and family life, the freedom of expression and information, the prohibition of discrimination and the rights of the child, as well as intentional manipulation. The latter is meant to capture negative effects on public health, minors, civic discourse, or actual or foreseeable effects related to electoral processes and public security. Unquestionably, these obligations explicitly aim to counter the systemic risks arising from the concentration of power of the mentioned online platforms. The risk assessment and the accompanying mitigation of risks in Art. 27 DSA evoke the state’s duty to protect fundamental rights – including the same level of uncertainty as to whether concrete duties or claims can emerge from these obligations.

First, the online platforms shall implement reasonable, proportionate, and effective mitigation measures, which may include adapting content moderation, reinforcing internal processes or supervision, or initiating cooperation with other online platforms through codes of conduct (Arts. 27 (1) (b), 35 DSA). These obligations only address internal actions chosen by the platforms themselves – their specific nature and scope are up to the platforms. At the same time, the platforms are now responsible for the systemic risks, which empowers them to regulate these risks on their own terms – a task that has classically been the responsibility of public authorities. Theoretically, the requirements can be substantiated by the Commission, in cooperation with the Digital Services Coordinators (DSCs) discussed in greater detail below, via general guidelines. This kind of soft law, like the recommendations of the EDPB, has become quite important in practice, but everything depends on the quality of the best-practice recommendations. Whether the Commission’s general guidelines for best practices, the decisions of the national regulatory bodies, or questions of liability will be able to translate this into a uniform application of the DSA remains unclear at this point.

Consequently, the broad obligations of the platforms to counter systemic risks give them more power to control themselves, and seem to only marginally extend the opportunities for external intervention – that is, intervention by public authorities.

Moreover, the important differences between the state’s protection of fundamental rights and the platforms’ risk assessments lie in the execution of these obligations. Very large online platforms shall be subject to audits to assess compliance with the DSA. These audits are to be performed by independent organisations with proven expertise and professional ethics – a very vague requirement. As a result, the member states’ public authorities are not necessarily involved in the direct oversight of very large platforms.

Especially in this situation, where the platforms are well known for avoiding legal regulation, a self-regulatory approach is not enough when it comes to protecting fundamental rights and the legal interests the Commission itself describes as endangered in the DSA. The DSA should therefore prescribe concrete actions for the platforms to take in addition to the abstract goal of preventing systemic risks. For example, the compliance officer required by Art. 32 DSA is no substitute for a requirement to hire enough staff to moderate illegal content.

Transparency

Further worth mentioning is the transparency requirement laid down in Art. 29 DSA: platforms shall set out in their terms and conditions the main parameters used in their recommender systems in a clear, accessible, and easily comprehensible manner. Additionally, Art. 30 DSA requires the platforms to make information about online advertisements publicly available until one year after the advertisement was displayed for the last time. These kinds of obligations are known from the GDPR (e.g. the right to information in Arts. 13, 14 GDPR). But transparency alone does not put users in control of content that is prioritised solely by the platform’s own algorithms. This should have been a lesson learnt from the GDPR.

Execution

As a result, the broadly defined requirements of Art. 27 DSA seem, in fact, to be a framework for the self-regulation of the companies in scope. This could lead either to insufficient regulation or to overregulation: either the risk obligations are too vague to require actions that are not already covered by internal compliance rules, or the platforms will aim to reduce their exposure to penalties by deleting content that is legal but associated with systemic risk potential. It remains unclear whether these risks can be mitigated by the audit and the required audit implementation report in Art. 28 DSA, or by the transparency requirements of Art. 29 DSA. Evidently, the efficiency of the DSA is inseparable from oversight and enforcement by the designated authorities.

In comparison, the data access right in Art. 31 DSA enables public authorities to request data from the platforms via the DSC, for example for the benefit of vetted researchers conducting research on systemic risks. The DSC is an innovation of the DSA meant to ensure the administrative execution of the regulation, following the ‘country of origin’ principle (Chapter IV DSA). The DSCs must be independent from any other public authority or any private party, comparable to the federal and state data protection commissioners. Their powers are laid down in Arts. 41, 42 DSA and include powers of information, investigation, and enforcement, among them the power to impose penalties of up to 6% of the annual income of the platform provider, further specified in rules set by the member states (Art. 42 DSA), and options for enhanced supervision (Art. 50 DSA). Correspondingly, the DSA seeks to establish cooperation with the very large online platforms via soft law, such as codes of conduct (Arts. 35-36 DSA). In effect, the DSA weaves an intricate web of responsibilities and oversight measures. The latter rely centrally on Art. 31 DSA, which establishes a new obligation to provide the DSCs or the Commission with the data that are necessary to monitor and assess compliance with the regulation. The platform must answer such a request within 15 days (Art. 31 (6) DSA). This is an important and concrete information right which enables the DSC or the Commission to take action.

At first glance, therefore, it seems that the member states could play an influential role in enforcing the DSA via the DSCs. Obviously, the power to impose penalties as laid down in Art. 42 DSA is a potent competence, but as the GDPR has shown, high fines alone do not ensure effective enforcement. The concept of the DSC has the potential to become a powerful but likely controversial authority. The GDPR’s data protection authorities, for example, are quite disputed: some say they block innovation, others claim that they have not taken enough action against the large platforms. In practice, it will be crucial whether the DSCs are able to establish a structure with sufficient personnel and financial resources.

The Commission has the final say…

Nevertheless, the Commission has the final say and can initiate its own proceedings if the DSC does not take any investigatory or enforcement measures pursuant to a request of the Commission (Art. 51 DSA). In fact, the DSA allows the Commission to take potentially very broad enforcement measures, including requesting the DSC to go to court (Art. 65 DSA). The European Board for Digital Services appears similar to the EDPB (see Art. 68 GDPR). The ‘Board’ shall advise the DSCs and the Commission in order to achieve a consistent application of the regulation within the Union. Still, this decentralised principle has not always been very successful in data protection law. Similarly, an uncoordinated ‘side-by-side’ of different supervisory authorities should be prevented.

What’s next?

All in all, the Commission promotes the DSA with its advantages for citizens, businesses, and providers as well as for society at large. It promises greater democratic control and oversight over systemic platforms and the mitigation of systemic risks such as manipulation or disinformation, and it emphasises the protection of fundamental rights, especially the freedom of speech. On the one hand, the DSA/DMA package has the potential to shake up the digital economy, especially if sector-specific rules including individual rights of affected persons follow. On the other hand, the member states’ public authorities play a rather subordinate role in the proposal (for a critical analysis, see here). The vague wording of the systemic risks could lead to de facto self-regulation and increasing responsibilities of the platforms. The latter was the goal, but it creates the strange situation that the DSA, which was drafted to limit the influence of the platforms, actually further empowers them. In addition, the DSCs are challenged with defining their role next to the Commission. Overall, however, the DSA/DMA package points in the right direction of a digitally future-proof Europe.

This article was previously published at Verfassungsblog – https://verfassungsblog.de/power-dsa-dma-13/. With the kind permission of our colleagues and the author, we cross-publish it here under the licence CC BY-NC-ND.

Author

  • Hannah Ruschemeier

    Dr. Hannah Ruschemeier is a postdoctoral researcher at the ELSI (ethical, legal, and social issues) unit at CAIS NRW and the Chair of Public Administration, Public Law, Administrative Law and European Law at the University of Administrative Sciences Speyer. She is a board member of RAILS and editor of RAILS-Blog.
